|
I presume you are talking about the "Web Application" project type that was introduced in VS2005 SP1, versus the "Web Site" option? AFAIK, there is no difference in performance: they are both ASP.NET applications, just with different solution explorer rules. I may be wrong...
You might have more luck posting this in the ASP.NET forum[^].
----------------------------------
Be excellent to each other
|
|
|
|
|
Hi,
I'm interested in creating a CMS in ASP.NET for my main project.
Can anybody tell me how to go about it, or give any TIPS?
|
|
|
|
|
nibinki333 wrote: any TIPS
The first tip that comes to mind: what happens when people don't know what your acronym "CMS" means?
led mike
|
|
|
|
|
What have you done or found so far?
"The clue train passed his station without stopping." - John Simmons / outlaw programmer
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
|
I need to create an application where, upon user login, we check the user type and decide which database the application will use for that user. Here is what I have:
1: Master database: stores user information and application configuration data, such as which db to use and connection string
2: Database I on server 1: contains all the application data for user group I.
3: Database II on server 2: contains all the application data for user group II.
4: Database I backup on server 2: a backup for user group I in case server 1 fails.
5: Database II backup on server 1: a backup for user group II in case server 2 fails.
6: Log shipping between server 1 and server 2 keeps DB I and DB II and their backups up-to-date.
7: The web site is running on web farm. The session state management is using SQL server session state management, which is on the same server as Master DB.
My questions are:
1: How do I determine that server 1 or server 2 is down and that it's time for the application to switch to the DB backup on the other server?
2: What should I do if the master DB server is down?
3: Is there a better approach than the above design to achieve the same goals: using a different DB for different users, and achieving fast recovery via the backup server?
I would really appreciate any suggestion and help!
Thanks in advance!
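For reference, here is a rough sketch of the kind of switch-over policy I have in mind for question 1. The probe is injected as a delegate so the decision logic can be exercised without a real database; in the real application the probe would attempt a `SqlConnection.Open()` against the server. All names and connection strings here are made up:

```csharp
using System;

// Sketch of question 1: pick the connection string for a user group by
// probing the primary server first. The probe delegate is a stand-in for a
// real health check (e.g. opening a SqlConnection with a short timeout).
public class DbSelector
{
    private readonly Func<string, bool> _isAlive;

    public DbSelector(Func<string, bool> isAlive)
    {
        _isAlive = isAlive;
    }

    public string Pick(string primary, string backup)
    {
        // Use the primary if it responds; otherwise fail over to the
        // log-shipped backup on the other server.
        if (_isAlive(primary))
            return primary;
        return backup;
    }
}

class Demo
{
    static void Main()
    {
        // Simulate server 1 being down: only server 2 responds to the probe.
        var selector = new DbSelector(cs => cs.Contains("server2"));
        Console.WriteLine(selector.Pick("Server=server1;...", "Server=server2;..."));
        // prints the server2 connection string
    }
}
```

A real probe would also need a short connect timeout and some damping (e.g. only fail over after N consecutive failures) so a transient network blip doesn't flip every user to the backup.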
|
|
|
|
|
There are commercial products that work with Databases to perform mirroring, monitoring and automatic rollover on failure. Is something like that an option for you?
Trying to roll your own solution will certainly be a sizeable effort, and you will likely not achieve the level of confidence a commercial product would have.
led mike
|
|
|
|
|
We just got the second database server and plan to divide users between the two servers. Unless the tool can utilize both servers, we will go with our own solution. What product do you recommend?
Thanks!
|
|
|
|
|
and if the master database goes down....? :P
|
|
|
|
|
Hello,
I am designing a .NET database application that uses a 3-tier architecture. Initially this will be a desktop application, but I will convert it into a website later; the design I am planning should support both versions.
Development Environment : VS2008
The databases currently supported are MS SQL Server 2005 and MySQL 5, and the database support is designed to be extensible.
This application contains several high-level modules (HLMs); some share data with the databases of other modules, and some are totally independent.
For example, finance and project automation are not dependent on each other, but finance and investments are. These are some of the modules that will be used in the application.
For the website version of this app,
the main website will be mainapplication.com, and
each high-level module (HLM) will have a subdomain name like
hlm1.mainapplication.com,
hlm2.mainapplication.com, and so on.
Database Name : mainapplication
Now, for the desktop version, should I create a separate <hlm> EXE application for each module, link them to the main application, and share the same database among all the apps?
I like the design of the MS Money desktop application. However, that is only a finance application, and it is also not that big. Can that design help me in this scenario, or are there any references for a big desktop application that is extensible and has database support?
So far, my research for the desktop application has settled on the following:
1. Interprocess communication can use named pipes, since everything is on the same machine.
2. Single sign-on in the main application will allow access to the other HLM exe applications.
3. Only the main application will launch the other HLM applications; running an HLM exe alone will give an error.
4. Each new HLM application will have a module code stored in the database and loaded by the main application once the user logs in; the HLM exe is launched using this module code.
5. The main application will have a framework to support dynamic linking of HLMs, and base libraries that will be implemented and used by the HLM exe applications.
6. Each HLM exe will have its separate BO and DAO layers.
7. If HLM B depends on HLM A, then HLM B will reference HLM A's BO layer to call the functionality it requires.
8. There will be only one instance of any application running at any time.
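To make item 1 concrete, here is a bare-bones named-pipe round trip between the main application and an HLM exe. Both ends are shown in one process for brevity; the pipe name and message format are invented for this sketch:

```csharp
using System;
using System.IO;
using System.IO.Pipes;
using System.Threading;

public static class PipeDemo
{
    // The main application listens on a well-known pipe; an HLM exe connects
    // and announces itself. "hlm_demo_pipe" and the message are placeholders.
    public static string RoundTrip()
    {
        string received = null;

        // "Main application" side: wait for one HLM to connect and say hello.
        var server = new Thread(() =>
        {
            using (var pipe = new NamedPipeServerStream("hlm_demo_pipe"))
            {
                pipe.WaitForConnection();
                using (var reader = new StreamReader(pipe))
                {
                    received = reader.ReadLine();
                }
            }
        });
        server.Start();

        // "HLM exe" side: connect and send an identification message.
        using (var client = new NamedPipeClientStream(".", "hlm_demo_pipe"))
        {
            client.Connect(5000);   // wait up to 5s for the server pipe
            using (var writer = new StreamWriter(client) { AutoFlush = true })
            {
                writer.WriteLine("HELLO hlm1");
            }
        }

        server.Join();
        return received;
    }

    public static void Main()
    {
        Console.WriteLine(RoundTrip()); // → HELLO hlm1
    }
}
```

In the real design the message would carry the module code from item 4, and the HLM exe would exit if the connection (and hence the launch check in item 3) fails.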
I am not sure of the following
1. Transaction handling between multiple processes.
2. Whether there is anything else I need to take care of.
Can anyone please help me decide and answer these queries so that I can move on with the development of the application? This will be a big application, so the design and architecture should be finalized first.
modified on Thursday, July 24, 2008 11:58 AM
|
|
|
|
|
abhijitbkulkarni wrote: This will be big application should the design and architecture should be finalize first
I'm pleased to hear it, but this should really be the case with just about any project. Can I suggest that you take a look at Microsoft's Composite Application Block? This will help clarify a lot of the problems/issues you may encounter, and is a good place to start when thinking about a decoupled architecture - especially when the interface can vary so widely.
|
|
|
|
|
That surely helps. It gave me a starting point for what I required.
I will go through the Composite Application Block and work out the design that will be best for the application.
Thanks for the reply.
|
|
|
|
|
Pete O'Hanlon wrote: Microsoft's Composite Application Block
Looks interesting enough
"The clue train passed his station without stopping." - John Simmons / outlaw programmer
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
I'm currently trying to write LinFu v2.0, but the problem is that one of the components it needs to verify the IL that v2 generates requires a dynamic proxy generator of its own. If I have it depend on version 1.0, I'll end up with a circular dependency on the old framework; and if I rely on someone else's dynamic proxy (and break the circular dependency), it'll be a bit embarrassing on my part.
Is there any way around this, or should I just bite the bullet and try someone else's dynamic proxy generator for the sake of the IL verifier?
|
|
|
|
|
To avoid the circular dependency, you could always dynamically invoke the dynamic proxy generator.
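For illustration, the late-binding mechanism looks like this: resolve the type and invoke it by name at runtime, so the caller carries no compile-time reference to the implementing assembly. Here a framework type (System.Version) stands in for the v1 proxy generator; the mechanism, not the type, is the point:

```csharp
using System;
using System.Reflection;

class Demo
{
    static void Main()
    {
        // Resolve a type purely by name: no compile-time reference needed.
        // With a real plug-in you would use Assembly.LoadFrom(...) first and
        // call assembly.GetType(...) instead.
        Type type = Type.GetType("System.Version");

        // Instantiate it via the (int, int) constructor...
        object instance = Activator.CreateInstance(type, 1, 0);

        // ...and invoke a method through reflection.
        MethodInfo toString = type.GetMethod("ToString", Type.EmptyTypes);
        string result = (string)toString.Invoke(instance, null);

        Console.WriteLine(result); // → 1.0
    }
}
```

The cost is that typos in type/method names only surface at runtime, and reflection calls are slower, so this is usually confined to a small bootstrap layer.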
|
|
|
|
|
Pete O'Hanlon wrote: To avoid the circular dependency, you could always dynamically invoke the dynamic proxy generator.
I could do that, but in this case I've changed my mind and decided not to use a dynamic proxy generator at all. I'll just have to bite the bullet and do my IL verification with either PEVerify, or in the worst case, I'll write my own runtime verifier. Thanks anyway, Pete!
|
|
|
|
|
Hi all.
I'm re-developing a Data Access Layer Manager (codeplex.com/dalm) and have some architectural doubts about some of the I/O operations.
The problem
All queries are read from an XML file which is loaded into memory on first run. When a query is requested to be executed, the DALM will read it from memory. Now, if the query doesn't exist in memory, the XML file has to be read again. But while the XML file is being read, queries that do exist in memory should still be retrievable and executable. I have been using an ADO.NET DataSet to read the XML until now.
1st solution
When a query is not found, the thread will continue to a "read xml" statement, which will be locked using the lock keyword. This way other threads will still be able to access the DataSet. The problem is that the other threads that also didn't find their query will pile up at the "read xml" statement. A solution could be to load those threads into an array, fire an event which will load the XML, and then call the "find query" method recursively. The array would function as a queue.
But how do threads behave when you put them in an array? Won't they just continue? And if I keep using a DataSet, won't there be a moment where threads get access-denied exceptions when I change the reference to the new DataSet? Loading grafs XML is not an option, as I want the XML file to be human-editable, i.e. simple.
2nd solution
Dunno - something better? Maybe there is a pattern that doesn't require any locking at all?? I've looked into immutable stacks (http://blogs.msdn.com/ericlippert/archive/2007/12/04/immutability-in-c-part-two-a-simple-immutable-stack.aspx[^]) to see if I could somehow synchronize without locking but I'm not sure how to apply it to my problem.
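To make the 1st solution concrete, here is roughly the reference-swap version I have in mind. A plain name-to-SQL dictionary stands in for the DataSet, and LoadFromXml is a placeholder for the real XML read; readers never block, and the reference swap is what keeps them safe while a reload is in progress:

```csharp
using System;
using System.Collections.Generic;

// Sketch: queries live in an immutable snapshot; a cache miss triggers a
// single reload while other readers keep using the old snapshot.
public class QueryCache
{
    private readonly object _reloadLock = new object();

    // volatile so readers always see the latest published snapshot.
    private volatile Dictionary<string, string> _queries =
        new Dictionary<string, string>();

    public string GetQuery(string name)
    {
        var snapshot = _queries;            // readers never take the lock
        string sql;
        if (snapshot.TryGetValue(name, out sql))
            return sql;

        lock (_reloadLock)                  // only one thread reloads at a time
        {
            // Re-check: another thread may have reloaded while we waited.
            if (_queries.TryGetValue(name, out sql))
                return sql;

            var fresh = LoadFromXml();      // build a brand-new dictionary
            _queries = fresh;               // atomic reference swap, no reader breaks
            return fresh.TryGetValue(name, out sql) ? sql : null;
        }
    }

    private Dictionary<string, string> LoadFromXml()
    {
        // Placeholder for re-reading the query XML file.
        return new Dictionary<string, string>
        {
            { "GetUsers", "SELECT * FROM Users" }
        };
    }
}

class Demo
{
    static void Main()
    {
        var cache = new QueryCache();
        Console.WriteLine(cache.GetQuery("GetUsers")); // → SELECT * FROM Users
    }
}
```

Because the old dictionary is never mutated, threads still holding it see consistent data; they just might miss a freshly added query until their next lookup.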
I hope this triggers some ideas!
Thanks in advance, Jon
If the world should blow itself up, the last audible voice would be that of an expert saying it can't be done - Peter Ustinov
modified on Monday, July 21, 2008 3:40 AM
|
|
|
|
|
Maybe some code would help? Any questions or is it just too tough?
If the world should blow itself up, the last audible voice would be that of an expert saying it can't be done - Peter Ustinov
|
|
|
|
|
Doesn't matter - I figured it out
If the world should blow itself up, the last audible voice would be that of an expert saying it can't be done - Peter Ustinov
|
|
|
|
|
Does anybody have an idea about the slide sorter in PowerPoint? Are the items inside the slide sorter thumbnails, or actual live controls that contain different types of shape objects? The reason I ask is that I created 200 slides, and when I opened the file it opened instantly. I want to know how PowerPoint manages this kind of performance, whether they are thumbnail images or controls.
Thanks for replying
|
|
|
|
|
I would suspect it uses precached thumbnails. As I don't work on the PowerPoint team this is just my guess, but it does seem likely.
|
|
|
|
|
Thanks for answering, and too bad for me that you aren't on their team. So you think that each time we move a shape on the screen, a thumbnail image is created again for that item in the slide sorter? Moreover, what about when we type a word: does the slide sorter recreate the thumbnail?
|
|
|
|
|
netJP12L wrote: Moreover, what about when we type a word does slide sorter recreate a thumnail.
Possibly, and maybe probably.
|
|
|
|
|
I was reading an interesting article[^] in Dr Dobbs on event based architectures. In the article, the author describes an architecture in which objects are almost completely decoupled from each other. This is done through events. Say you have object A that raises an event. Object B is interested in this event. A third party called a binder binds these two objects together without either one of them knowing about the other.
This can be easily accomplished in C# using events and delegates.
public Binder()
{
    a = new A();
    b = new B();
    a.SomethingHappened += b.HandleSomethingHappened;
}
However, in playing around with this type of approach, I've noticed that it may not scale well. The problem is that the binder object has all of the responsibility of connecting the dots. This can lead to a ton of configuration code (usually in the binder class's constructor), and it's easy to forget to hook everything up. "Oops! I forgot to hook B to A or was it B to C?" That sort of thing.
So I've considered modifying the approach in which each class knows what events it needs to hook to; it does so by taking an object passed to its constructor and hooking to the relevant events. This approach involves greater coupling. The target of an event knows about the sender. But on the other hand, it relieves the "binder" object of having to know how to hook everything up. All it's responsible for is passing the event raising object to the receiver.
public Binder()
{
    a = new A();
    b = new B(a);
}

public B(A a)
{
    a.SomethingHappened += HandleSomethingHappened;
}
This also has the advantage of giving B the option of making the method responsible for handling the event private or protected so that it's hidden from the outside world.
We can minimize the coupling by using interfaces that expose only specific parts of a class's overall functionality. Then only a reference to the interface is passed to the target class thus limiting its knowledge of the event raiser.
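To illustrate the interface idea with the same A/B example (ISomethingSource is a made-up name): B only sees the narrow interface, so its knowledge of the event raiser is limited to the one event it cares about, while the handler itself stays private:

```csharp
using System;

// Narrow interface exposing only the event B cares about.
public interface ISomethingSource
{
    event EventHandler SomethingHappened;
}

public class A : ISomethingSource
{
    public event EventHandler SomethingHappened;

    public void DoSomething()
    {
        // Copy to a local to avoid a race with unsubscription.
        var handler = SomethingHappened;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

public class B
{
    public bool Handled;

    public B(ISomethingSource source)
    {
        // B hooks itself up; it never sees the concrete A.
        source.SomethingHappened += HandleSomethingHappened;
    }

    // Private: hidden from the outside world, as noted above.
    private void HandleSomethingHappened(object sender, EventArgs e)
    {
        Handled = true;
    }
}

class Demo
{
    static void Main()
    {
        var a = new A();
        var b = new B(a);   // the "binder" just passes the reference
        a.DoSomething();
        Console.WriteLine(b.Handled); // → True
    }
}
```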
Thoughts?
|
|
|
|
|
If you're interested in loose coupling in your design, I would definitely take a look into Dependency Injection. BTW - what you described initially sounds to me like a variation on the Mediator pattern.
|
|
|
|