Comments by crazedDotNetDev (Top 1 by date)
crazedDotNetDev
18-Apr-12 15:01pm
Sure.
1000+ records... that's getting to the point where HTML might have problems. You might also want to allow paging on the report's datagrid. (http://msdn.microsoft.com/en-us/library/5aw1xfh3.aspx)
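If you'd rather do the paging on the database side (so only one page of rows ever crosses the wire), SQL Server 2005+ can do it with row_number(). A rough sketch — the table and column names are just placeholders:

```sql
-- Return page @pageNum of the receipts, @pageSize rows per page.
declare @pageNum int, @pageSize int;
set @pageNum = 3;
set @pageSize = 50;

with numbered as
(
    select
        receiptDate, amount,
        row_number() over (order by receiptDate) as rn
    from
        tblReceipts
)
select receiptDate, amount
from numbered
where rn between (@pageNum - 1) * @pageSize + 1
             and @pageNum * @pageSize;
```

The datagrid then only ever binds one page of rows, regardless of how big tblReceipts gets.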
The technique I used a while back is to set up a nightly job that truncates and repopulates a series of "read only" tables dedicated to reporting. (Note: this assumes you don't need to report on items entered after the job runs, and that it's OK for the report to contain data up to a day old.) The basic idea is to find pieces of the reports that can be computed once, nightly. Most aggregations (e.g. sum(), max(), min(), etc.) are good candidates for this process. SQL Server doesn't have this kind of functionality built in; you'll need to code it yourself.
For example, let's say you have a year-to-date receipt report that is taking too long. Assuming the query is something like...
select
month(receiptDate) as receiptMonth, sum(amount) as totalAmount
from
tblReceipts
where
year(receiptDate) = year(getdate())
group by
month(receiptDate)
...pull that query out of the report, create a reporting table, and populate it via a nightly job...
truncate table tblReport_ReceiptsYTD
insert into tblReport_ReceiptsYTD (receiptMonth, totalAmount)
select
month(receiptDate), sum(amount)
from
tblReceipts
where
year(receiptDate) = year(getdate())
group by
month(receiptDate)
...and the report's query becomes something like...
select
receiptMonth, totalAmount
from
tblReport_ReceiptsYTD
Even if the nightly job takes 10 minutes to finish, that's OK: the time is spent once, during the night. Anyone hitting the report during the day reads the small, pre-aggregated report table.
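To make the refresh itself safe, you can wrap the truncate and the insert in a stored procedure and a transaction, so a report run mid-refresh never sees a half-empty table, and then schedule the procedure with a SQL Server Agent job. A sketch (the procedure name is made up):

```sql
create procedure dbo.usp_RefreshReceiptsYTD
as
begin
    set nocount on;

    begin transaction;
        -- The transaction keeps readers from seeing the table
        -- between the truncate and the repopulate.
        truncate table tblReport_ReceiptsYTD;

        insert into tblReport_ReceiptsYTD (receiptMonth, totalAmount)
        select month(receiptDate), sum(amount)
        from tblReceipts
        where year(receiptDate) = year(getdate())
        group by month(receiptDate);
    commit transaction;
end
```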
SQL Server allows cross-database queries, so just reference the other database explicitly (i.e. instead of "from tblReport_ReceiptsYTD" use "from databaseA.dbo.tblReport_ReceiptsYTD"). Set up a linked server and you can even include other servers. The syntax is [servername].[databasename].[owner].[tablename]... or use openquery() to get really crazy.
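To illustrate the three options (the server and database names here are invented):

```sql
-- Same server, different database: three-part name.
select receiptMonth, totalAmount
from databaseA.dbo.tblReport_ReceiptsYTD;

-- Linked server: four-part name.
select receiptMonth, totalAmount
from ReportServer.databaseA.dbo.tblReport_ReceiptsYTD;

-- openquery() ships the whole query text to the linked server
-- and pulls back only the result set.
select *
from openquery(ReportServer,
    'select receiptMonth, totalAmount
     from databaseA.dbo.tblReport_ReceiptsYTD');
```

openquery() is often faster than a four-part name because the remote server does all the filtering and aggregation before anything crosses the network.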
About the last paragraph...
First off, the database is the typical bottleneck. It's hard-drive intensive, whereas the other pieces of the system tend to get cached in RAM. A second reason the database is a typical bottleneck is locking: while a given user is updating a row, it's locked and nobody else can read it until the update commits. Always put the database on the strongest server possible.
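If that reader-blocking becomes painful, SQL Server 2005+ offers row versioning so readers see the last committed value instead of waiting; dirty reads via nolock are the cruder alternative. Both lines below are sketches — the database name is a placeholder, and the alter database needs a moment when no other connections are open:

```sql
-- Database-wide: readers get the last committed row version
-- instead of blocking behind writers.
alter database MyAppDb set read_committed_snapshot on;

-- Per-query: accept dirty reads on a report that can tolerate them.
select receiptMonth, totalAmount
from tblReport_ReceiptsYTD with (nolock);
```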
I'd load-balance the web servers (or put them in a cluster) more for reliability than performance: if one goes down, users typically won't notice. Clustering the database server would also be nice, but it's difficult to get right. See http://en.wikipedia.org/wiki/Cluster_%28computing%29