|
Thanks Bernard!
Very good suggestion.... I'll implement some sort of logging.
Thanks also, I did check that the parameters are added in the correct order. I have tested and tested the code over and over on my laptop, and I don't get errors, which is why I'm leaning towards some sort of concurrency issue. I also checked the locking settings on the Database.... Oops - they were set to None!!!!
|
|
|
|
|
I wrote: maybe no locking at all, which would cause your problems.
Richard.Berry100 wrote: I also checked the locking settings on the Database.... Oops - they were set to None!!!!
No further comment required.
Peter
Software rusts. Simon Stephenson, ca 1994.
|
|
|
|
|
I have a TSQL stored procedure that can run for a few minutes and return a few million records, and I need to display that data in an ASP.NET grid (Infragistics WebDataGrid, to be precise). Obviously I don't want to return all the data at once, so I need to set up some kind of paging: every time the user selects another page, another portion of data is loaded from the DB. But I can't run the SP every time a new page is requested - it would take too much time.
What would be the best way to persist the data from the SP, so that when the user selects a new page, the next portion of data can be loaded with a simple SELECT... WHERE from that temporary data store?
|
|
|
|
|
|
Not exactly. These methods assume the data is already stored in a table. In my case, an expensive SP is run to retrieve the data, and *after* that the user needs to do filtering/paging/sorting on that data. These ad-hoc requests are made from ASP.NET code and need to be done pretty fast. Do I need to store the resultset in an actual table (#temp tables won't work in this scenario), or are there other ways?
|
|
|
|
|
Trekstuff wrote: What would be the best way to persist data from the SP
Put it in a (temporary?) table, and page from there
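Something along these lines (just a sketch - the table, columns, proc name and dates are all made up; assuming SQL Server 2005 or later for ROW_NUMBER):

-- Hypothetical permanent cache table; the real columns are whatever your proc returns.
CREATE TABLE dbo.ReportCache
(
    SessionId uniqueidentifier NOT NULL,
    RowNum    int              NOT NULL,
    Name      varchar(100)     NULL,
    SSN       char(11)         NULL,
    OrderDate datetime         NULL,
    CONSTRAINT PK_ReportCache PRIMARY KEY (SessionId, RowNum)
);
GO

DECLARE @SessionId uniqueidentifier;
SET @SessionId = NEWID();   -- one id per "expensive" run, kept in the ASP.NET session

-- Land the expensive resultset once (INSERT ... EXEC needs a matching column list).
CREATE TABLE #Raw (Name varchar(100), SSN char(11), OrderDate datetime);
INSERT INTO #Raw (Name, SSN, OrderDate)
EXEC dbo.ExpensiveProc @FromDate = '20110101', @ToDate = '20110401';

-- Copy it into the cache, numbered so it can be paged.
INSERT INTO dbo.ReportCache (SessionId, RowNum, Name, SSN, OrderDate)
SELECT @SessionId, ROW_NUMBER() OVER (ORDER BY OrderDate), Name, SSN, OrderDate
FROM #Raw;

-- Every page request from the grid is then a cheap keyed range scan.
DECLARE @PageNo int, @PageSize int;
SET @PageNo = 3;  SET @PageSize = 50;

SELECT Name, SSN, OrderDate
FROM dbo.ReportCache
WHERE SessionId = @SessionId
  AND RowNum BETWEEN (@PageNo - 1) * @PageSize + 1 AND @PageNo * @PageSize
ORDER BY RowNum;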
Bastard Programmer from Hell
|
|
|
|
|
I am coming to this conclusion myself, but normal #temp tables won't persist between connections; if I am to use ##global or permanent tables, I'd need to know when to drop them - and I am not sure about that yet.
|
|
|
|
|
Trekstuff wrote: if I am to use ##global or permanent tables I'd need to know when to drop them - and I am not sure about that yet.
After a period of inactivity on the user's session? Keep a list of users and their temp-tables with a timestamp. Update the timestamp on each visit, and delete the temp-tables with a timestamp older than two hours.
Anyone waiting for more than two hours before they go to the next page can wait the extra ten minutes that it takes to recreate the temp-table.
It might be handy to have these "temp" tables in a different database; that way they won't pollute the backups of your production-database.
Disclaimer; I'm a winforms-programmer, and there may be easier ways to detect the end of a session.
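A rough sketch of that bookkeeping (every name here is invented; the sweep at the bottom could itself be a SQL Agent job that runs every few minutes):

-- Registry of which cache table belongs to which user, and when it was last used.
CREATE TABLE dbo.CacheRegistry
(
    UserName    sysname  NOT NULL PRIMARY KEY,
    CacheTable  sysname  NOT NULL,
    LastTouched datetime NOT NULL
);
GO

-- On every page visit, touch the user's entry.
UPDATE dbo.CacheRegistry
SET    LastTouched = GETDATE()
WHERE  UserName = SUSER_SNAME();   -- or whatever identifies the web user

-- Periodic sweep: drop caches that haven't been touched for two hours.
DECLARE @tbl sysname, @sql nvarchar(max);

DECLARE stale CURSOR LOCAL FAST_FORWARD FOR
    SELECT CacheTable
    FROM   dbo.CacheRegistry
    WHERE  LastTouched < DATEADD(hour, -2, GETDATE());

OPEN stale;
FETCH NEXT FROM stale INTO @tbl;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'DROP TABLE dbo.' + QUOTENAME(@tbl) + N';';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM stale INTO @tbl;
END
CLOSE stale;
DEALLOCATE stale;

DELETE FROM dbo.CacheRegistry
WHERE LastTouched < DATEADD(hour, -2, GETDATE());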
Bastard Programmer from Hell
|
|
|
|
|
Yup, I think this is what it's shaping up to be: a dedicated DB that'd hold permanent "temp" tables with the cached data.
Thanks!
|
|
|
|
|
You're welcome
|
|
|
|
|
I'll add my support to this one; we have used it successfully. We found that there were only a very limited number of users performing this task, so we would flag each dataset with their ID and store the results in a single table. We then gave the user the option to either query the "cache" or refresh it with a fresh run of the proc.
We then of course indexed the damned table to death, which killed the insert/delete but made queries acceptably fast.
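In outline it looked something like this (names changed, and the column list is obviously just an example):

-- One shared cache table; every row is flagged with the ID of the user who owns it.
CREATE TABLE dbo.QueryCache
(
    UserId    int          NOT NULL,
    Name      varchar(100) NULL,
    LastName  varchar(100) NULL,
    SSN       char(11)     NULL,
    OrderDate datetime     NULL
);

-- Indexed to death for the ad-hoc reads; the refresh (delete + insert) pays the price.
CREATE CLUSTERED INDEX IX_Cache_User     ON dbo.QueryCache (UserId);
CREATE INDEX           IX_Cache_UserName ON dbo.QueryCache (UserId, LastName);
CREATE INDEX           IX_Cache_UserDate ON dbo.QueryCache (UserId, OrderDate);
GO

-- "Refresh the cache": throw away the user's old rows and capture a fresh run of the proc.
CREATE PROCEDURE dbo.RefreshQueryCache
    @UserId int, @FromDate datetime, @ToDate datetime
AS
BEGIN
    CREATE TABLE #Raw (Name varchar(100), LastName varchar(100), SSN char(11), OrderDate datetime);
    INSERT INTO #Raw (Name, LastName, SSN, OrderDate)
    EXEC dbo.ExpensiveProc @FromDate = @FromDate, @ToDate = @ToDate;

    DELETE FROM dbo.QueryCache WHERE UserId = @UserId;

    INSERT INTO dbo.QueryCache (UserId, Name, LastName, SSN, OrderDate)
    SELECT @UserId, Name, LastName, SSN, OrderDate FROM #Raw;
END
GO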
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Yup, looks like this is the route we're going to take as well. Thanks
|
|
|
|
|
Trekstuff wrote: and return a few million records, I need to display that data in an ASP.NET Grid
Your requirements are badly flawed.
If you have one million records and a user looks at each record for only 1 second it would take them 7 weeks (8 hour days at 5 days a week) to look at the list.
|
|
|
|
|
I apologize, I must not have made my intentions clear. The user won't need to see all those few million records at once. He/she would need to make ad-hoc requests against this dataset: filter/sort/search it on the fly and expect results fast.
Thanks for the math lesson though
|
|
|
|
|
Then the best way is to ask the user for the criteria and then build an appropriate query from that.
|
|
|
|
|
User may have dozens of different criteria he/she would want to run ad-hoc. Those criteria aren't known in advance and need to be applied to the data on the fly.
|
|
|
|
|
Trekstuff wrote: User may have dozens of different criteria he/she would want to run ad-hoc.
May and will are different.
And standard business users will not have those requirements.
For example, anyone dealing with customers only needs a couple of criteria to find the customer/order.
And if you really meant "ad-hoc", then you have a power user; they should be proficient in SQL and have a SQL application, not a purpose-built app.
|
|
|
|
|
Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away
|
|
|
|
|
Trekstuff wrote: Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away
If they want unlimited queries, that is exactly what is needed, and wrapping it in another app doesn't change that.
And that is exactly the situation at a bank that I worked at. The VP in charge of the merchant credit services often (more than once a week) ran custom queries directly against the production database.
|
|
|
|
|
jschell wrote: If you have one million records and a user looks at each record for only 1 second it would take them 7 weeks (8 hour days at 5 days a week) to look at the list.
..assuming that they're looking at the records themselves, and not some kind of visualization of those records (like a chart) - that's assuming that they're looking at the data at all; for all I know they could be preparing an export to some Microsoft-Access database.
Bastard Programmer from Hell
|
|
|
|
|
This seems like a bad idea; no one is ever going to look at a million-plus records; at best they'll look at a small subset. At the very least, get them to filter before going to the database and only ever return as many records as they actually need to see, perhaps restricting it to a manageable number.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
|
|
|
|
|
That's the point: I don't know in advance what filters are going to be used. I can only restrict the data somewhat in the initial call to the SP, but I cannot call the SP for every filter request - that call is too expensive.
For example: the user needs to work with data for the past 3 months. That date filter I can pass to the SP, and the call can return several million records. Once the data is obtained, the user may want to get only rows whose "Name" field starts with "D", or whose SSN ends with 74, or group by LastName and get the count, etc. These requests limit the data to a much smaller subset, but they are unpredictable and are made against the original large resultset returned by the SP - not through the SP itself.
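To make it concrete, assuming the big resultset has been dumped into some cached table (dbo.CachedResults, UserId and the columns below are just placeholder names), those follow-up requests are trivial queries like:

DECLARE @UserId int;
SET @UserId = 42;   -- whatever key tags this user's cached run

-- "Name starts with D"
SELECT *
FROM   dbo.CachedResults
WHERE  UserId = @UserId
  AND  Name LIKE 'D%';

-- "SSN ends with 74"
SELECT *
FROM   dbo.CachedResults
WHERE  UserId = @UserId
  AND  SSN LIKE '%74';

-- "group by LastName and get the count"
SELECT LastName, COUNT(*) AS Cnt
FROM   dbo.CachedResults
WHERE  UserId = @UserId
GROUP BY LastName
ORDER BY Cnt DESC;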
|
|
|
|
|
Not getting your reasoning here: if they first filter on the last 3 months, why can't they also filter on other criteria at the same time? If it doesn't give them what they want, they start over. Again, only retrieve the records they really need - it's more sensible to build a filter screen and get what you need than to skip that and retrieve far more than they'll ever look at.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
|
|
|
|
|
Again, the only condition they know in advance is "last 3 months". It is passed to the SP, which is expensive to run - it may take several minutes (and please don't ask me to optimize it; that's not an option at the moment).
Once the data is there, it needs to be presented in a flexible, dashboard-style view. The user clicks the Name column to filter the data, observes the result, uses it elsewhere, removes the filter. The user searches for an SSN that ends in 74, uses that data, removes the filter. The user sorts the data by date, jumps to the first page, jumps to the last page.
Again, these requests cannot be passed to the SP - that would take too long. They need to be run against the subset of data returned by the SP (yes, those several million records), which presumably is stored in some local storage.
|
|
|
|
|
Trekstuff: We might have the same case as you do. You didn't describe what 'the expensive SP' is doing, but in our case it is a complex cross-tab query over about 5 million rows of data. In our case the time period is always one month, so we have a SQL Agent job that runs the query at 23:00 on the last day of each month. The results are then stored in permanent tables in a different database. Users then run their queries against those tables. The result is satisfying. We just have to 'torture' the server for about 3 to 5 hours on that night. Of course, the downside is that there is always some specific requirement that the prepared data warehouse can't satisfy. But we have a policy that such 'custom' needs have to be requested from the IT department first, with at least one day's wait for the result. That way, we can still have our tea time
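Roughly, the piece that the SQL Agent job calls looks like this (all names here are invented; our real proc and tables are of course different):

-- Rebuilds the month's reporting table in a separate database, off production hours.
CREATE PROCEDURE dbo.BuildMonthlyCrossTab
AS
BEGIN
    DECLARE @From datetime, @To datetime;
    SET @From = DATEADD(month, DATEDIFF(month, 0, GETDATE()), 0);  -- first day of this month
    SET @To   = DATEADD(month, 1, @From);                          -- first day of next month

    -- The warehouse tables live in their own database, away from the production backups.
    TRUNCATE TABLE ReportingDb.dbo.MonthlyCrossTab;

    INSERT INTO ReportingDb.dbo.MonthlyCrossTab (Name, SSN, OrderDate)
    EXEC dbo.ExpensiveCrossTabProc @FromDate = @From, @ToDate = @To;
END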
hth,
foxyland
|
|
|
|