|
Yup, you're right; this is not the right place to ask this. Trouble is, I'm not sure where the right place is, since the job board seems to have gone. Maybe the collaboration forum?
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair.
nils illegitimus carborundum
me, me, me
|
|
|
|
|
Thanks for the suggestion... I wasn't confident about posting it in the collaboration area, so I did a search on Google and found a neat website called Elance. It looks like it's set up for what I want to do. I haven't researched it much, so at first glance (and this may be unfair) I'm a little wary about what kind of talent pool is there, and I'm wondering if I could pay by the project, based on an initial estimate or bid, rather than an hourly rate.
If anyone knows any pros or cons to sites like that one, please let me know.
TIA
|
|
|
|
|
JohnnyG wrote: a mysql database
JohnnyG wrote: to handle potentially rapid growth
I recommend SQL Server (Express).
|
|
|
|
|
I'm trying to return two ref cursors from a procedure and having a bit of trouble. What I'm trying to do is grab the info from the first cursor, select a few fields out of it and join to some other info and stick the result into a table variable... then select distinct items from that table into the second cursor. But I can't get this to compile. Can someone tell me what I'm doing wrong please?
type T_CURSOR is REF CURSOR;

procedure FetchSL3Details_PRC
(
    c_items out T_CURSOR,
    c_identifiers out T_CURSOR,
    p_niin in char
) as
    v_idents IDENTIFIER_TABLE_TYPE := IDENTIFIER_TABLE_TYPE();

    -- A ref cursor cannot drive a cursor FOR loop, and reading c_items here
    -- would exhaust it before the caller ever saw a row, so the loop uses a
    -- separate explicit cursor over the same hierarchy.
    cursor cur_items is
        select
            its.commodity_id,
            its.created_id,
            its.modified_id
        from
            item_set its
        start with its.niin = p_niin
        connect by prior its.niin = its.parent_niin;
BEGIN
    for item in cur_items
    loop
        v_idents.extend;
        v_idents(v_idents.LAST) := identifier_row_type(item.commodity_id,
                                       get_group_name_fun(item.commodity_id),
                                       0);
        v_idents.extend;
        v_idents(v_idents.LAST) := identifier_row_type(item.created_id,
                                       get_formatted_name_fun(item.created_id),
                                       0);
        v_idents.extend;
        v_idents(v_idents.LAST) := identifier_row_type(item.modified_id,
                                       get_formatted_name_fun(item.modified_id),
                                       0);
    end loop;

    -- Opened after the loop so it is still unread when the caller gets it.
    open c_items for
        select
            its.item_set_id,
            its.niin,
            its.parent_niin,
            its.commodity_id,
            its.service_type,
            its.sl3_type,
            its.qty,
            its.created_id,
            its.created_dt,
            its.modified_id,
            its.modified_dt
        from
            item_set its
        start with its.niin = p_niin
        connect by prior its.niin = its.parent_niin;

    -- Querying a PL/SQL collection needs the TABLE() operator, and this only
    -- compiles if IDENTIFIER_TABLE_TYPE is a schema-level (SQL) collection
    -- type, not one declared inside a package.
    open c_identifiers for
        select distinct
            v.id,
            v.name,
            v.type
        from
            TABLE(v_idents) v;
END FetchSL3Details_PRC;
|
|
|
|
|
I have a vb.net stock control app that is installed on about 10 workstations. They read/write data to an Access 2007 DB on a server. For all the database interactions, I use routines like the one below. If the function returns false, then I display the error.
The stock figures are getting all messed up. I have checked that all the transactions work correctly on my pc - no problems at all. The users ASSURE me that they have never seen any error messages (I have also checked that, should a function return false, the error is displayed).
My question:
1) What happens if two users try to access the same record at the same time?
2) Would the code below throw an exception if the database transaction failed?
Public Function UpdateRecord2(ByVal strUpdate As String, params() As String) As Boolean
    Dim RecordsUpdated As Integer = 0
    Dim cnStr As String = "Provider=" & Provider & "; Data Source=" & _Database
    Using cn As New OleDb.OleDbConnection(cnStr)
        Using cmd As New OleDb.OleDbCommand(strUpdate, cn)
            ' OleDb parameters are positional: every placeholder is "?", so
            ' they must be added in the order they appear in strUpdate.
            For i As Integer = 0 To params.Length - 1
                If params(i) Is Nothing Then params(i) = ""
                cmd.Parameters.AddWithValue("?", params(i))
            Next
            Try
                cn.Open()
                ' No DataAdapter needed just to run a command.
                RecordsUpdated = cmd.ExecuteNonQuery()
            Catch ex As Exception
                frmMain.Err.DataMethod = "UpdateRecord2"
                frmMain.Err.DataErr = ex.Message
                Return False
            Finally
                ' Finally runs on both paths, so one Close() is enough.
                cn.Close()
            End Try
        End Using
    End Using
    If RecordsUpdated = 1 Then
        Return True
    Else
        frmMain.Err.DataMethod = "UpdateRecord2"
        frmMain.Err.DataErr = "No records updated"
        Return False
    End If
End Function
PS: I am trying to migrate the code to work on SQL Server 2008 Express...
|
|
|
|
|
Your PS indicates you are making the correct decision as to the future.
Access (at least in older versions) locks a "page" of records when required; I'm not sure of the page size, but it will cover more than one record at the end of the table. Additional attempts to access the locked records will get an exception.
You need to ensure that this is likely to be a real-world problem. In the 90s I had the same environment and never had a contention issue in several years of operation (we went to SQL Server eventually).
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Thanks for your reply. It seems from your answer that an exception would be thrown and caught in my code. I am leaning more towards the idea that MY CODE is causing the problem and that it has nothing to do with Access. I have to sort this out before I even think of SSE...
Let me visit Google and then possibly post another question on that...
|
|
|
|
|
Further to Mycroft's answer, IIRC Access supports finer-grained locking than page level. At least some versions can also do record-level locking (and maybe no locking at all, which would cause your problems). I suggest a quick google/MSDN search on "access record lock".
[disclaimer: It's been so long since I played with Access that I've forgotten more than I ever knew.]
HTH
Cheers,
Peter
Software rusts. Simon Stephenson, ca 1994.
|
|
|
|
|
I was not aware that they ever achieved record locking; we never had any contention issues, so I forgot about it. It was, after all, the 90s, and I suppose even Access has had some improvement since then!
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
My memories are VERY dim, but I'm sure Mr G and Mr W know.
In Jet versions before version 4, a page-locking model is used; in Jet 4, a record-locking model is employed. I can't remember how you set it up for locking or not.
Now for something that belongs in Coding Horrors, except it doesn't really qualify as coding:
The weirdest db corruption I ever had to fix was a shared dB4 database, using Codebase. In this setup, index files are "external", and some smartass self-appointed admin managed to have two of the six clients running from the shared db files on the server but with LOCAL C drive index files.
Cheers,
Peter
Software rusts. Simon Stephenson, ca 1994.
|
|
|
|
|
Thanks Peter. After some Googling on record locks, I think MY CODE is the problem, not Access.
|
|
|
|
|
Log the exceptions - that is better than asking (and believing in) the users.
With Access, you must make sure that the parameters are added in the correct sequence, since they all share the same name, "?".
|
|
|
|
|
Thanks Bernard!
Very good suggestion.... I'll implement some sort of logging.
Thanks also - I did check that the parameters are added in the correct order. I have tested and tested the code over and over on my laptop and I don't get errors, which is why I'm leaning towards some sort of concurrency issue. I also checked the locking settings on the database... Oops - they were set to None!!!!
|
|
|
|
|
I wrote: maybe no locking at all, which would cause your problems.
Richard.Berry100 wrote: I also checked the locking settings on the Database.... Oops - they were set to None!!!!
No further comment required.
Peter
Software rusts. Simon Stephenson, ca 1994.
|
|
|
|
|
I have a TSQL stored procedure that can run for a few minutes and return a few million records, and I need to display that data in an ASP.NET grid (Infragistics WebDataGrid, to be precise). Obviously I don't want to return all the data at once, so I need to set up some kind of paging: every time the user selects another page, another portion of data is loaded from the DB. But I can't run the SP every time a new page is requested - it would take too much time.
What would be the best way to persist the data from the SP, so that when the user selects a new page, the new portion can be loaded by a simple SELECT... WHERE from that temporary data storage?
|
|
|
|
|
|
Not exactly. These methods assume that the data is stored in a table. In my case an expensive SP is run to retrieve the data, and *after* that the user needs to do filtering/paging/sorting on it. These ad-hoc requests are made from ASP.NET code and need to be pretty fast. Do I need to store the resultset in an actual table (#temp tables won't work in this scenario), or are there other ways?
|
|
|
|
|
Trekstuff wrote: What would be the best way to persist data from the SP
Put it in a (temporary?) table, and page from there
Bastard Programmer from Hell
|
|
|
|
|
I am coming to this conclusion myself, but normal #temp tables won't persist between connections, and if I use ##global or permanent tables I'd need to know when to drop them - and I am not sure about that yet.
|
|
|
|
|
Trekstuff wrote: if I am to use ##global or permanent tables I'd need to know when to drop them - and I am not sure about that yet.
After a period of inactivity on the user's session? Keep a list of users and their temp tables with a timestamp. Update the timestamp on each visit, and delete any temp table whose timestamp is older than two hours.
Anyone waiting more than two hours before they go to the next page can wait the extra ten minutes that it takes to recreate the temp table.
It might be handy to keep these "temp" tables in a different database; that way they won't pollute the backups of your production database.
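A minimal T-SQL sketch of that bookkeeping - the registry table and its columns (dbo.CacheRegistry, LastAccessUtc) are made up for illustration, and the two-hour window matches the suggestion above:

```sql
-- Hypothetical registry mapping each user to their cache table,
-- with a timestamp updated on every page request.
CREATE TABLE dbo.CacheRegistry
(
    UserId        int      NOT NULL PRIMARY KEY,
    CacheTable    sysname  NOT NULL,
    LastAccessUtc datetime NOT NULL
);

-- Cleanup job: drop cache tables idle for more than two hours.
DECLARE @tbl sysname, @sql nvarchar(max);
DECLARE idle_caches CURSOR LOCAL FAST_FORWARD FOR
    SELECT CacheTable
    FROM   dbo.CacheRegistry
    WHERE  LastAccessUtc < DATEADD(hour, -2, GETUTCDATE());
OPEN idle_caches;
FETCH NEXT FROM idle_caches INTO @tbl;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'DROP TABLE ' + QUOTENAME(@tbl);  -- quoted to be safe
    EXEC (@sql);
    DELETE FROM dbo.CacheRegistry WHERE CacheTable = @tbl;
    FETCH NEXT FROM idle_caches INTO @tbl;
END
CLOSE idle_caches;
DEALLOCATE idle_caches;
```

Run it from a SQL Agent job (or any scheduled task) every few minutes; the worst case for a slow user is simply rebuilding their cache.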
Disclaimer; I'm a winforms-programmer, and there may be easier ways to detect the end of a session.
Bastard Programmer from Hell
|
|
|
|
|
Yup, I think this is what it's shaping up to be: a dedicated DB that holds "temporary" permanent tables with the cached data.
Thanks!
|
|
|
|
|
You're welcome
|
|
|
|
|
I'll add my support to this one; we have used it successfully. We found that only a very limited number of users performed this task, so we flagged each dataset with their ID and stored the results in a single table. We then gave the user the option to either query the "cache" or refresh it with a fresh run of the proc.
We then, of course, indexed the damned table to death, which killed the insert/delete but made queries acceptably fast.
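A hedged T-SQL sketch of that single-table pattern; the table, column, and variable names (dbo.ReportCache, #Results, @UserId, @Page, @PageSize) are invented for illustration:

```sql
-- One shared cache table, flagged per user; ROW_NUMBER() is precomputed
-- at insert time so paging becomes a cheap range seek on the PK.
-- (ROW_NUMBER() works on SQL Server 2005+, unlike OFFSET/FETCH.)
CREATE TABLE dbo.ReportCache
(
    UserId int          NOT NULL,
    RowNum int          NOT NULL,   -- precomputed position for paging
    Id     int          NOT NULL,
    Name   varchar(100) NULL,
    [Type] int          NULL,
    PRIMARY KEY (UserId, RowNum)
);

-- Refresh: clear the user's old rows, run the expensive proc into
-- #Results, then copy it in with row numbers assigned up front.
DELETE FROM dbo.ReportCache WHERE UserId = @UserId;

INSERT INTO dbo.ReportCache (UserId, RowNum, Id, Name, [Type])
SELECT @UserId,
       ROW_NUMBER() OVER (ORDER BY r.Id),
       r.Id, r.Name, r.Type
FROM   #Results AS r;

-- Page @Page (1-based) of size @PageSize:
SELECT Id, Name, [Type]
FROM   dbo.ReportCache
WHERE  UserId = @UserId
  AND  RowNum BETWEEN (@Page - 1) * @PageSize + 1
                  AND @Page * @PageSize
ORDER  BY RowNum;
```

Note the sketch indexes only the (UserId, RowNum) key that the paging query needs; every extra index you add for ad-hoc filtering speeds those queries up at the cost of slower refreshes, which is exactly the trade-off described above.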
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Yup, looks like this is the way we're going to take as well. Thanks
|
|
|
|
|
Trekstuff wrote: and return a few million records, I need to display that data in an ASP.NET Grid
Your requirements are badly flawed.
If you have one million records and a user looks at each record for only one second, it would take them about 7 weeks (8-hour days, 5 days a week) to look at the whole list.
|
|
|
|