|
You are right Dave, but I think there is a little difference between SQL and MySQL. I am not very conversant with SQL, but I believe that when you close a connection there, it actually closes the thread. In MySQL, as far as I have seen while working with it, even if you close the connection it does not really close but stays in "SLEEP" mode, and hence creates problems. BTW, I am using the command MySQLCLIENT.Connection.Close() to close the connection. Unfortunately this command leaves the connection as it is, as seen using processlist. If I am using the wrong command, or there is another command to close the connection, I would really like to know.
Anyway thanks a lot for the input.
|
|
|
|
|
I don't use MySQL and so am not an expert on it. I just pointed out that the practice of holding open a connection for the life of an application is greatly frowned upon and should never be used in production code.
Fix the problem, don't work around it.
|
|
|
|
|
MySQLCLIENT.Connection.Close()
Try calling Dispose when you're done. Wrap 'em in a Using block, and they get cleaned up correctly, without the need to call Dispose explicitly.
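For example (a minimal sketch, assuming the MySql.Data connector; the connection string values are made up):

```vb
' Sketch: Using guarantees Dispose (and thus Close) even when an exception is thrown.
' The connection string here is a hypothetical placeholder.
Dim connStr As String = "Server=localhost;Database=test;Uid=user;Pwd=pass;"

Using conn As New MySql.Data.MySqlClient.MySqlConnection(connStr)
    conn.Open()
    Using cmd As New MySql.Data.MySqlClient.MySqlCommand("SELECT 1", conn)
        Dim result = cmd.ExecuteScalar()
    End Using
End Using ' conn.Dispose() runs here
```

Note also that the connector pools connections by default, so a "closed" connection is usually just returned to the pool, and the server-side thread can still show as SLEEP in the processlist. That is normal and not a leak; if you really want the physical connection torn down, adding Pooling=false to the connection string should do it, at the cost of a full reconnect each time.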
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
|
|
|
|
Maybe I should have been clearer. My application was a very short-lived program which executed about a thousand queries. The whole program ran for less than 3 minutes.
It was kind of a crazy situation where I was processing data from one resultset (MS-SQL) and issuing queries to another server (AS400 DB2 via ODBC). I found that creating one connection, one SqlCommand and changing the value of the parameters gave me the best performance.
I agree that generally you don't want to hold onto a resource any longer than necessary, but in my case the overhead of opening and closing connections to the other server was more than I wanted.
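The pattern I used looks roughly like this (a sketch; the table, column, and variable names are hypothetical):

```vb
' Sketch: one connection and one command reused across many executions,
' with only the parameter values changing per call.
Using conn As New System.Data.SqlClient.SqlConnection(connectionString)
    conn.Open()
    Using cmd As New System.Data.SqlClient.SqlCommand(
            "INSERT INTO Target (Id, Value) VALUES (@id, @val)", conn)
        cmd.Parameters.Add("@id", SqlDbType.Int)
        cmd.Parameters.Add("@val", SqlDbType.NVarChar, 50)
        cmd.Prepare() ' let the server compile the statement once

        For Each row As DataRow In sourceTable.Rows
            cmd.Parameters("@id").Value = row("Id")
            cmd.Parameters("@val").Value = row("Value")
            cmd.ExecuteNonQuery()
        Next
    End Using
End Using
```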
Sorry I can't be more helpful with a mySQL specific answer. I'm just telling you what worked for me.
BTW: After re-reading the requirements of the original post, I believe my method of making one connection at the top of the application would be just fine. The author is loading data from Excel. Right? How long could this program take to run? A few minutes at worst. Having a single connection to the database for that period of time should be no problem.
|
|
|
|
|
Thanks David. It really did work.
|
|
|
|
|
You might want to look at this[^].
and if that doesn't work, go through these[^].
|
|
|
|
|
I need Visual Basic code for a 25x25 Sudoku game... please, can anyone help me...
modified 23-Aug-12 21:58pm.
|
|
|
|
|
As a matter of fact this is not urgent at all. Leave your homework to the last minute?
Why is common sense not common?
Never argue with an idiot. They will drag you down to their level where they are an expert.
Sometimes it takes a lot of work to be lazy
Please stand in front of my pistol, smile and wait for the flash - JSOP 2012
|
|
|
|
|
It may be urgent to you but not to anyone who volunteers their time answering questions here.
Simply put, your procrastination has gotten you in trouble. Good luck!
|
|
|
|
|
Not only have you left your homework to the last moment... you've not even attempted to Google...
On my first attempt with "visual basic sudoku code" I found this: Visual Basic Sudoku Solver and Generator[^]
which is an article here on CP that shows how to do it.
|
|
|
|
|
|
Only 7 hours?? You need to wait at least 24 when you post on sites where people volunteer their time answering questions, including CodeProject.
Why are you doing it this way instead of just binding the grid to the DataTable?? I see no reason to do otherwise, since you're using a DataTable as a backing store. If you're using a DataTable, I don't see why you're using Virtual Mode. DataTables are rather medium-weight objects for storing data (not very efficiently) and are good for small-ish tables.
If you're using DataTables with Virtual Mode, you're kind of defeating the purpose of Virtual Mode.
|
|
|
|
|
Thank you for your answer.
To answer your question why: I have written an ad-hoc query tool for a very large database. I have a query that returns 300K+ records, and it takes the DataGridView forever to load all those records when the DataTable is bound to the DataSource property. Hence I chose to use just-in-time data loading.
Apparently the facts in my post were insufficient in revealing I was dealing with large DataTables. My apologies for being unclear with the facts I provided.
|
|
|
|
|
Phoenix Hawke wrote: ... returning 300K+ records and it takes the DataGridView forever ...
Such a design deserves to fail. Nobody will want to read so many lines on screen.
|
|
|
|
|
Bernhard Hiller wrote: Such a design deserves to fail. Nobody will want to read so many lines on screen.
Thank you for the enlightening review of my design.
Let me make clear what an ad-hoc query tool is so the design will make more sense to you. An ad-hoc query tool allows users to design queries in a query designer, much like MS Access, using one or many tables and various fields. Given the large amount of flexibility in this kind of tool, a software designer has to take into account queries that will return a large amount of data. I realize that users aren't going to want to see a large number of records. However, they aren't going to want an interface that takes 10 minutes to load in order for them to realize that is the issue with the query they designed.
There is one more point I would like to bring forward. You offered absolutely no kind of solution to the issue. If you were trying to help, then you too have failed in your design. However, unlike you, I will offer another solution to yours: if you aren't going to provide any kind of useful or well-thought-out suggestions, but rather just criticism, then please do me a favor and move on to another thread where someone may be enlisting that kind of feedback.
Thanks
|
|
|
|
|
We also sometimes face the problem that a user selects filter values which would yield too many rows to be shown. And what do we do then?
We use a threshold for the row count. Of course, that threshold can be configured. If the row count exceeds the threshold, a warning message is shown, and the user can decide to continue with the crazy amount of data (well, he was shown a warning...) or stop and set new filter values.
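In code, that check looks roughly like this (a sketch; GetRowCount, LoadResults, and the threshold value are hypothetical names for illustration):

```vb
' Sketch: compare the row count against a configurable threshold before loading.
Dim threshold As Integer = 10000 ' read from configuration in practice
Dim rowCount As Integer = GetRowCount(query) ' e.g. a SELECT COUNT(*) wrapper

If rowCount > threshold Then
    Dim answer = MessageBox.Show(
        String.Format("The query returns {0} rows. Load them anyway?", rowCount),
        "Large result set", MessageBoxButtons.YesNo, MessageBoxIcon.Warning)
    If answer = DialogResult.No Then
        Return ' let the user refine the filters instead
    End If
End If
LoadResults(query) ' hypothetical method that fills the grid
```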
|
|
|
|
|
Bernhard Hiller wrote: Also we face sometimes the problem, that a user selects filter values for the data which would yield too many rows to be shown. And what do we do there? We use a threshold for the row count. Of course, that threshold can be configured. If the row count is more than the threshold, a warning message will be shown, and the user can decide to continue with the crazy amount of data (well, he was shown a warning...) or stop to set new filter values.
Thank you for your reply.
That is indeed a very good suggestion. So how might I put this into practice? Is there some event within the SqlClient.SqlDataAdapter class that fires during the execution of the query and tracks the rows returned? Or is this something that is implemented during the loading of the records into the DataGridView? I found that using a For Each statement to load records row by row is very slow and inefficient. Is there another method that can be used to stop the load after so many records have been loaded into the DataGridView?
Thanks
|
|
|
|
|
I do not know if there are events for that (I do not like the DataAdapters, and normally use a DataReader). Before running the query that fetches the data, you can get the row count with a simple query: SELECT COUNT(*) FROM (original select query). Since Microsoft SQL Server caches a lot, that is not a costly overhead.
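For example (a sketch; userQuery holds the ad-hoc SELECT the user designed, and conn is an open SqlConnection — note that SQL Server requires an alias on the derived table, hence the "AS q"):

```vb
' Sketch: get the row count before fetching the data.
Dim countSql As String = "SELECT COUNT(*) FROM (" & userQuery & ") AS q"

Using cmd As New System.Data.SqlClient.SqlCommand(countSql, conn)
    Dim rowCount As Integer = CInt(cmd.ExecuteScalar())
End Using
```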
|
|
|
|
|
Thanks for the assistance. I will give that a try.
However, I am still left wondering why the DataGridView behaves differently for the same query on two different attempts: why it's fast the first time but hangs the second, and why the blank rows show data from the DataTable when the DataGridView isn't bound and no data has been loaded.
Perhaps someday I will find out why.
|
|
|
|
|
Phoenix Hawke wrote: I have a query that is returning 300K+ records
Nobody is going to look through 300,000 records for anything. On top of that, you're loading all of those into a DataTable, burning up network bandwidth transmitting all those records and burning up memory because you're storing them all in a DataTable.
Virtual mode is intended to give you an opportunity to retrieve only the visible subset of the larger recordset being viewed. For example, if your displayed grid is only 20 rows high, you can retrieve only the 20 rows you can see from the 300,000 available.
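The skeleton of that approach looks like this (a sketch; pageCache and FetchRow are hypothetical names for a small cache that loads, say, 100 rows at a time from the database):

```vb
' Sketch of virtual mode: set grid.VirtualMode = True and grid.RowCount to the
' total (e.g. from SELECT COUNT(*)), and the grid then asks for each visible
' cell on demand instead of holding all 300K rows in memory.
Private Sub grid_CellValueNeeded(ByVal sender As Object,
        ByVal e As DataGridViewCellValueEventArgs) Handles grid.CellValueNeeded
    Dim row As DataRow = pageCache.FetchRow(e.RowIndex) ' loads a page if needed
    e.Value = row(e.ColumnIndex)
End Sub
```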
|
|
|
|
|
Dave Kreskowiak wrote: Nobody is going to look through 300,000 records for anything. On top of that, you're loading all of those into a DataTable, burning up network bandwidth transmitting all those records and burning up memory because you're storing them all in a DataTable. Virtual mode is intended to give you an opportunity to retrieve only the visible subset of the larger recordset being viewed. For example, if your displayed grid is only 20 rows high, you can retrieve only the 20 rows you can see from the 300,000 available.
Thank you for your feedback.
Your observation is very interesting. I say this because the tool returns 300K+ records in under 6 seconds: 4 of those seconds is the time it takes the query to return results, and the other 2 is the load time into the DataGridView. Please keep in mind we are dealing with an ad-hoc query tool which allows the user to design any query they want. It is up to the application to properly handle what they have designed and return the results in a reasonable amount of time.
I am really asking for advice on why the DataGridView works one way and then another when dealing with the exact same circumstances executed twice. I definitely want to enlist other design concepts, but criticism without another solution is frankly useless to me.
Thanks
|
|
|
|
|
OK, I've told you what the design goal of virtual mode was and how it was intended to work. What you do with that information is up to you.
|
|
|
|
|
I have to write code that asks the user on a production line for his identity card. The code would be read in from a light scanner. I would like to block the digit keys 0 to 9 with:
Private Sub op1start_KeyDown(ByVal sender As System.Object, ByVal e As System.Windows.Forms.KeyEventArgs) Handles op1start.KeyDown
    ' Keys.D0 through Keys.D9 are contiguous, so a range check covers all ten digits
    ' (the original listed Keys.D1 twice and never closed the Sub)
    If e.KeyCode >= Keys.D0 AndAlso e.KeyCode <= Keys.D9 Then
        e.SuppressKeyPress = True
    End If
    ' Note: this does not cover the numeric keypad (Keys.NumPad0 to Keys.NumPad9)
End Sub
when the user scans the card. The user has a tablet there, so he could also enter the code with the buttons. I have to prevent that.
Could I handle events in an InputBox?
Please help
|
|
|
|
|
lukaszo44 wrote: The user has a tablet there he could insert his code by buttons.
...and by using the clipboard (copy/paste), and probably by entering the ASCII value while holding the Alt key.
Why, isn't the code you posted working?
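For the clipboard part, one option (a sketch, not hardened against every input path) is to subclass the TextBox and swallow the WM_PASTE window message, which also covers pasting via the context menu:

```vb
' Sketch: a TextBox that ignores paste requests (WM_PASTE covers both
' Ctrl+V and the right-click context menu's Paste command).
Public Class NoPasteTextBox
    Inherits System.Windows.Forms.TextBox

    Private Const WM_PASTE As Integer = &H302

    Protected Overrides Sub WndProc(ByRef m As System.Windows.Forms.Message)
        If m.Msg = WM_PASTE Then
            Return ' swallow the paste silently
        End If
        MyBase.WndProc(m)
    End Sub
End Class
```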
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
|
|
|
|
|
Sounds like you are using a scanner which emulates a keyboard.
Basically, when you scan a barcode, it puts the characters into the keyboard buffer, so the application doesn't realize that the data is coming from a scanner rather than from the keyboard.
The problem with this type of scanner is that you have to have the focus set on the proper input box to capture the data.
What you want is a scanner which writes the data to the serial (USB) port and have your application be responsible for capturing the data. Using this method, there is no way for the operator to enter values into the system from the keyboard or copy/paste buffer.
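A minimal sketch of reading such a scanner with the .NET SerialPort class (the port name "COM3", the 9600 baud rate, and the HandleScan method are all hypothetical; check your scanner's documentation for the real settings):

```vb
' Sketch: receive scans from a serial (COM-port) scanner instead of the keyboard.
Private WithEvents scannerPort As New System.IO.Ports.SerialPort("COM3", 9600)

Private Sub scannerPort_DataReceived(ByVal sender As Object,
        ByVal e As System.IO.Ports.SerialDataReceivedEventArgs) _
        Handles scannerPort.DataReceived
    Dim scanned As String = scannerPort.ReadExisting()
    ' DataReceived fires on a worker thread: marshal back to the UI thread
    ' before touching any controls.
    Me.BeginInvoke(Sub() HandleScan(scanned)) ' HandleScan is hypothetical
End Sub
```

Call scannerPort.Open() at startup (e.g. in Form_Load) and Close() when the form closes.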
Good luck.
|
|
|
|