|
#1 - it loads all of the rows into memory at once. DataTable.Load | Reference Source[^]
Displaying 150,000 rows on screen in one hit is not a good idea. How is the poor user ever going to find the one row they're interested in? There must be some way to summarise, filter, or page the data to make it easier to consume.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Thanks for your response. I only chose 150,000 as a theoretical number to make my point.
There would never be that many rows, but in our main LOB application, some developers in the past have loaded up around 10,000 rows, making the form rather slow to open!
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
I agree with Richard; sending 150k records to the UI is only valid if you are doing a data dump, and there are more efficient ways to do that.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Yes, I agree with you and Richard. I'm working on creating a generic way to load the data only on demand. It needs to be business-object agnostic.
Thanks.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
If you are using SQL Server, I think there is a method of loading a "page" of data which may be valid. Obviously I have never used it; I limit UI lists to a max of 2k rows. After that, filters become mandatory.
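For reference, SQL Server 2012 and later support server-side paging with OFFSET/FETCH. A minimal sketch; the table and column names here are made up for illustration:
```sql
-- Fetch "page 3" of 50 rows from a hypothetical Orders table.
-- ORDER BY is mandatory when using OFFSET/FETCH.
SELECT OrderID, OrderDate, CustomerName
FROM dbo.Orders
ORDER BY OrderID
OFFSET 100 ROWS          -- skip the first two pages
FETCH NEXT 50 ROWS ONLY; -- return exactly one page
```
The UI then only ever holds one page in memory, and the page number maps directly to the OFFSET value.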
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
We have one form that displays Purchase Orders, and the data goes back to 2009. There are thousands upon thousands of records.
How can you limit how many rows you load, and yet still allow the user to type a column filter string to find any of the POs from any point in time?
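One common pattern is to push the filter to the server and cap the result size, so the user can search the full history without the client ever loading it all. A sketch with hypothetical table and column names:
```sql
-- Return at most 2000 matches for whatever the user typed.
-- @filter is a parameter supplied from the UI's filter box;
-- PurchaseOrders and its columns are illustrative names.
SELECT TOP (2000) PONumber, PODate, Vendor
FROM dbo.PurchaseOrders
WHERE PONumber LIKE @filter + '%'
ORDER BY PODate DESC;
```
If more than the cap matches, the UI can prompt the user to narrow the filter rather than render everything.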
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
Richard Andrew x64 wrote: IOW, I don't want my form to only appear on screen after all 150,000 rows have been pulled from SQL Server. Then how?
You mentioned a "form", so I'd assume WinForms with a DataGridView. In which case you don't use a datatable, but a list. You add to the list from a different thread, in batches of 50. Virtualize the DataGridView. Start the actions as soon as the form is displayed.
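A minimal sketch of the virtualized-grid approach described above (WinForms assumed; `grid` and the `rows` list are illustrative names, not part of the original post):
```csharp
// Assumes a DataGridView named 'grid' and a List<string[]> named 'rows'
// that a background thread fills in batches of 50.
grid.VirtualMode = true;
grid.CellValueNeeded += (s, e) =>
{
    // In virtual mode, only visible cells ask for their values,
    // so scrolling stays cheap regardless of total row count.
    if (e.RowIndex < rows.Count)
        e.Value = rows[e.RowIndex][e.ColumnIndex];
};
// As each batch arrives, marshal back to the UI thread and grow
// the row count so the grid knows more data exists:
//   grid.Invoke(new Action(() => grid.RowCount = rows.Count));
```
The form appears immediately, and rows become scrollable as batches land.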
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Eddy Vluggen wrote: You add to the list from a different thread, in batches of 50
Thank you. That's one of the avenues I am exploring. I have to find some way to make our application more responsive while still enabling access to thousands of rows.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
|
|
Thank you, this looks very promising.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
SQL server 2014/2016, SMO.
Using SMO to back up a database to 'D:\SQLBK\Mon\dbc.bak' on the server:
public int BackupDB(string serverName, string dbName, string backupFolder)
{
    Server oServer = new Server(new ServerConnection(serverName));
    Backup oBackup = new Backup();
    oBackup.Action = BackupActionType.Database;
    oBackup.Database = dbName;
    oBackup.Initialize = false;
    oBackup.Incremental = false;
    oBackup.LogTruncation = BackupTruncateLogType.Truncate;
    string sBackupFilename = Path.Combine(backupFolder, dbName + ".bak");
    oBackup.Devices.AddDevice(sBackupFilename, DeviceType.File);
    oBackup.SqlBackup(oServer);
    return 0;
}
How can I get the original database information (e.g. the database name) from the backup file 'D:\SQLBK\Mon\dbc.bak' through SMO?
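For what it's worth, SMO's Restore class can read the backup header (the SMO equivalent of RESTORE HEADERONLY), which includes the source database name. A hedged sketch, untested against this particular setup; `oServer` is assumed to be a connected SMO Server object as in the code above:
```csharp
// Sketch: read header metadata from a backup file via SMO.
// Restore.ReadBackupHeader wraps RESTORE HEADERONLY; the returned
// DataTable includes a DatabaseName column, among others.
Restore oRestore = new Restore();
oRestore.Devices.AddDevice(@"D:\SQLBK\Mon\dbc.bak", DeviceType.File);
DataTable header = oRestore.ReadBackupHeader(oServer);
foreach (DataRow row in header.Rows)
{
    Console.WriteLine(row["DatabaseName"]);
}
```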
|
|
|
|
|
w14243 wrote: How to get original database information from backup file 'D:\SQLBK\Mon\dbc.bak' through SMO? e.g. database name. The database name is not part of the backup. You back up the data, not metadata about the file.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
SQL Server 2014/2016, using SMO.
I want to back up database 'dbName' on server 'serverName' to the server folder 'backupFolder'.
1. The database is on a remote server, not the local machine.
2. The 'backupFolder' is on the remote server, e.g. "D:\SQLBK\Mon".
public int BackupDB(string serverName, string dbName, string backupFolder)
{
    Server oServer = new Server(new ServerConnection(serverName));
    Backup oBackup = new Backup();
    oBackup.Action = BackupActionType.Database;
    oBackup.Database = dbName;
    oBackup.Initialize = false;
    oBackup.Incremental = false;
    oBackup.LogTruncation = BackupTruncateLogType.Truncate;
    string sBackupFilename = Path.Combine(backupFolder, dbName + ".bak");
    oBackup.Devices.AddDevice(sBackupFilename, DeviceType.File);
    oBackup.SqlBackup(oServer);
    return 0;
}
The 'backupFolder' is a folder on 'serverName', not a local folder.
1. If 'backupFolder' does not exist on 'serverName', the 'SqlBackup' command throws a 'folder not exist' exception.
2. If I use 'Directory.CreateDirectory(backupFolder)', the folder is created locally, not on the remote server.
So I want to know:
1. Can the 'SqlBackup' command automatically create the specified backup folder? If so, how?
2. If that can't be done, how can I create 'backupFolder' on 'serverName' via SMO or some other method?
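One way to create the folder on the server side is the xp_create_subdir extended stored procedure, executed over the existing SMO connection. A sketch; it assumes the connecting login has permission to run it, and `oServer` is the connected SMO Server object from the code above:
```csharp
// Sketch: ask SQL Server itself to create the directory, so it is
// created on the server machine rather than on the client.
// xp_create_subdir is an extended stored procedure in master; it is
// a no-op if the directory already exists.
oServer.ConnectionContext.ExecuteNonQuery(
    "EXEC master.dbo.xp_create_subdir N'D:\\SQLBK\\Mon';");
```
Running this just before SqlBackup avoids the 'folder not exist' exception.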
|
|
|
|
|
Hi, I have a report which will produce different numbers of columns depending on the client that is selected. There may be a dozen clients, and the columns might be named differently, so it's all completely dynamic. I'm not quite sure how to arrange this in an SSRS report, or even whether it is possible. Can anybody please shed light on how I can do this? Thanks
modified 25-May-17 6:21am.
|
|
|
|
|
SQL Server 2014/16.
I want to create a table where some columns need to support:
1. Multiple NULLs.
2. If not NULL, the value must be unique.
3. The columns have no relationship to each other.
4. Uniqueness applies per column, not to the entire row.
How do I write the CREATE TABLE command?
It seems SQL Server only allows a single NULL in a UNIQUE column.
The example below shows what I want:
CREATE TABLE UniqueTest (
    col1 int,
    col2 int unique null
);
INSERT INTO UniqueTest VALUES (1, 1);    -- SUCCESS
INSERT INTO UniqueTest VALUES (2, 2);    -- SUCCESS
INSERT INTO UniqueTest VALUES (3, 2);    -- FAIL
INSERT INTO UniqueTest VALUES (4, NULL); -- SUCCESS
INSERT INTO UniqueTest VALUES (5, NULL); -- SUCCESS
I searched the internet, but most articles discuss older SQL Server versions, e.g. 2005/2008.
I thought SQL Server 2014/2016 might have a new CREATE TABLE option to meet this requirement, but I don't know of one.
The solution found online is to create an index on each such column.
But:
1. My table will have 10-30 columns that need unique-with-NULLs. If each column gets its own index, there will be 10-30 indexes on one table. Is that possible?
2. Won't that many indexes lower efficiency?
|
|
|
|
|
|
That question can't be replied to, so I'm posting it here.
From the answer, the constraint is unique on 'col1+col2', but my requirement is that col1 and col2 each have their own unique constraint.
Below example is what I wanted:
CREATE TABLE UniqueTest (
    col1 int,
    col2 int unique null
);
INSERT INTO UniqueTest VALUES (1, 1);    -- SUCCESS
INSERT INTO UniqueTest VALUES (2, 2);    -- SUCCESS
INSERT INTO UniqueTest VALUES (3, 2);    -- FAIL
INSERT INTO UniqueTest VALUES (4, NULL); -- SUCCESS
INSERT INTO UniqueTest VALUES (5, NULL); -- SUCCESS
|
|
|
|
|
Filtered Indexes and IS NOT NULL - Brent Ozar Unlimited®[^]
CREATE TABLE UniqueTest (
    col1 int,
    col2 int null
);
CREATE UNIQUE INDEX UX_UniqueTest_col2
    ON UniqueTest (col2)
    WHERE col2 IS NOT NULL;
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Below is my concern:
1. My table will have 10-30 columns that need unique-with-NULLs. If each column gets its own index, there will be 10-30 indexes on one table. Is that possible?
2. Won't that many indexes lower efficiency?
|
|
|
|
|
1. Yes.
2. Lots of indexes will potentially decrease the performance of inserts and updates, but no more so than any other method of enforcing your requirements, and with less chance of making a mistake.
A table with 30 columns might be a candidate for further normalization. But you would need to examine the data to determine that, and test what effect that might have on the performance and complexity of your queries.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Just use a trigger and move on. You have custom logic that SQL does not support.
There are two kinds of people in the world: those who can extrapolate from incomplete data.
There are only 10 types of people in the world, those who understand binary and those who don't.
|
|
|
|
|
RyanDev wrote: You have custom logic that SQL does not support.
Are you sure about that?
A filtered unique index can get you around the problem, like this:
CREATE UNIQUE INDEX uq_UsersAllowNulls_DisplayName on dbo.UsersAllowNulls ( DisplayName )
WHERE DisplayName IS NOT NULL;
GO
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Cool.
There are two kinds of people in the world: those who can extrapolate from incomplete data.
There are only 10 types of people in the world, those who understand binary and those who don't.
|
|
|
|
|
MS Access, Oracle, and MySQL all support unique-with-NULLs columns, but MS SQL Server doesn't.
It is confusing.
The 'CREATE UNIQUE INDEX' solution is the traditional one; I was hoping for a simpler, easier way.
So I thought SQL Server might have a new keyword to support unique-NULLs columns in its latest versions, e.g. something like UNIQUE_NULLS in SQL Server 2014/2016, but I don't know the exact CREATE TABLE settings. Maybe someone knows; maybe the option doesn't exist at all.
|
|
|
|