Hi guys,

Just a quick question:

I am creating an SQL file with a lot of INSERT queries (e.g., 8000 different INSERT queries).

I am running this file through cmd.

From cmd, I connect to a database, run the file with the INSERT queries, and then commit.

Can I do something to make it faster?

What I have tried:

I do not know what I can do; that's why I am asking you.
Posted
Updated 30-Aug-18 0:48am
Comments
Mike V Baker 27-Aug-18 11:39am    
Can you create the SQL file with 8 INSERTs of 1000 records each rather than 8000 INSERTs? What I usually do is set a threshold, like 1000 records. As I process items, I keep count of how many inserts are in the list, and when it reaches the threshold I run that insert.
Joe Doe234 28-Aug-18 1:04am    
It may be a good idea, but when it comes to cmd, cmd will run each query one by one, and that is what takes time. I will try bcp; maybe it works.
Mike V Baker 28-Aug-18 9:09am    
What I mean is that cmd will only need to execute 8 queries instead of 8000. Even though each query inserts 1000 records, it still runs faster than 8000 single-record inserts. Perhaps if you posted the SQL that cmd will run, I could show you what I mean.
Joe Doe234 4-Sep-18 6:21am    
If I have

insert into table1 values ('hello', 'bye');

8000 times, but the values are all different, how can I split them into 8 queries?

Thank you for your help.
Mike V Baker 4-Sep-18 9:09am    
INSERT INTO ComicCharacters (FirstName, LastName) VALUES
('Bugs', 'Bunny'),
('Yosemite', 'Sam'),
('Elmer', 'Fudd'),
('Grape', 'Ape');

Start with the INSERT INTO line, then just keep listing the values in parentheses. When it reaches the batch size, you run the command, clear the list, and start over. Use a comma between records and a semi-colon at the end. 1000 is an arbitrary number I picked so that I could show progress in the program.
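The batching described in this comment can be sketched with a small script that writes the SQL file. This is a minimal example, assuming a hypothetical table `table1(col1, col2)` with two text columns; the table name, column names, and batch size are illustrative, not taken from the thread.

```python
# Sketch: group many single-row inserts into multi-row INSERT statements.
# Assumes a hypothetical table "table1(col1, col2)"; batch size is arbitrary.

def batch_inserts(rows, batch_size=1000):
    """Yield one multi-row INSERT statement per batch of rows."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        values = ",\n".join(
            # Escape embedded single quotes by doubling them (standard SQL).
            "('%s', '%s')" % (a.replace("'", "''"), b.replace("'", "''"))
            for a, b in batch
        )
        yield "INSERT INTO table1 (col1, col2) VALUES\n%s;" % values

rows = [("hello%d" % i, "bye%d" % i) for i in range(8000)]
statements = list(batch_inserts(rows))
print(len(statements))  # 8 statements instead of 8000

with open("inserts.sql", "w") as f:
    f.write("\n\n".join(statements))
```

Note that some engines cap the number of rows per VALUES clause (SQL Server allows at most 1000 in a table value constructor), so 1000 is a safe batch size there.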

Can you please check whether any triggers or indexes are created on the table into which you are going to insert the data? If you find any, disable the triggers/indexes, and then try the insert.
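One way to apply this advice is to wrap the insert script with disable statements before the load and restore statements after it. The sketch below generates such a wrapper, assuming SQL Server T-SQL syntax (the thread mentions bcp, which is a SQL Server tool); the table, trigger, and index names are hypothetical placeholders.

```python
# Sketch: wrap a bulk-load SQL script with statements that disable
# triggers and nonclustered indexes first and restore them afterwards.
# Assumes SQL Server T-SQL syntax; do NOT disable a clustered index,
# as that makes the table inaccessible.

def wrap_bulk_load(insert_sql, table, triggers=(), indexes=()):
    """Return the script with disable statements before the inserts
    and rebuild/enable statements after them."""
    pre = ["DISABLE TRIGGER %s ON %s;" % (t, table) for t in triggers]
    pre += ["ALTER INDEX %s ON %s DISABLE;" % (i, table) for i in indexes]
    post = ["ALTER INDEX %s ON %s REBUILD;" % (i, table) for i in indexes]
    post += ["ENABLE TRIGGER %s ON %s;" % (t, table) for t in triggers]
    return "\n".join(pre + [insert_sql] + post)

script = wrap_bulk_load(
    "INSERT INTO table1 (col1) VALUES ('hello');",
    table="table1",
    triggers=["trg_audit"],      # hypothetical trigger name
    indexes=["ix_table1_col1"],  # hypothetical nonclustered index name
)
print(script)
```

As the comment on this answer points out, the rebuild and any work the triggers would have done must happen afterwards, which is why the wrapper emits the REBUILD and ENABLE TRIGGER statements at the end.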
 
 
Comments
Mike V Baker 4-Sep-18 9:05am    
Be aware that if you do disable the triggers and/or indexes during the operation then you'll need to rebuild the indexes and perform the operations in the triggers to ensure the data is correct.
Executing 8000 single-record inserts will definitely hurt performance; memory and CPU speed also count. Instead, split them into 8 queries of 1000 records each. That will improve performance.
 
 
Comments
Joe Doe234 4-Sep-18 4:29am    
So how can I split them? Can you please help me?

If I have:

insert into table1 values ('hello1');

8000 times, how can I split it into 8 queries?
Mike V Baker 4-Sep-18 9:01am    
INSERT INTO ComicCharacters (FirstName, LastName) VALUES
('Bugs', 'Bunny'),
('Yosemite', 'Sam'),
('Elmer', 'Fudd'),
('Grape', 'Ape');

Start with the INSERT INTO line, then just keep listing the values in parentheses. When it reaches the batch size, you run the command, clear the list, and start over. Use a comma between records and a semi-colon at the end. 1000 is an arbitrary number I picked so that I could show progress in the program.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)



CodeProject, 20 Bay Street, 11th Floor Toronto, Ontario, Canada M5J 2N8 +1 (416) 849-8900