Hi All,

I have a requirement to import multiple files from a folder into a local database table. The column structure is the same in every file, and the row count varies from 25,000 to 200,000 (2 lakh) per file.

Which tool would do this fastest: SQL Server, Oracle, or MySQL?

Please post your opinions.

Thanks in Advance

What I have tried:

I tried MySQL, using only the import wizard (no code), and it was dead slow.
Posted
Updated 15-Nov-17 16:55pm
Comments
Mehdi Gholam 15-Nov-17 9:43am    
Pick your database engine first; Oracle, MySQL, and SQL Server are different things.
sadiq_j 15-Nov-17 9:46am    
Which of them is the fastest?
Richard MacCutchan 15-Nov-17 10:50am    
Processing speed varies according to the structure and content of the database, so it is impossible to answer the question.
Santosh kumar Pithani 15-Nov-17 22:59pm    
SSIS packages are very fast for importing and exporting data records.

1 solution

If bulk copy and other bulk import options are not available on your SQL Server, you can still import a CSV-formatted file into your database using SQL Server Management Studio.

First, create a table in your database into which you will import the CSV file. After the table is created:
1. Log in to your database using SQL Server Management Studio.
2. Right-click the database and select Tasks -> Import Data...
3. Click the Next > button.
4. For Data Source, select Flat File Source, then use the Browse button to select the CSV file. Spend some time configuring the data import before clicking the Next > button.
5. For Destination, select the correct database provider (e.g. for SQL Server 2012, you can use SQL Server Native Client 11.0). Enter the Server name, check Use SQL Server Authentication, and enter the User name, Password, and Database before clicking the Next > button.
6. In the Select Source Tables and Views window, you can Edit Mappings before clicking the Next > button.
7. Check Run immediately and click the Next > button.
8. Click the Finish button to run the package.


--------------------------------------------------------------------
SQL query for importing a CSV file

Rows that aren't loaded into the table because of invalid data or formatting can be handled with the ERRORFILE option: specify an error file name, and the rows that fail will be written to that file. The code should look like this:

BULK INSERT SchoolsTemp                              -- target table
    FROM 'C:\CSVData\Schools.csv'                    -- source CSV file
    WITH
    (
    FIRSTROW = 2,                                    -- skip the header row
    FIELDTERMINATOR = ',',                           -- CSV field delimiter
    ROWTERMINATOR = '\n',                            -- a newline ends each row
    ERRORFILE = 'C:\CSVData\SchoolsErrorRows.csv',   -- rejected rows are written here
    TABLOCK                                          -- table lock for faster loading
    );
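
The wizard and the query above load a single file. For the original question (many CSV files in one folder), the same BULK INSERT can be run once per file from a small T-SQL loop. The following is only a sketch: it assumes the files live in C:\CSVData, that they all match the SchoolsTemp layout, and that the SQL Server service account can read the folder. It also relies on xp_dirtree, an undocumented extended stored procedure; an SSIS Foreach Loop container or a bcp batch script is a supported alternative.

-- Sketch: import every .csv file in a folder (folder and table names are assumptions)
DECLARE @folder nvarchar(260) = N'C:\CSVData\';
DECLARE @files  TABLE (subdirectory nvarchar(260), depth int, isfile int);
DECLARE @name   nvarchar(260), @sql nvarchar(max);

-- xp_dirtree lists the folder contents: depth 1, include files
INSERT INTO @files (subdirectory, depth, isfile)
EXEC master.sys.xp_dirtree @folder, 1, 1;

DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT subdirectory FROM @files
    WHERE isfile = 1 AND subdirectory LIKE N'%.csv';

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @name;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- BULK INSERT needs a literal file path, so build the statement dynamically
    SET @sql = N'BULK INSERT SchoolsTemp FROM ''' + @folder + @name + N'''
                 WITH (FIRSTROW = 2, FIELDTERMINATOR = '','',
                       ROWTERMINATOR = ''\n'', TABLOCK);';
    EXEC sys.sp_executesql @sql;   -- one bulk load per file
    FETCH NEXT FROM file_cursor INTO @name;
END

CLOSE file_cursor;
DEALLOCATE file_cursor;

The WITH options mirror the single-file example; ERRORFILE was left out because BULK INSERT refuses to reuse an existing error file, so you would need a distinct error-file name per CSV if you want that behaviour inside the loop.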
 