Hi, I am loading a DBF file into an existing SQL table in a C# console application. The DBF file has 850,000+ records. First I loaded the whole DBF file into a DataTable, which led to an OutOfMemoryException after processing 400,000+ records. So I switched to reading one record at a time and adding it to the DataTable as a row. That worked fine until about 600,000 records; after that I again get a System.OutOfMemoryException. Please, can anyone help me load the DBF file into a SQL table in a C# console application? My code is below. I have tried lots of code optimizations, but with no luck.
C#
public void ConvertDBFtoSQL()
{
    string connStr = @"Provider=VFPOLEDB.1;Data Source=dbf file path;Persist Security Info=False;User ID=Admin;Password=;";
    using (OleDbConnection oleDbConnection = new OleDbConnection(connStr))
    {
        oleDbConnection.Open();
        string query = "select * from dbftable";
        OleDbCommand oleDbCommand = new OleDbCommand(query, oleDbConnection);
        oleDbCommand.CommandTimeout = 600000;
        OleDbDataReader oleDbDataReader = oleDbCommand.ExecuteReader();
        int fieldCount = oleDbDataReader.FieldCount;
        DataTable dataTable = CreateDataTable();
        string[] rows = new string[fieldCount];
        while (oleDbDataReader.Read())
        {
            Array.Clear(rows, 0, rows.Length);
            for (int i = 0; i < fieldCount; i++)
            {
                try
                {
                    rows[i] = oleDbDataReader.GetFieldType(i).Name.ToLower() == "datetime"
                        ? "\"" + Convert.ToDateTime(oleDbDataReader.GetValue(i)).ToString("dd/MM/yyyy") + "\""
                        : (oleDbDataReader.GetValue(i).ToString().Contains(",")
                            ? ""
                            : "\"" + oleDbDataReader.GetValue(i).ToString().Trim() + "\"");
                }
                catch (Exception) { /* conversion errors for this field are swallowed */ }
            }
            string row = string.Join(",", rows);
            dataTable.Rows.Add(new object[] { row });
        }
    }
}


What I have tried:

First I tried dataTable.Load(oleDbCommand.ExecuteReader()); it failed with an OutOfMemoryException. Then I tried reading one row at a time and adding it to a DataTable.
Updated 11-Nov-16 1:13am
Comments
Michael_Davies 11-Nov-16 2:37am    
Why do you load them all? No one can possibly read all of them or need to see them all!
Tomas Takac 11-Nov-16 2:38am    
The data is just too big, you cannot load it all into memory. Why do you need to do that in the first place?
sankarisiva 11-Nov-16 2:43am    
Because I need to load all of this data into a SQL table. If the DataTable holds the whole data set, I can use the bulk insert option to load it into the SQL table.
Tomas Takac 11-Nov-16 2:50am    
No, you don't need to load it all into memory. SqlBulkCopy can use a data reader.

You can use data reader with SqlBulkCopy:
C#
const string sourceConnectionString = "abc";
const string targetConnectionString = "xyz";

using (var sourceConnection = new OleDbConnection(sourceConnectionString))
using (var targetConnection = new SqlConnection(targetConnectionString))
using (var sourceCommand = new OleDbCommand("select * from dbftable", sourceConnection))
{
    sourceConnection.Open();
    targetConnection.Open();

    // Stream rows straight from the DBF reader into SQL Server;
    // only one batch is held in memory at a time.
    using (var reader = sourceCommand.ExecuteReader())
    using (var bulkCopy = new SqlBulkCopy(targetConnection))
    {
        bulkCopy.DestinationTableName = "aTableName";
        bulkCopy.BatchSize = 1000;
        bulkCopy.WriteToServer(reader);
    }
}
 
Comments
sankarisiva 11-Nov-16 4:13am    
Sorry for the delay, and thanks for your reply. I have tried this method, but it produces a timeout exception and no records are inserted into the destination table.
F-ES Sitecore 11-Nov-16 5:46am    
Just process the data in chunks of a thousand or whatever instead of all at once. If you had to move 10 boulders from A to B and you weren't strong enough to carry them all at once, you would have to move them one at a time; it's the same concept here.
Tomas Takac 11-Nov-16 6:24am    
As F-ES said, set the batch size to an appropriate value. See my edit.
sankarisiva 11-Nov-16 8:50am    
Thanks. This method works fine
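For long-running copies like this, SqlBulkCopy also exposes a copy timeout and a progress callback. The sketch below is illustrative only; it reuses the "dbftable" source and "aTableName" destination from the answer above, and the specific values (timeout of 0, progress every 10,000 rows) are assumptions, not part of the original answer:

```csharp
using (var reader = sourceCommand.ExecuteReader())
using (var bulkCopy = new SqlBulkCopy(targetConnection))
{
    bulkCopy.DestinationTableName = "aTableName";
    bulkCopy.BatchSize = 1000;        // commit every 1,000 rows
    bulkCopy.BulkCopyTimeout = 0;     // 0 = no timeout for the whole copy
    bulkCopy.NotifyAfter = 10000;     // raise SqlRowsCopied every 10,000 rows
    bulkCopy.SqlRowsCopied += (s, e) =>
        Console.WriteLine(e.RowsCopied + " rows copied so far");
    bulkCopy.WriteToServer(reader);
}
```

BatchSize controls how many rows go to the server per round trip, while BulkCopyTimeout covers the entire WriteToServer call, so both matter for very large tables.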
 
 
Comments
sankarisiva 11-Nov-16 4:22am    
I have already tried those methods; they produce a syntax error exception.
Mehdi Gholam 11-Nov-16 4:37am    
1) Select less data by using a filter, and do it in chunks
2) Run on a 64bit system.
sankarisiva 11-Nov-16 4:44am    
I am running on 86bit
Mehdi Gholam 11-Nov-16 4:46am    
86bit??
sankarisiva 11-Nov-16 4:50am    
I have set my build platform as x86. Only then could I read the DBF file.
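Mehdi's "filter and chunk" suggestion could look something like the sketch below. The numeric key column `id` and the chunk size are hypothetical assumptions for illustration (the original post does not name a key column); whether a given filter expression works depends on what the VFP OLE DB provider's SQL dialect accepts:

```csharp
const int chunkSize = 100000;
// Hypothetical indexed numeric column "id"; substitute a real key column.
for (int start = 0; ; start += chunkSize)
{
    var chunkCommand = new OleDbCommand(
        "select * from dbftable where id >= ? and id < ?", sourceConnection);
    chunkCommand.Parameters.AddWithValue("p1", start);
    chunkCommand.Parameters.AddWithValue("p2", start + chunkSize);

    using (var reader = chunkCommand.ExecuteReader())
    {
        if (!reader.HasRows) break;   // no more chunks left
        bulkCopy.WriteToServer(reader);
    }
}
```

Each chunk is streamed and released before the next one is read, so peak memory stays bounded regardless of the total row count, which also sidesteps the x86 process address-space limit mentioned above.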

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


