Hello,

I have 2M records in a table which I am fetching through a stored procedure. I then want to convert them to JSON format. When I fetch the records and store them in a DataTable, I get the error below. How can I fetch 2M records and save them in JSON format?

Error: The memory could not be written.

What I have tried:

C#
DataTable dt = new DataTable();

using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["CRM"].ConnectionString))
using (var cmd = new SqlCommand("SPGetList", connection))
using (var da = new SqlDataAdapter(cmd)) // dispose the adapter as well
{
    cmd.CommandType = CommandType.StoredProcedure;
    da.Fill(dt); // This line throws the "memory could not be written" error.
}
return DataTableToJSONWithStringBuilder(dt);
Posted
Updated 11-Jul-17 21:10pm
v3

How should we know?
We have no idea what your DataTableToJSONWithStringBuilder method is doing - other than the broad outline the method name provides - nor any idea exactly which line is causing the problem.

So, it's going to be up to you.
Put a breakpoint on the first line of that code, and run your app through the debugger. Then look at your code and your data, and work out manually what should happen. Then single-step each line, checking that what you expected to happen is exactly what did. When it isn't, that's where you have a problem, and you can back-track (or run it again and look more closely) to find out why.

Sorry, but we can't do that for you - time for you to learn a new (and very, very useful) skill: debugging!


Quote:
Hi,
I am also thinking the same, but I am not a SQL expert. How can I fetch the records in batches, the first 50K and then so on?


That depends on your SQL Server version.
The old way is to use ROW_NUMBER() with an ORDER BY:
SQL
SELECT MyColumns
FROM (SELECT MyColumns, ROW_NUMBER() OVER (ORDER BY UniqueKeyColumn) AS RowNum
      FROM MyTable) s
WHERE s.RowNum BETWEEN @StartRow AND @EndRow
This works from SQL Server 2005 onwards.
SQL Server 2012 introduced OFFSET-FETCH:
SQL
SELECT MyColumns
FROM MyTable
ORDER BY UniqueKeyColumn
OFFSET @StartRow ROWS FETCH NEXT @RowsCount ROWS ONLY
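To tie that back to the C# side, here is a minimal sketch of calling such a paged procedure in a loop, processing 50K rows at a time instead of one giant Fill(). SPGetListPaged, its @StartRow/@RowsCount parameters and ProcessBatch are hypothetical names, not part of your existing code:
C#
// Sketch only: assumes a paged version of the stored procedure (here called
// SPGetListPaged) built around the OFFSET-FETCH query above, and a hypothetical
// ProcessBatch method that appends each batch's JSON to the output.
// Requires: using System.Configuration; using System.Data; using System.Data.SqlClient;
const int batchSize = 50000;
int startRow = 0; // with OFFSET-FETCH, the number of rows to skip

while (true)
{
    var dt = new DataTable();
    using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["CRM"].ConnectionString))
    using (var cmd = new SqlCommand("SPGetListPaged", connection))
    using (var da = new SqlDataAdapter(cmd))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@StartRow", startRow);
        cmd.Parameters.AddWithValue("@RowsCount", batchSize);
        da.Fill(dt);
    }

    if (dt.Rows.Count == 0)
        break; // no rows left

    ProcessBatch(dt); // hypothetical: convert this batch to JSON and append it
    startRow += batchSize;
}
Each iteration only ever holds one 50K-row batch in memory, which is what keeps the DataTable from blowing up.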
 
Comments
Telstra 12-Jul-17 1:13am    
Hi,

Thanks a lot for your response.
DataTableToJSONWithStringBuilder is just a method that converts the DataTable records into JSON format; that method is not the issue. When I debugged, I got the memory exception on the da.Fill(dt); line.
That is where I am unable to process the 2M records and store them in the DataTable, because of the DataTable's memory usage. Once the records are in the DataTable, DataTableToJSONWithStringBuilder() just converts them to JSON.
Could you please help me resolve that error?

Thanks,
Nilesh
CodeWraith 12-Jul-17 1:32am    
How about not trying to do everything at once? Loading and processing fewer rows at a time, in a loop until all rows have been processed, may help a lot.
OriginalGriff 12-Jul-17 1:40am    
If the error comes when you try to fetch all the rows at once - and that's what you are saying - then there is only one solution: don't fetch all the rows at the same time...
Telstra 12-Jul-17 2:04am    
Hi,
I am also thinking the same, but I am not a SQL expert. How can I fetch the records in batches, the first 50K and then so on?

Thanks in advance.
OriginalGriff 12-Jul-17 2:34am    
Answer updated
Can you try using a data reader instead of the adapter, and see if fetching the data is successful?

If the data reader works for fetching the data, you can then convert/store the returned result set into a DataTable or a complex type as required for further processing (i.e. converting to JSON).

You can also use a TimeSpan to compare the time required to fetch records with the adapter versus the reader; e.g. do this exercise for 50K, 150K, and 500K records. I am sure the data reader is going to be faster.
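For what it's worth, here is a minimal sketch of the reader approach, going one step further and streaming each row straight to a JSON file with Json.NET's JsonTextWriter, so the full 2M-row result never has to sit in a DataTable at all. The output path is illustrative:
C#
// Sketch only: stream rows from SqlDataReader directly into a JSON file.
// Requires: using System.Configuration; using System.Data; using System.Data.SqlClient;
//           using System.IO; using Newtonsoft.Json;
using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["CRM"].ConnectionString))
using (var cmd = new SqlCommand("SPGetList", connection))
{
    cmd.CommandType = CommandType.StoredProcedure;
    connection.Open();

    using (SqlDataReader reader = cmd.ExecuteReader())
    using (var file = new StreamWriter(@"C:\temp\records.json")) // illustrative path
    using (var json = new JsonTextWriter(file))
    {
        json.WriteStartArray();
        while (reader.Read()) // only one row is in memory at a time
        {
            json.WriteStartObject();
            for (int i = 0; i < reader.FieldCount; i++)
            {
                json.WritePropertyName(reader.GetName(i));
                json.WriteValue(reader.IsDBNull(i) ? null : reader.GetValue(i));
            }
            json.WriteEndObject();
        }
        json.WriteEndArray();
    }
}
Because nothing is buffered beyond the current row, memory use stays flat no matter how many rows the procedure returns.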
 
