|
after looking a little more, it may be better to use the CellValueChanged event...
sample
(note: this was tested only in C# WinForms, and the column ValueTypes are strings):
private void dataGrid_CellValueChanged(object sender, DataGridViewCellEventArgs e) {
    if ((e.RowIndex >= 0) && e.ColumnIndex != DataGrid.Columns["Total"].Index) {
        int r = e.RowIndex;
        int Qty = 0;
        int.TryParse((string)DataGrid["Quantity", r].Value, out Qty);
        double Price = 0D;
        double.TryParse((string)DataGrid["UnitPrice", r].Value, out Price);
        DataGrid["Total", r].Value = Price * Qty;
    }
}
modified 24-Dec-13 19:07pm.
|
|
|
|
|
This helps a lot....big thanks...
|
|
|
|
|
Hello All,
I have an ASP.Net Label error message at the top of the page, and for some reason MaintainScrollPositionOnPostback = false is not working. I would like the page to scroll back to the top so users can see this error message. Instead it keeps its existing scroll position, leaving the user to keep clicking the submit button.
I have tried this:
Specified on the code behind of asp.net user control:
Page.MaintainScrollPositionOnPostBack = false;
It does not scroll back to the top of the page as I would like it to, so the user cannot read the message at the top of the page.
Am I missing anything?
Thanks!
|
|
|
|
|
Hi all
I just moved an entire solution from my laptop to a server where VS 2012 is installed. When I try to debug step by step, one of my strings is always empty; VS 2012 says that it does not exist in this context. This is very simple code that tries to retrieve values from the app.config file:
string attributeWindowsLogin = ConfigurationManager.AppSettings["WindowsLogin"].ToString();
string attributeFullName = ConfigurationManager.AppSettings["FullName"].ToString();
string attributeAlpSid = ConfigurationManager.AppSettings["ALPSID"].ToString();
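For reference, the three lookups above assume an appSettings section along these lines (the values shown here are made-up placeholders, not the real config):

```xml
<configuration>
  <appSettings>
    <add key="WindowsLogin" value="DOMAIN\jdoe" />
    <add key="FullName" value="John Doe" />
    <add key="ALPSID" value="12345" />
  </appSettings>
</configuration>
```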
attributeFullName is always unknown to VS.
It is impossible for me to edit a class.
Please, what's wrong with my project?
|
|
|
|
|
Does the 'FullName' element exist in the config file?
Getting information off the Internet is like taking a drink from a fire hydrant.
- Mitchell Kapor
|
|
|
|
|
Hi
Of course this attribute exists. When I evaluate the code in the Watch window, it works.
Maybe it is a problem with a key of the project. It is impossible to edit the class!
|
|
|
|
|
Weird. I am using VS2013, but I also have VS2012 installed. I have never seen this issue before, though. Not sure what is causing it.
Getting information off the Internet is like taking a drink from a fire hydrant.
- Mitchell Kapor
|
|
|
|
|
How to upload flat file data to Oracle DB using C#?
Daljit S. Gill
modified 26-Dec-13 5:44am.
|
|
|
|
|
Hi,
I have written the code below to read a large text file (1.5 GB) using a BackgroundWorker thread. I have also used the StreamReader.ReadLine() method.
But I am receiving an OutOfMemoryException (I have marked in the code comments where I receive the exception).
Is there any way to read data from a large text file?
const string dataFile = @"D:\test.txt";
string line;

public Form1()
{
    InitializeComponent();
    InitializeBackgroundWorker();
}

private void InitializeBackgroundWorker()
{
    backgroundWorker1.DoWork +=
        new DoWorkEventHandler(backgroundWorker1_DoWork);
    backgroundWorker1.RunWorkerCompleted +=
        new RunWorkerCompletedEventHandler(
            backgroundWorker1_RunWorkerCompleted);
    backgroundWorker1.ProgressChanged +=
        new ProgressChangedEventHandler(
            backgroundWorker1_ProgressChanged_1);
}

private void button1_Click(object sender, EventArgs e)
{
    backgroundWorker1.RunWorkerAsync();
}

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    StringBuilder sb = new StringBuilder();
    using (FileStream fs = File.Open(dataFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    using (BufferedStream bs = new BufferedStream(fs))
    using (StreamReader sr = new StreamReader(bs))
    {
        while ((line = sr.ReadLine()) != null)
        {
            if (backgroundWorker1.CancellationPending)
            {
                e.Cancel = true;
                break;
            }
            sb.AppendLine(line + "\n"); // OutOfMemoryException is thrown here
            e.Result = sb;
        }
    }
}

private void backgroundWorker1_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    if (e.Cancelled)
    {
        MessageBox.Show("You've cancelled the backgroundworker!");
    }
    else
    {
        richTextBox1.AppendText(e.Result.ToString());
        MessageBox.Show("Done");
    }
}

private void backgroundWorker1_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    progressBar1.Value = e.ProgressPercentage;
}

private void backgroundWorker1_ProgressChanged_1(object sender, ProgressChangedEventArgs e)
{
}

private void button2_Click(object sender, EventArgs e)
{
    backgroundWorker1.CancelAsync();
}
|
|
|
|
|
To respond meaningfully to your query, I think we need to know more about the context:
0. structure of the file ? flat ? CSV ? JSON, XML, compressed ?
1. reading from ... where ? a local drive, a network server, the web ?
2. hardware: how much RAM do you, or your app's users' machines, have ?
3. use case: is it absolutely necessary to read the whole file, or can you read it in chunks, and process/transform it chunk-by-chunk ? see: [^] for an example of reading by chunks.
4. compression: is the file compressible in a way that could yield significant memory savings if you compressed the file first, loaded the compressed file into memory, and then operated on the compressed file as needed ... within the requirements of whatever you are doing with the file content, and the acceptable response times for operations on/with the file ?
“I'm an artist: it's self evident that word implies looking for something all the time without ever finding it in full. It is the opposite of saying : 'I know all about it. I've already found it.'
As far as I'm concerned, the word means: 'I am looking. I am hunting for it. I am deeply involved.'”
Vincent Van Gogh
modified 23-Dec-13 7:02am.
|
|
|
|
|
0. structure of the file ? flat ? CSV ? JSON, XML, compressed ?
flat file
1. reading from ... where ? a local drive, a network server, the web ?
local drive
2. hardware: how much RAM do you, or your app's users' machines, have ?
2 GB RAM
3. use case: is it absolutely necessary to read the whole file, or can you read it in chunks, and process/transform it chunk-by-chunk ? see: [^] for an example of reading by chunks.
Even if I read it in chunks it is not an issue, but I have to read the whole file.
In my text file, I will have data like:
12/12/2013 John 23/12/1978
New York USA 1234-567-897
What I am trying to say is, my data can span two different lines to form one complete record.
4. compression: is the file compressible in a way that could yield significant memory savings if you compressed the file first, loaded the compressed file into memory, and then operated on the compressed file as needed ... within the requirements of whatever you are doing with the file content, and the acceptable response times for operations on/with the file ?
If you compress it, can you still read the data?
BR,
Arjun
|
|
|
|
|
It's sheer coincidence that the out-of-memory exception is thrown where you marked it in your code. You essentially read the file a line at a time and then append each line to a StringBuilder instance. That means at some point so much memory is consumed that you'll run into that kind of exception, either when reading a new line or when trying to append said line to the StringBuilder.
First question: do you really need all those lines in memory at once? The program does not really do anything useful in that loop, as it just pours everything into that buffer.
Second question: how much memory does your system have, and is it a 32-bit or a 64-bit OS?
Regards,
— Manfred
"I had the right to remain silent, but I didn't have the ability!"
Ron White, Comedian
|
|
|
|
|
Hi,
It's ok even if I read chunks of data, but I should read entire lines. My text file will have data which spans to the next line to form one complete sentence.
For example:
12/12/2013 John 03/12/1978
New York USA 1803-345-233
The above data can also be on the same line:
12/12/2013 John 03/12/1978 New York USA 1803-345-233
So I want to read one complete line even if the file is read in chunks.
RAM: 2 GB
OS type: 64-bit
BR,
Arjun
|
|
|
|
|
Arjun Mourya wrote: So I want to read one complete line even if the file is read in chunks.
That still doesn't mean that you need the entire file in memory. Put what you read in a buffer until you reach the end of the sentence. Process the sentence, clear the buffer, continue reading.
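The buffer-per-record idea can be sketched like this. Nothing below comes from the actual file format: the RecordReader class and the completeness test used in the test below (a trailing NNNN-NNN-NNN phone token) are assumptions to illustrate the pattern; only one record's worth of text is ever held in memory.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class RecordReader
{
    // Yields one logical record at a time, joining physical lines until
    // the record is complete. isComplete is a placeholder predicate:
    // replace it with whatever marks the end of a record in the real file.
    public static IEnumerable<string> ReadRecords(TextReader reader, Func<string, bool> isComplete)
    {
        var buffer = new List<string>();
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            buffer.Add(line);
            string candidate = string.Join(" ", buffer.ToArray());
            if (isComplete(candidate))
            {
                yield return candidate;   // hand one record to the caller...
                buffer.Clear();           // ...then forget it: memory stays flat
            }
        }
        if (buffer.Count > 0)
            yield return string.Join(" ", buffer.ToArray()); // trailing partial record
    }
}
```

A caller would then do `foreach (string record in RecordReader.ReadRecords(reader, isComplete)) { /* insert into DB */ }`, so memory use stays flat regardless of file size.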
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Hi,
I was able to read the file and insert each line into the database. In order to show progress to the user, I used a BackgroundWorker with a progress bar. Below is my code:
const string dataFile = @"F:\Bharath CS\Document1.txt";

public Form1()
{
    InitializeComponent();
    InitializeBackgroundWorker();
}

private void InitializeBackgroundWorker()
{
    backgroundWorker1.DoWork +=
        new DoWorkEventHandler(backgroundWorker1_DoWork);
    backgroundWorker1.RunWorkerCompleted +=
        new RunWorkerCompletedEventHandler(
            backgroundWorker1_RunWorkerCompleted);
    backgroundWorker1.ProgressChanged +=
        new ProgressChangedEventHandler(
            backgroundWorker1_ProgressChanged_1);
}

private void button1_Click(object sender, EventArgs e)
{
    backgroundWorker1.RunWorkerAsync();
}

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    int count = 0;
    string prev = "";
    foreach (string line in File.ReadLines(dataFile))
    {
        if (backgroundWorker1.CancellationPending)
        {
            break;
        }
        backgroundWorker1.ReportProgress(count);
        try
        {
            MySqlConnection conn1 = new MySqlConnection("server=demo;port=1234;database=demodb;userid=xyz;pwd=xyz");
            conn1.Open();
            MySqlCommand cmd1 = new MySqlCommand();
            cmd1.Connection = conn1;
            string s = line.Replace("\"", "");
            if (s.Length > 0 && !(s.Contains("-")))
            {
                if (s.Contains("ED5."))
                {
                    cmd1.CommandText = "insert into yashomati_demo values('" + s + "')";
                    cmd1.ExecuteNonQuery();
                    s = "";
                    count++;
                }
                cmd1.Dispose();
                conn1.Close();
                conn1.Dispose();
            }
        }
        catch (Exception) { throw; } // rethrow without resetting the stack trace
    }
}

private void backgroundWorker1_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    if (e.Cancelled)
    {
        MessageBox.Show("You've cancelled the backgroundworker!");
    }
    else
    {
        progressBar1.Value = 100;
        MessageBox.Show("Done");
    }
}

private void backgroundWorker1_ProgressChanged_1(object sender, ProgressChangedEventArgs e)
{
    progressBar1.Value = e.ProgressPercentage;
}

private void button2_Click(object sender, EventArgs e)
{
    backgroundWorker1.CancelAsync();
}
The problem I am actually facing is that every line is getting inserted twice.
For example: if there are three lines in the file, all three lines get inserted first, and then the same three lines are inserted again (that is, the file seems to be read again and all lines inserted a second time).
BR,
Arjun
modified 24-Dec-13 6:18am.
|
|
|
|
|
Firstly, there are two things to notice here:
1) The maximum size of any object in .NET is 2GB - so any contiguous block of memory that the system tries to allocate (regardless of whether you are operating in a 32 or 64 bit environment) must be less than 2GB.
2) StringBuilder works by allocating a chunk of memory and copying your new data into it every time you Append anything. If the space is too small, then the memory buffer is doubled, and the existing content copied in before the new data is added.
So, if you are trying to read 1.5GB, then it will try to allocate memory well and truly in excess of that... Not only will this be pretty slow, but it will need a full 2GB chunk to hold your data, and there will be a lot of Very Large objects created on the way there.
Secondly, how long do you think it is going to take to display 1.5GB of rich text in a RichTextBox? And what earthly use would it be to the user to do that? Do you want to sit there and scroll through that much text looking for the bit you want?
I would very strongly suggest that you reconsider this whole approach, and look at creating something that the user can actually use without wanting to beat you over the head with his keyboard...
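To put rough numbers on the doubling behaviour (a back-of-the-envelope model, not the exact CLR implementation): growing a buffer geometrically from 16 chars until it can hold 1.5 billion chars ends with a single allocation of 2^31 chars, i.e. 4 GB of character data (chars are 2 bytes each), which is exactly the kind of request that hits the 2GB-per-object cap.

```csharp
using System;

class BufferGrowthModel
{
    static void Main()
    {
        long target = 1500000000L;   // ~1.5 billion chars to store
        long capacity = 16;          // a typical small starting capacity
        long copied = 0;             // chars copied across all the grow steps

        while (capacity < target)
        {
            copied += capacity;      // each grow: allocate a bigger buffer, copy the old content
            capacity *= 2;
        }

        Console.WriteLine(capacity); // 2147483648 chars = 4 GB of char data: over the 2 GB object cap
        Console.WriteLine(copied);   // plus roughly the same amount again copied along the way
    }
}
```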
|
|
|
|
|
Hi,
I don't mind pushing the whole data into a database (MySQL/MSSQL).
This code I had written for demo purposes only (just to check how long it would take to read a large text file).
Please point me to a link or provide me a sample code.
BR,
Arjun
|
|
|
|
|
Even then, you start doing silly things - a database is worse in many ways because there is no easy way in SQL to return a "chunk" of a column.
What are you trying to do with the data in the real world?
|
|
|
|
|
We try to read data from the file, insert it into a database, and generate reports on the data: which number was dialled from which extension.
But I am failing at the reading itself.
BR,
Arjun
|
|
|
|
|
So start by looking at the data - how is it organised?
Hopefully (since it has a .TXT extension) it is line based - if so, then it should be pretty easy to handle.
Have you tried
string[] lines = File.ReadAllLines(datafile);
If that works (and it should, even on a 64 bit system!), it gives you a chance to process each line and transfer it to a separate row in SQL - which would be a lot easier to work with!
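One caveat worth knowing: File.ReadAllLines materialises the entire string[] up front, while File.ReadLines (.NET 4.0+) streams lazily, one line at a time. A sketch of the streaming shape, with the database insert abstracted behind a delegate so the reading logic stands alone (the quote-stripping and blank-line filter mirror the code posted earlier in this thread; any extra per-line rules would go inside the loop):

```csharp
using System;
using System.IO;

static class LineLoader
{
    // Streams the file line by line and hands each cleaned line to `insert`
    // (e.g. a parameterised DB command). Only one line is in memory at a time.
    public static int ProcessLines(string path, Action<string> insert)
    {
        int count = 0;
        foreach (string line in File.ReadLines(path)) // lazy enumeration
        {
            string s = line.Replace("\"", "");        // strip quotes, as in the posted code
            if (s.Length == 0) continue;              // skip blank lines
            insert(s);
            count++;
        }
        return count;
    }
}
```

With a real database, `insert` would set a parameter on a prepared `INSERT ... VALUES (@line)` command and call ExecuteNonQuery - which also avoids building SQL by string concatenation.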
|
|
|
|
|
I have a similar problem to be done.
My requirement is to read one large file and split it into 2 files depending on the content.
File format is flat file containing records in each line.
Depending on the record, it will either go into 1st or second file.
What I am currently doing is to read the file one line at a time, check it and write it to either of the two new files created.
With this approach, it is taking around 3 hrs for a file size of 900 MB.
I would like to improve the logic for faster processing.
Can anyone suggest on a better approach ?
|
|
|
|
|
Well, this did it in 59.413 seconds with a 900MB text file, but it also removed all the duplicate lines (of which there were a lot, I don't keep huge text files lying around!)
string origPath = @"D:\Temp\MyHugeText.txt";
string inPath = @"D:\Temp\MyHugeTextIn.txt";
string notInPath = @"D:\Temp\MyHugeTextOut.txt";
var lines = File.ReadLines(origPath);
var isIn = lines.Where(l => l.Contains("raise"));
var notIn = lines.Except(isIn);
File.WriteAllLines(inPath, isIn);
File.WriteAllLines(notInPath, notIn);
Even with the select test reversed so that the duplicates are still written to disk, we are talking about 96.693 seconds, showing it's a bit disk limited!
Never underestimate the power of stupid things in large numbers
--- Serious Sam
|
|
|
|
|
Hi, this is executing fast. But I will check how long it takes if I try processing each and every line.
BR,
Arjun
|
|
|
|
|
Thanks OriginalGriff. I will try this out.
BTW, isn't this method using LINQ?
Because in my deployment scenario there is no .NET 4.0 framework installed, I may have to target 3.0 or an earlier framework.
I can modify this logic to not use LINQ, right?
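Yes, the same split can be done without LINQ. A sketch that should compile against .NET 2.0 (the paths and the "raise" test from the earlier post are placeholders): a single pass with plain StreamReader/StreamWriter, one line in memory at a time, and, unlike the Except-based version, it preserves duplicate lines.

```csharp
using System.IO;

static class LineSplitter
{
    // Single pass: each line goes to exactly one of the two output files.
    public static void Split(string sourcePath, string matchPath, string restPath, string needle)
    {
        using (StreamReader reader = new StreamReader(sourcePath))
        using (StreamWriter matches = new StreamWriter(matchPath))
        using (StreamWriter rest = new StreamWriter(restPath))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (line.Contains(needle))
                    matches.WriteLine(line);
                else
                    rest.WriteLine(line);
            }
        }
    }
}
```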
|
|
|
|
|