|
Mukesh Ghosh wrote: But as per our company policy i have to be microsoft certified. Then talk to the people responsible for the policy and ask them what certification you need.
|
|
|
|
|
Hey guys, is there an example of how to handle a username and password in a file?
Input the username and pass, look it up, and if it's OK write "ok"..
|
|
|
|
|
Most of us use a database for data storage, but this article[^] reads and writes to a file and may help you. Getting the logon details is left to the student!
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
|
Thx guys, I just need it really simple..
The part that I'm stuck on is the lookup of the user and pass.. I'm OK for one, but when it comes to strings I get confused..
|
|
|
|
|
techker2 wrote: when it comes to strings I get confused. In what way? You can read a file line by line and use a simple string compare (http://msdn.microsoft.com/en-us/library/858x0yyx(v=vs.110).aspx[^]) to find the keyword you are interested in. Alternatively, you could create a serializable class and let the objects do the work.
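To make that concrete, here is a minimal sketch of the line-by-line lookup. The file name `users.txt` and the `username:password` line format are my own assumptions, and storing passwords in plain text is only acceptable for a class exercise:

```csharp
using System;
using System.IO;

class Program
{
    // Scans users.txt (hypothetical file, one "username:password" per line)
    // and returns true if a line matches both values exactly.
    public static bool IsValidLogin(string path, string username, string password)
    {
        foreach (string line in File.ReadLines(path))
        {
            string[] parts = line.Split(':');
            if (parts.Length == 2 && parts[0] == username && parts[1] == password)
                return true;
        }
        return false;
    }

    static void Main()
    {
        Console.Write("Username: ");
        string user = Console.ReadLine();
        Console.Write("Password: ");
        string pass = Console.ReadLine();
        Console.WriteLine(IsValidLogin("users.txt", user, pass) ? "ok" : "failed");
    }
}
```

With a users.txt containing a line like alice:secret, entering alice and secret prints "ok".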
|
|
|
|
|
From ReadLine, something like:
string username = Console.ReadLine();
string password = Console.ReadLine();
(Note that Console.ReadLine() takes no arguments; call it once per value.)
|
|
|
|
|
|
Just started school not too long ago.. I just wanted to practice some stuff before a project..
Thx!
|
|
|
|
|
Then that book I suggested is an excellent introduction to C# and would help you a lot in any small project. It is very well written with lots of sample code, and easy to read, even if English is not your first language.
|
|
|
|
|
|
Great! Just downloaded the book..
thx for the help guys!
|
|
|
|
|
Hello All,
I'm new here. I'm rewriting an application I have in Java which uses Apache's Derby database. I am looking for the simplest embedded database solution I could use from within Visual Studio.
Would SQL Server Compact be a good choice? Once again, I favor simplicity over speed.
Thanks in advance,
Pawel
|
|
|
|
|
Paweł Mrozik wrote: SQL Server Compact be a good choice
Yes.
Paweł Mrozik wrote: use from within Visual Studio
Why would you do that?
|
|
|
|
|
Thank you very much. SQL Server Compact it is then.
|
|
|
|
|
I have an application that needs to retrieve a large dataset from the database (5 million records, possibly going up to 20 million) and then use it for stats calculations (such as range, mode, stddev, variance, ...). The application has memory spikes when loading the large dataset and storing the records in List collections; sometimes the application crashes due to memory usage.
Are there any alternatives? Should I use other technologies, such as NoSQL?
|
|
|
|
|
Once you retrieve the dataset in code, the database no longer matters.
You should consider setting up a configurable report and only pulling in the data needed for that report. I doubt that you are reporting on all 5 million records at any given time.
Additionally, you can retrieve the data grouped and sorted from the database.
|
|
|
|
|
Tricky since we don't know what you need to report, but pulling 5-20 million records into memory seems a bad idea in any case. Let the server do the work. There are numerous options available to you to speed things up.
You could calculate statistics "continuously" in the background, so when you pull the report (or stats) there is much less work left to get it out. Calculating counts, sums, averages etc. is pretty tough on resources and time with large datasets. If you analyze your queries you can probably optimize all this, but remember that aggregate functions render any index on that column useless.
Another option would be to write a server-side module that receives a "task" and notifies the user when it's done. You could write it in such a way that it consumes less memory, but it will probably take longer. You could perhaps optimize this by using several threads.
Both solutions have the drawback that when many users launch these calculations, the server comes to a standstill. Your best bet is probably to divide all this into bits and pieces that are calculated separately.
If this is a real production database and the calculations are frequent and significant, I'd recommend a new database optimized for reporting and go from there. (Database structure and setup can differ depending on the usage of the database.)
Just to give you some ideas. I have rarely seen "real-time" statistics on large datasets; rather, they have a delay (e.g. the statistics on my database are quite simple, but due to the size they are only calculated once a day).
Hope this helps...
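As a sketch of the "continuous" idea: Welford's online algorithm updates count, mean and variance one record at a time, so memory stays constant no matter how many rows stream past. The class and member names below are my own invention:

```csharp
using System;

class RunningStats
{
    private long n;
    private double mean;
    private double m2;   // accumulates squared distance from the current mean

    // Fold one new value into the running statistics (Welford's update).
    public void Add(double x)
    {
        n++;
        double delta = x - mean;
        mean += delta / n;
        m2 += delta * (x - mean);
    }

    public long Count => n;
    public double Mean => mean;
    public double Variance => n > 1 ? m2 / (n - 1) : 0.0;  // sample variance
    public double StdDev => Math.Sqrt(Variance);
}
```

You can feed it rows straight from a data reader as they arrive; nothing ever needs to be held in a List.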
|
|
|
|
|
What database are you using? If MSSQL or Oracle, you could write stored procedures or views to do the statistical calculations server-side and just return the results. This is likely to be much more efficient than pulling all of the data client-side and then calculating.
=========================================================
I'm an optoholic - my glass is always half full of vodka.
=========================================================
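For example, with MSSQL the whole calculation can be pushed into a single aggregate query, so only one summary row crosses the wire. The table name, column name and connection string below are hypothetical placeholders:

```csharp
using System;
using System.Data.SqlClient;

class Program
{
    // dbo.Measurements and its Value column are hypothetical;
    // COUNT_BIG, AVG, STDEV and VAR are standard T-SQL aggregates.
    public const string StatsQuery =
        @"SELECT COUNT_BIG(*)            AS N,
                 AVG(Value)              AS Mean,
                 STDEV(Value)            AS StdDev,
                 VAR(Value)              AS Variance,
                 MAX(Value) - MIN(Value) AS [Range]
          FROM dbo.Measurements";

    static void Main()
    {
        using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Database=Stats;Integrated Security=true"))
        using (var cmd = new SqlCommand(StatsQuery, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                    Console.WriteLine("n={0}, mean={1}, stddev={2}",
                        reader["N"], reader["Mean"], reader["StdDev"]);
            }
        }
    }
}
```

The server does the scanning and crunching; the client only ever sees one row instead of millions.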
|
|
|
|
|
As all the others have stated, you are using the wrong tool for the job. You are using the CLIENT, used to display stuff, to do the work of a SERVER, used to crunch stuff. Use the database to crunch the data the client needs to display.
Most LOB developers work with volumes like these and never do the crunching on the client. We have a policy that we never send more than 2k records to the client. There are exceptions but the business has to sign in blood to get them accepted.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Mycroft Holmes wrote: We have a policy that we never send more than 2k records to the client.
I like the fact that you have set up a real limit for this.
Out of curiosity, you do look at the record size too, right? E.g. I send way more than 2k records to my client website, but they are merely timestamp/value pairs. If I had to send several-megabyte blobs I would limit that severely.
|
|
|
|
|
V. wrote: you do look at the record size too right
No, all our data is text and none will amount to a lot per record so we don't go past the record count. I did catch one guy trying to pass a list of 1k records with attached documents up to 20mb EACH. Nailed that rather quickly. You want an attachment you get it one at a time, here have a list of file names!
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I suggest you download the data first, instead of keeping such a huge dataset in memory.
You can download the data to the client and store it in MS Access or a .TXT file, then use your code to do the calculations on the data.
|
|
|
|
|
Hi,
Does anyone have a good tutorial on pagination? I searched the internet but couldn't find a good walkthrough.
Thank you
|
|
|
|
|