Today I came across a scenario where we had a discussion on this topic.
We have some work on the database side: data from batch files is inserted (the row count is several thousand per operation), some text files are generated by Windows services, flag statuses are maintained based on the data inserted per operation, and many other small operations happen during the process. Obviously the correctness of the operation is a must, but at the same time the performance of the operation has come into the picture. Some teammates say that because so much processing happens per operation it will simply take time to complete, while others believe something can be done to reduce the time. The client is okay even if it takes time, as for him accuracy is more important than the time taken. How do you manage such scenarios? Beyond some level we cannot optimize the queries any further, as everything that is written is necessary. Has anyone faced such things? How do you manage to improve the performance?
Ravi Khoda
|
|
|
|
|
ravikhoda wrote: Client is okay even if that takes time as for him accuracy is more important rather than time taken.
So it appears that from the client's point of view there is not a problem with 'performance' - so who is pushing for 'performance'?
ravikhoda wrote: after some level we can not think of optimization on query as all the things which is written is necessary.
There are always tweaks one can make to speed things up: proper analysis with regard to indexes, table structure and partitioning; staging the data from the batch files and indexing those staged tables; improvements at the hardware level...
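To make the staging idea concrete, here is a minimal sketch (in Python, with SQLite standing in for whatever RDBMS is actually in use; every table and column name is invented for illustration): bulk-load the batch into an unindexed staging table, index it only afterwards, then merge with one set-based statement instead of thousands of row-by-row inserts.

```python
import sqlite3

# Illustrative sketch only: table/column names are made up, and SQLite
# stands in for the real database server.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, ref TEXT, amount REAL);
    -- Staging table: no constraints or indexes during the bulk load itself.
    CREATE TABLE orders_staging (ref TEXT, amount REAL);
""")

# 1. Bulk-load the batch file into the staging table in one transaction.
batch_rows = [(f"REF-{i}", i * 1.5) for i in range(5000)]
with conn:
    conn.executemany("INSERT INTO orders_staging VALUES (?, ?)", batch_rows)

# 2. Index the staged data only after loading, so the load stays fast but
#    the validation/merge queries that follow can use the index.
conn.execute("CREATE INDEX idx_staging_ref ON orders_staging(ref)")

# 3. Move validated rows into the real table in a single set-based statement.
with conn:
    conn.execute("""
        INSERT INTO orders (ref, amount)
        SELECT ref, amount FROM orders_staging WHERE amount >= 0
    """)

print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # -> 5000
```

The same pattern (load unindexed, index, merge) applies on SQL Server or any other engine, though the bulk-load mechanism differs per product.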
You have not given much information in order to be able to answer the question.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
ravikhoda wrote: Client is okay
So why waste time on something no one wants?!
ravikhoda wrote: accuracy is more important
So work on accuracy...
At a more theoretical level: split your operation into smaller chunks (it seems you have done that already, but see if there is more to split) and examine every step on its own. Also consider asynchronous handling. Sometimes you do not need all the data in real time, so you may use multiple (synchronized) databases, one for write operations and one for read operations... And many more things, most of which depend on your specific solution...
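As a hedged illustration of the asynchronous-handling idea (every name here is invented), a background worker can drain a queue of small chunks while the caller carries on, so the main flow is not blocked by per-chunk processing:

```python
import queue
import threading

# Sketch only: "item * 2" stands in for whatever real per-chunk work is done.
work_q = queue.Queue()
results = []

def worker():
    while True:
        item = work_q.get()
        if item is None:          # sentinel: no more work
            break
        results.append(item * 2)  # stand-in for the real processing

t = threading.Thread(target=worker)
t.start()

# The caller enqueues the smaller chunks of the big operation and moves on.
for chunk in range(10):
    work_q.put(chunk)
work_q.put(None)  # tell the worker to finish
t.join()

print(sum(results))  # prints 90
```

The same shape (queue plus worker) is what a Windows service doing deferred database work amounts to, just with a durable queue instead of an in-process one.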
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
|
|
|
|
|
ravikhoda wrote: we have some work on the database side where batch files data is insert
So is it done by a scheduled night job or something similar? If it is, is there a risk that other things might take too long or not get the data (other night jobs that need data from the previous job, etc.)?
If it is a separate night job with no other dependencies (and given that it doesn't take a long time, i.e. more than 3-6 hours), leave it be; then it's better to focus on accuracy. But it all comes down to what is done and what the import achieves.
If a user presses a button to start the job and expects the outcome fairly rapidly, then 3-6 hours is way too long and you need to work on both accuracy and performance.
ravikhoda wrote: does anyone face such things ?
Yes, constantly.
|
|
|
|
|
Basically you can answer this question yourself, by asking yourself a bunch of questions:
Is the process a critical part of the application?
Depending on how critical the process is, you can do several things to optimize performance. For example, if the process isn't that important to the basic operation, you can try to push it into the background, i.e. give it a lower priority.
On the other hand, if it is a critical operation, you have to find other means to optimize the process. And that can be very hard, especially if the code has to keep its accuracy. In that case you have to decide whether the work required to optimize the code is worth your client's money, because that is what it comes down to.
Are there opportunities to do optimizations/micro-optimizations on several parts of the code?
This will probably cause a heated discussion, but micro-optimizations do work if there are plenty of opportunities, i.e. benefit > time. The basic rule here is that a gain of ~5% or more should be worth the time. In reality those figures come down to a 2-3% gain, but depending on how much time your process takes, the total benefit can be huge.
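One concrete, hedged example of such a micro-optimization in a batch-insert context (Python/SQLite used purely for illustration, not the poster's actual stack): committing every single row versus one `executemany()` inside a single transaction. Both produce exactly the same data; only the cost differs.

```python
import sqlite3
import time

# Illustration only: SQLite stands in for the real database.
rows = [(i,) for i in range(20000)]

def naive(conn):
    for r in rows:                      # one commit per row
        conn.execute("INSERT INTO t VALUES (?)", r)
        conn.commit()

def batched(conn):
    with conn:                          # one transaction for the whole batch
        conn.executemany("INSERT INTO t VALUES (?)", rows)

counts = {}
for label, fn in (("naive", naive), ("batched", batched)):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (v INTEGER)")
    start = time.perf_counter()
    fn(conn)
    elapsed = time.perf_counter() - start
    counts[label] = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    print(f"{label}: {counts[label]} rows in {elapsed:.3f}s")
```

On a disk-backed database the gap between the two is typically far larger than in this in-memory toy, because each commit forces a flush.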
Can a technology change improve performance?
This is a big one, a very big one. Before you even consider switching to a different technology, you have to do "the math". Is it even possible to migrate the "old" storage to the "new" one? How much time, and in the end money, will this cost? Is the benefit high enough? And so on, and so forth.
This is a step I usually don't recommend, but it can pay off in the long run.
Is that what the client wants?
In the end, the client has the last word. If he is satisfied with the performance of the application and doesn't want any optimizations, then the situation is very clear. Even if other people, colleagues, think otherwise.
Greetings Daniel
|
|
|
|
|
You lost me already at the subject line.
If it's not accurate, it's buggy, and therefore wrong.
|
|
|
|
|
But all doubles can be rounded up into ints ...
We'll save 4 bytes! That's FIFTY PERCENT LESS MEMORY USAGE!!!
|
|
|
|
|
Thinking about performance is never a wasted exercise, even if you shouldn't implement the improvements right away. When the product matures and new functionality is added, you can optimize (and often by then it is needed).
Although the first thought should always be to optimize algorithms or indexes, other tricks exist that make applications perform better.
E.g. we used to have an application that showed a form with many comboboxes loaded with thousands and thousands of records from the database. They ALL had to be in there. To improve the performance of that WinForms application we did the following:
* First we used the SuspendLayout/ResumeLayout functions to avoid repeated re-layout while the comboboxes were being filled.
* Then we replaced the comboboxes with a textbox and a button. The button opened a small dialog with the records (which were loaded only then). Since nothing had to be loaded on form load, the user felt the application behaved better.
Others:
* When using grids you can also use paging. In listboxes there is a way to load only the visible part (plus a small extra part above and below); the rest is loaded when scrolling.
* You can use separate threads.
* You can perform calculations on the server and notify the application when done...
* ...
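The "load only the visible part" idea above can be sketched with keyset (seek) pagination: fetch one page at a time instead of all rows up front. This is an illustrative Python/SQLite mock-up with invented names, not the WinForms code described above:

```python
import sqlite3

# Illustration only: 250 invented rows, paged 50 at a time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
with conn:
    conn.executemany("INSERT INTO items(name) VALUES (?)",
                     [(f"item-{i:04d}",) for i in range(250)])

def fetch_page(after_id, page_size=50):
    # Keyset ("seek") pagination scales better than OFFSET for deep pages,
    # because the index seeks straight past the last-seen key.
    return conn.execute(
        "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
        (after_id, page_size)).fetchall()

page = fetch_page(after_id=0)
print(len(page), page[0][1], page[-1][1])   # prints: 50 item-0000 item-0049
print(len(fetch_page(after_id=page[-1][0])))  # next page, prints: 50
```

A list control only ever asks for the page covering the visible region, so scroll position maps to an `after_id` rather than to a full in-memory dataset.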
The point is, there are many different ways to improve performance. You don't need to implement them right away, but it's a good exercise to start thinking about options that might be acceptable to the client.
Hope this helps.
|
|
|
|
|
ravikhoda wrote: Client is okay even if that takes time as for him accuracy is more important rather than time taken.
The client is always right.
The report of my death was an exaggeration - Mark Twain
Simply Elegant Designs JimmyRopes Designs
I'm on-line therefore I am.
JimmyRopes
|
|
|
|
|
Not always; sometimes you have to convince the client to do the right thing.
For example, I had a client with an existing web application, and my task was to add features the original code was never designed for. It would have taken me longer to bolt the new code on than to rewrite the whole application from scratch.
So I had to convince my client to do "the right thing" and let me rework the whole application for the same price.
|
|
|
|
|
Hi there, I have an application in the need of a new feature...
|
|
|
|
|
How about, NO?
Did it once, and will never do it again.
|
|
|
|
|
Fair enough.
|
|
|
|
|
Daniel Lieberwirth (BrainInBlack) wrote: Not always, sometimes you have to convince the client to do the right thing.
In this case the client is doing the right thing. They want accuracy over performance. How can you argue with that?
JimmyRopes
|
|
|
|
|
I didn't argue, I just challenged your somewhat general statement that the client is always right. He isn't, and that's the point I wanted to make with my statement.
|
|
|
|
|
Whenever in doubt the customer is always right.
The only exception to that is if you do not want to be paid for your efforts.
JimmyRopes
|
|
|
|
|
1 - The client is right about WHAT they want. The client is usually never right about HOW to do it or how much it will cost.
2 - Accuracy over performance. Accuracy is usually a black-or-white determination (not always, but usually). Performance can be tweaked and dialed up in degrees over time and iterations. To paraphrase: first make it work, then make it work fast, then make it elegant.
|
|
|
|
|
jeffreystacks wrote: first make it work
The 3 biggest lies:
1 - I love you.
2 - I will respect you in the morning.
3 - Just make it work so we can release it and we will go back and refactor it later.
JimmyRopes
|
|
|
|
|
Depends. If numerical calculations are involved, there are technical limits to the accuracy you can achieve, and trying to push the tolerance limits below them won't achieve anything but a false sense of accuracy that simply isn't there.
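A tiny Python demonstration of those limits: a 64-bit double cannot distinguish relative differences smaller than its machine epsilon, so demanding tolerances tighter than that is meaningless.

```python
import sys

# Machine epsilon for a 64-bit IEEE 754 double, roughly 2.22e-16.
eps = sys.float_info.epsilon

print(0.1 + 0.2 == 0.3)               # False: neither side is exactly representable
print(abs((0.1 + 0.2) - 0.3) < 1e-9)  # True: a sane tolerance passes
print(1.0 + eps / 2 == 1.0)           # True: a half-ulp increment vanishes entirely
```

Any check stricter than the type's epsilon is testing rounding noise, not the calculation.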
|
|
|
|
|
Ravi Khoda wrote: we have some work on the database side where batch files data is insert (row count is several thousand per operations) [...] how do you manage things to improve the performance?
Let's not lose track of the task at hand. It is a database update, not the calculations to put a space probe on Uranus.
JimmyRopes
|
|
|
|
|
A valid point. Then again, the only meanings of accuracy I am aware of don't imply that the only alternatives are the data being there or not being there! If data is lost during a transaction, the transaction is not inaccurate, it is erroneous!
P.S.: I've just found that "free from errors" is actually one of the possible meanings of accurate[^].
modified 8-May-14 2:25am.
|
|
|
|