This is quite simple; all you need is:
- the time taken so far
- the percentage complete
Now, we can work out how long is left.
Think about this for a minute: if we have done 20% (1/5) of the processing, then the total time is going to be five times the elapsed time. If we have done 50% (1/2), then the total time will be twice the elapsed time.
If you notice the trend there, the total time is going to be elapsed_time * (1 / fraction_complete).
double nPercentage = 0.2;                            /* 20% complete */
double nElapsed    = 5.0;                            /* time taken so far */
double nTotalTime  = (1.0 / nPercentage) * nElapsed; /* estimated total time */
double nRemaining  = nTotalTime - nElapsed;          /* estimated time remaining */
Time units are not important, as long as you are consistent.
This is the code I used for testing, if you care:
#include <stdio.h>
#include <Windows.h>
void TimerStart(LARGE_INTEGER *pSpec) {
    QueryPerformanceCounter(pSpec);
}

/* Seconds elapsed since TimerStart() was called. */
double TimerQuery(LARGE_INTEGER *pSpec) {
    LARGE_INTEGER li, liFreq;
    QueryPerformanceCounter(&li);
    QueryPerformanceFrequency(&liFreq);
    return (li.QuadPart - pSpec->QuadPart) / (double)liFreq.QuadPart;
}

int main() {
    LARGE_INTEGER start;
    TimerStart(&start);
    for (int i = 0; i < 200; ++i) {
        Sleep(100);
        /* Use (i + 1) so the fraction is never zero on the first pass,
         * which would otherwise divide by zero. */
        double nPercentage = (i + 1) / 200.0;
        double nElapsed = TimerQuery(&start);
        double nTotalTime = (1.0 / nPercentage) * nElapsed;
        double nRemaining = nTotalTime - nElapsed;
        printf("%2.1f%% complete, estimated %f seconds remaining\n",
               nPercentage * 100.0, nRemaining);
    }
    printf("Total execution time: %f seconds\n", TimerQuery(&start));
    return 0;
}
EDIT:
This is a simple approach. It will only be accurate as long as each 1% of the work takes roughly the same time as any other.
For example, copying files: if you use the number of files copied/remaining as the percentage, then a 1 GB file will be estimated to take the same time to copy as a 1 KB file. In that case, either smooth the estimate with a long-term average, or use data size rather than file count as the percentage metric.
Since you are using data size, this should not be an issue, but keep it in mind if you need to do timings at any other stage.