
Comments by Rasepretrep2 (Top 1 by date)

Rasepretrep2 18-Nov-10 11:24am View    
I think the time-consuming bottleneck in this case, for either MD5 or SHA1, is that you have to read the entire file to generate the checksum. The MD5 algorithm itself is really pretty fast (mostly just shifts and adds); the I/O time is the limiting factor here.
Unfortunately, there is no way around that. You could try calculating an MD5 sum over only portions of each file to speed up the process, but then a change in the skipped data would go undetected.
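To illustrate the point about I/O being the bottleneck, here is a minimal sketch in Python using the standard `hashlib` module: the file is read in fixed-size chunks so memory use stays constant, but every byte still has to pass through the disk. The function name `file_md5` and the chunk size are my own illustrative choices, not anything from the original discussion.

```python
import hashlib

def file_md5(path, chunk_size=64 * 1024):
    """Compute the MD5 of a file by streaming it in chunks.

    The hash update itself is cheap; the time is dominated by
    reading the file from disk, which is unavoidable if every
    byte must contribute to the checksum.
    """
    h = hashlib.md5()
    with open(path, "rb") as f:
        # iter() with a sentinel keeps calling f.read() until it
        # returns b"" at end-of-file.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Hashing only the first chunk of each file (the "portions" shortcut mentioned above) would just mean returning after one `h.update()` call, with the stated risk that changes elsewhere in the file go unnoticed.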