Hi all,

I'm looking for a better solution for my digital web shop.

My situation is that I have 6 packages for sale, each containing 1GB of files on my server.
These files will be uniquely watermarked before being compressed for download,
so that any file can be traced back to the buyer who downloaded it (for anti-piracy reasons).

The problem is that each file is still 700-800MB after compression. Assuming 50 downloads per product per day, that works out to (50 downloads x 6 products x 700MB each) = 210GB of files on my server every day, if I keep them around for a day so my buyers can download them.

Is there a more practical method to implement this?
Or is this how the experts in the industry do it, just by buying lots and lots of disk space?

Thanks,
Brian

1 solution

Is there any reason you can't do the watermarking on-the-fly as you are serving the download? That way, all you need to store for more than a few minutes is stuff you're keeping anyway - customer name and watermark key. To do this, you will probably need to set up your watermarker as an external CGI (from Apache's perspective) that you can call from the download page. Warning: You might have timeout issues, depending on how long the watermarking process takes.
A nicer but more complex approach would be to recast your watermarker as an Apache module and stream the results. MUCH more work, though, and quite likely less flexible.

[edit] added in response to OP's comment

For CGI, start here[^]. Essentially, whatever your program/script writes to stdout gets sent to the client.
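To make that concrete, here is a minimal sketch in PHP (the OP's language). The `watermark-tool` command and the `/data/package1` path are hypothetical placeholders; substitute your own watermarker and storage layout:

```php
<?php
// Hedged sketch: watermark on the fly and stream the result straight to
// the buyer, so the full watermarked file is never stored on disk.
// "watermark-tool" and the paths below are HYPOTHETICAL placeholders.

$buyerId = preg_replace('/\D/', '', $_GET['buyer'] ?? '');  // keep digits only

if ($buyerId !== '') {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="package1.zip"');

    // Whatever the pipeline writes to stdout goes straight to the client.
    $cmd = 'watermark-tool --buyer ' . escapeshellarg($buyerId)
         . ' ' . escapeshellarg('/data/package1');
    $pipe = popen($cmd, 'r');
    fpassthru($pipe);
    pclose($pipe);
}
```

The key point is the one Peter makes: the script's stdout *is* the HTTP response body, so as long as the watermarker can write to a pipe, nothing needs to be staged on disk.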
Can you get the customer to wait a minute or two while you prepare the download? Show 'em a movie or something :)
The other point I should have mentioned is that you can buy a 2TB HDD (almost 10 days' worth at your numbers) for around US$100. Maybe the old mainframe mantra of "throw more hardware at it" is the most painless way to go.

[/edit]

[edit2] in response to second comment

A zip or rar might be what you're looking for: compression and multiple files in one. IIRC the 'directory' in a zip file is at the end, so you can stream multiple files out, remember the details, and write the directory last.
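That end-of-archive directory is what makes streaming feasible. Info-ZIP's `zip` documents `-` as an archive name meaning "write the archive to standard output", so a sketch in PHP could look like this (file names are placeholders):

```php
<?php
// Sketch: stream several files as ONE zip download without building a
// temporary archive on disk. Relies on Info-ZIP's documented convention
// that "-" as the archive name sends the archive to standard output.
// The file names below are placeholders for the order's real files.

$files = array_filter(['file1.avi', 'file2.avi'], 'file_exists');

if ($files) {
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="order.zip"');

    // -q quiet; -0 store only, since the media is already compressed.
    // zip writes its central directory last, which is why it can stream
    // without knowing the final archive size up front.
    $cmd = 'zip -q -0 - ' . implode(' ', array_map('escapeshellarg', $files));
    $pipe = popen($cmd, 'r');
    fpassthru($pipe);
    pclose($pipe);
}
```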

[/edit2]

Cheers,
Peter
 
Comments
funniezatee 5-Apr-11 0:29am    
Hi Peter,

Actually, I did test on-the-fly watermarking. My code is in PHP, since it's an Apache/Linux server, so no C# or Java for me.

After testing, I found that it takes ~1 min to copy a new set of files to work on, 30-50 secs to watermark them, and another minute to compress them.
Most of the time is not in the watermarking process itself, but tied up in file I/O.

I'm not quite sure how to implement a CGI script or an Apache module. Can I just put PHP code into them?
Are they worth considering?

Brian
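Since most of the measured time is file I/O rather than watermarking, one option is to chain the stages in a single pipeline so the intermediate copy never touches the disk. A sketch, again with a hypothetical `watermark-tool`, a placeholder path, and `gzip` standing in for the compressor:

```php
<?php
// Sketch: chain watermarking and compression in one pipeline so the
// intermediate copy (the ~1 min of file I/O measured above) is avoided.
// "watermark-tool", the path, and the buyer id are HYPOTHETICAL.

$src = '/data/package1/movie.avi';   // placeholder path

if (file_exists($src)) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="movie.avi.gz"');

    // Each stage streams into the next; nothing is written to disk.
    $cmd = 'watermark-tool --buyer 42 ' . escapeshellarg($src) . ' | gzip -1';
    $pipe = popen($cmd, 'r');
    fpassthru($pipe);
    pclose($pipe);
}
```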
funniezatee 5-Apr-11 2:22am    
Okay, I was checking out this on-the-fly streaming idea, and it seems quite possible in PHP. I wonder if it's possible to stream multiple files continuously, one after the other, in a single download? Is there such a thing?

Brian

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


