[Beowulf] Accelerator for data compressing
Bruno Coutinho
coutinho at dcc.ufmg.br
Thu Oct 2 19:42:32 PDT 2008
2008/10/2 Bill Broadley <bill at cse.ucdavis.edu>
<...>
> Why hardware? I have some python code that managed 10MB/sec per CPU
> (or 80MB on 8 CPUs if you prefer) that compresses with zlib, hashes
> with sha256, and encrypts with AES (256 bit key). Assuming the
> compression you want isn't substantially harder than doing zlib,
> sha256, and aes a single core from a dual or quad core chip sold in
> the last few years should do fine.
>
> 1TB every 2 days = 6MB/sec or approximately 15% of a quad core or 60%
> of a single core for my compress, hash and encrypt in python.
> Considering how cheap cores are (quad desktops are often under $1k)
> I'm not sure what would justify an accelerator card. Not to mention
> picking the particular algorithm could make a huge difference to the
> CPU and compression ratio achieved. I'd recommend taking a stack of
> real data and trying out different compression tools and settings.
>
> In any case 6MB/sec of compression isn't particularly hard these
> days.... even in python on a 1-2 year old mid range cpu.
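
For reference, here is a minimal sketch of the kind of compress/hash/encrypt
pipeline Bill describes. This is not his actual code: the 1 MB chunk size,
the throwaway key/IV, and the PyCrypto/pycryptodome AES interface are
assumptions of mine.

import os
import zlib
import hashlib
from Crypto.Cipher import AES   # PyCrypto / pycryptodome (assumed dependency)

CHUNK = 1024 * 1024             # read input in 1 MB chunks (arbitrary choice)
KEY = os.urandom(32)            # 256-bit AES key, throwaway for this sketch
IV = os.urandom(16)             # CBC initialisation vector

def process(infile, outfile):
    # Compress with zlib, hash with sha256, encrypt with AES-256 in CBC mode.
    sha = hashlib.sha256()
    comp = zlib.compressobj(6)
    aes = AES.new(KEY, AES.MODE_CBC, IV)
    pending = b""
    with open(infile, "rb") as src, open(outfile, "wb") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            sha.update(block)               # hash the uncompressed data
            pending += comp.compress(block)
            # AES-CBC wants 16-byte multiples; keep the remainder for later
            cut = len(pending) - (len(pending) % 16)
            if cut:
                dst.write(aes.encrypt(pending[:cut]))
                pending = pending[cut:]
        pending += comp.flush()
        pad = 16 - (len(pending) % 16)      # simple PKCS#7-style padding
        pending += bytes([pad]) * pad
        dst.write(aes.encrypt(pending))
    return sha.hexdigest()

Timing a loop like that on a stack of real data is probably the quickest way
to see whether a single core keeps up with the 6MB/sec target.
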
In Information Retrieval, almost everything is compressed, and there are
papers showing that using compression can result in a *faster* system: you
spend a little more CPU, but gain a lot in effective disk throughput.
If you compress data before it is even stored, the system can also write
faster, because each stored file consumes less disk/storage bandwidth.
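
To make the bandwidth argument concrete, here is a toy comparison of storing
the same data raw versus zlib-compressed (the repeated text sample and the
compression level are arbitrary choices):

import zlib

def store_raw(data, path):
    with open(path, "wb") as f:
        f.write(data)
    return len(data)                  # bytes actually written to storage

def store_compressed(data, path, level=6):
    packed = zlib.compress(data, level)
    with open(path, "wb") as f:
        f.write(packed)
    return len(packed)                # usually far fewer bytes for text-like data

sample = b"the quick brown fox jumps over the lazy dog\n" * 100000
raw = store_raw(sample, "sample.raw")
packed = store_compressed(sample, "sample.z")
print("raw: %d bytes, compressed: %d bytes (%.1f%% of original)"
      % (raw, packed, 100.0 * packed / raw))

Every byte saved there is a byte of disk/storage bandwidth the system never
has to move.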