[Beowulf] Project Heron at the Sanger Institute
jaquilina at eagleeyet.net
Thu Feb 4 10:14:17 UTC 2021
I am curious, though: to chunk out such large data, are platforms like Hadoop/HBase and the like being used?
From: Beowulf <beowulf-bounces at beowulf.org> on behalf of Jörg Saßmannshausen <sassy-work at sassy.formativ.net>
Sent: 03 February 2021 19:23
To: beowulf at beowulf.org <beowulf at beowulf.org>
Subject: Re: [Beowulf] Project Heron at the Sanger Institute
interesting stuff and good reading.
For the IT interests on here: these sequencing machines are churning out large
amounts of data per day. The project I am involved in can produce 400 GB or so
of raw data per day, and that is a small machine. That data then needs to be
processed before you can actually analyse it, so there is quite some data movement etc.
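For a rough sense of scale, the 400 GB/day figure mentioned above works out to only a few MB/s of sustained ingest, though pre-processing, replication, and bursty instrument output multiply the actual movement. A quick back-of-the-envelope sketch (decimal units assumed):

```python
# Back-of-the-envelope: sustained bandwidth implied by 400 GB/day of raw
# sequencing data. Uses decimal GB (10^9 bytes), as storage vendors do.
GB = 10**9
SECONDS_PER_DAY = 24 * 60 * 60  # 86400

daily_bytes = 400 * GB
sustained_mb_per_s = daily_bytes / SECONDS_PER_DAY / 10**6

print(f"Sustained ingest: {sustained_mb_per_s:.1f} MB/s")  # about 4.6 MB/s
```

The sustained rate looks modest; the pain point is usually the aggregate across many instruments plus the repeated reads/writes during downstream analysis.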
All the best
Am Mittwoch, 3. Februar 2021, 14:06:36 GMT schrieb John Hearns:
> Dressed in white lab coats and surgical masks, staff here scurry from
> machine to machine -- robots and giant computers that are so heavy, they're
> placed on solid steel plates to support their weight.
> Heavy metal!
Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit https://beowulf.org/cgi-bin/mailman/listinfo/beowulf