[Beowulf] a cluster to drive a wall of monitors

Laurence Liew laurenceliew at yahoo.com.sg
Wed Oct 13 06:04:21 PDT 2004


Check out the Visualization Roll at www.rocksclusters.org...

The SDSC Rocks guys have such a system set up....  3x3 panels... 
driven by a Rocks cluster with 9 compute nodes (Shuttle XPC + NVidia 
cards) and 1 frontend....
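
For anyone wondering how nine separate X displays get stitched into one
logical desktop on a setup like that, here is a minimal sketch using Xdmx
(Distributed Multihead X). The hostnames (tile0..tile8) and layout are
made up for illustration, not taken from the actual SDSC configuration:

```shell
# Hypothetical sketch: combine a 3x3 wall of back-end X displays into
# one logical screen with Xdmx.  Hostnames tile0..tile8 are assumptions.
cat > dmx.conf <<'EOF'
virtual wall3x3 {
    # "wall 3x3" tiles the listed back-end displays row by row
    wall 3x3 tile0:0 tile1:0 tile2:0
             tile3:0 tile4:0 tile5:0
             tile6:0 tile7:0 tile8:0;
}
EOF

# Run the proxy X server on the frontend; +xinerama presents the nine
# tiles to X clients as a single large screen.
Xdmx :1 -configfile dmx.conf -config wall3x3 +xinerama
```

Clients then just set DISPLAY=frontend:1 and render as if it were one
big monitor; Xdmx forwards the protocol to the back-end nodes.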

Contact me offline if there is more interest... or post to the Rocks 
mailing list.


Evan Cull wrote:
> Hi all,
> I was told this list would be a good place to ask for advice on the 
> following project.  (I've tried to search through list archives for 
> related info, but I haven't managed to spot anything so far.)
> I'm helping with a project that wants to drive a wall of about 50 LCD 
> panels with a linux cluster running Syzygy:
> http://www.isl.uiuc.edu/syzygy.htm
> I was considering a cluster of either 50 single processor nodes or 25 
> dual processor + dual output graphics card nodes.  I suppose 50 dual 
> processor nodes would be nice, but I'm pretty sure that's well out of my 
> budget range.  I'm betting that the 50 single processor nodes would 
> easily have twice the graphics performance of the 25 dual nodes because 
> they have 2x as many video cards.  The tradeoff here is that the dual 
> processor nodes might be more useful for other more general computing 
> tasks we could run on them.
> Does anyone here have experience buying rackmountable cluster nodes 
> *with graphics cards* who can point me to a vendor?
> For that matter, have any of you built a similar system & have any 
> suggestions / comments?
> thanks,
> Evan Cull
> _______________________________________________
> Beowulf mailing list, Beowulf at beowulf.org
> To change your subscription (digest mode or unsubscribe) visit 
> http://www.beowulf.org/mailman/listinfo/beowulf
