[Beowulf] Brian Guarraci (engineer at Twitter) is building a Parallella cluster in his spare time

Lux, Jim (337C) james.p.lux at jpl.nasa.gov
Tue Jun 3 18:32:35 PDT 2014

"The bus bars are then mounted to the switch using 3M industrial Velcro, which is easy to work with, very strong and serves to insulate the bus bars from the metal switch case."

Yes.. not everything has to be nuts and bolts.. Sticky tape and Velcro.

Jim Lux

-----Original Message-----
From: Beowulf [mailto:beowulf-bounces at beowulf.org] On Behalf Of Eugen Leitl
Sent: 2014-Jun-03 8:27 AM
To: beowulf at beowulf.org
Subject: [Beowulf] Brian Guarraci (engineer at Twitter) is building a Parallella cluster in his spare time


Brian Guarraci is a software engineer at Twitter and in his spare time he’s building a Parallella cluster with a design that was inspired by two of the most iconic supercomputers ever made.

When we saw pictures of Brian’s cluster we were impressed and when we shared these with the community, it became apparent that we were not the only ones! It didn’t take long before curiosity got the better of me and I decided to get in touch with Brian to find out more…

Hi Brian, can you tell me about the Parallella cluster you are building?

I’m building a low-power general purpose compute cluster. I want it to be able to take advantage of standard distributed system packages so that there’s a familiar developer model. The Parallella boards are great for computation, but since they have relatively limited storage and memory, I added two Intel NUCs. Each NUC has 1x Intel i3, 16GB RAM, a 120GB SSD and 802.11ac WiFi, and is also pretty low-power. The NUCs run Ubuntu Server and act as storage hosts and the primary interface to the external world. The system has 8x Parallella boards and a shared gigabit Ethernet switch, giving a peak performance of around 208 GFLOPS.
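A quick back-of-envelope check of the quoted aggregate (a sketch only: the roughly 26 GFLOPS per-board figure is Adapteva's advertised single-precision peak for the 16-core Epiphany-III, an assumption not stated in the interview):

```python
# Rough sanity check of the ~208 GFLOPS figure quoted above.
# Assumption: each Parallella's 16-core Epiphany-III peaks at about
# 26 single-precision GFLOPS (advertised figure, not from the interview).
boards = 8
gflops_per_board = 26  # assumed per-board peak

aggregate = boards * gflops_per_board
print(f"Aggregate peak: ~{aggregate} GFLOPS")  # ~208 GFLOPS
```

This counts only the Epiphany coprocessors; the boards' dual-core ARM A9 hosts and the two NUCs would add a little more on top.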

Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf
