[Beowulf] Multicore Is Bad News For Supercomputers

Nifty Tom Mitchell niftyompi at niftyegg.com
Fri Dec 5 16:36:21 PST 2008


On Fri, Dec 05, 2008 at 01:48:43PM +0100, Eugen Leitl wrote:
> 
> (Well, duh).
> 
> http://www.spectrum.ieee.org/nov08/6912
> 
> Multicore Is Bad News For Supercomputers
> 

Where do GPUs fit in this?  On the surface, a handful of cores in a system
with decent cache would quickly displace the need for GPUs, and would
offer about as simple a programming and compiler model as can be had today.
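
(By "simple" I mean something like this: one pragma and the compiler
spreads an ordinary loop across all the cores, with no kernel launches
and no explicit host/device copies.  A rough sketch only -- the array
size and contents are arbitrary, and it needs an OpenMP-capable
compiler, e.g. gcc -fopenmp.)

    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    static double a[N], b[N], c[N];

    int main(void)
    {
        /* Fill the inputs with something simple. */
        for (long i = 0; i < N; i++) {
            a[i] = (double)i;
            b[i] = 2.0 * (double)i;
        }

        /* One pragma and the compiler parallelizes the loop across
         * however many cores the machine has.                       */
        #pragma omp parallel for
        for (long i = 0; i < N; i++)
            c[i] = a[i] + b[i];

        printf("c[%d] = %g, max threads = %d\n",
               N - 1, c[N - 1], omp_get_max_threads());
        return 0;
    }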

Additional cores are not magic, but they can set the stage for
better math and I/O libraries.

Me, I would rather see more transistors thrown at 128+ bit math.  In the
back of my mind I suspect that current 64-bit IEEE math is getting in the
way of global science (weather, global warming...).  Perhaps 128-bit integer
math ops would be a better place to start.  And any day now we may need
256-bit integers to manage the national debt.
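
(To make the 64-bit worry concrete, here is a toy C sketch of the kind
of long accumulation I have in mind.  It assumes GCC's __float128
extension and libquadmath -- build with something like gcc -lquadmath --
and the base value, step size, and iteration count are invented purely
for illustration, not taken from any real model.)

    #include <stdio.h>
    #include <quadmath.h>

    int main(void)
    {
        /* Add 100 million tiny increments to a large base value:
         * the sort of accumulation a long simulation does all day. */
        const long   n    = 100000000;
        const double base = 1.0e9;
        const double tick = 1.0e-7;

        double     d = base;
        __float128 q = base;

        for (long i = 0; i < n; i++) {
            d += tick;   /* 64-bit double: each add is rounded to the
                            nearest representable step, and the error
                            piles up over 1e8 iterations             */
            q += tick;   /* 128-bit quad: plenty of spare digits     */
        }

        char buf[64];
        quadmath_snprintf(buf, sizeof buf, "%.20Qg", q);
        printf("double: %.20g\n", d);   /* drifts visibly            */
        printf("quad:   %s\n", buf);    /* exact answer is 1000000010 */
        return 0;
    }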


-- 
	T o m  M i t c h e l l 
	Found me a new hat, now what?



