[Beowulf] The GPU power envelope (Re: Beowulf Digest, Vol 109, Issue 22)
Lux, Jim (337C)
james.p.lux at jpl.nasa.gov
Sun Mar 17 21:27:25 PDT 2013
One needs to be careful about estimating costs when the estimate
presumes the existence of a few "stars" and assumes you have them
available. That star may not be available for your project, or may not
be interested in it.
In the space biz we always run into the problem that companies would
rather have their "A team" working on a product with hundred-million-unit
sales than on something for us with 10-unit sales. That's why we use a
lot of FPGAs, and the relatively few ASICs that have been developed for
space have had very large government funding (e.g., ESTEC funding the
development of an ADSP 21020 work-alike in a rad-tolerant process;
various and sundry SpaceWire interface chips like the SMCS332; various
MIL-STD-1553 or CAN bus controllers; the Atmel/Temic space SPARC CPUs,
starting with the ERC32 and moving through the LEON cores; the BAE
RAD750; the RAD6000 (a POWER-architecture CPU); and that rad-tolerant
Pentium from Sandia... stuff like that).
ASIC designers and FPGA programmers are paid more than software
developers in general, but not by a significant amount. I'll bet there
are not a lot of folks making, say, $1M/year in wages (you can't count
cashing in on options when something goes right; that's more a form of
taking part of your salary/wages and investing it in the stock market).
Maybe DE Shaw is paying high rates: they have a very attractive value
proposition for the work being done, and they might consider it
worthwhile to invest higher wages in top people. Someone designing
hardware/logic for high-frequency trading might also command a high
rate, because in the context of the amount of money flowing through the
process, a $1M/yr salary is down in the noise.
But those are very niche markets.
I suspect that the guys and gals designing Nvidia and Intel processors
are doing well, but not 1%-type well. Glassdoor, in fact, shows the
highest salary for an ASIC design engineer as $167k (Nvidia is
$114-140k). FPGA designer salaries at the same site are about 20% lower.
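
As a rough illustration of why the "star" assumption distorts a cost
estimate, here is a minimal back-of-envelope sketch in Python. The 2x
fully-loaded overhead multiplier, the team sizes, the ~$150k salary,
and the notional $1M/yr "star" rate are all assumptions, not data:

# Fully loaded cost: salary roughly doubles once benefits, EDA
# licenses, lab space, and overhead are added (2x is an assumed
# multiplier, not a measured one).
def loaded_team_cost(headcount, salary, years, overhead=2.0):
    return headcount * salary * overhead * years

# Hypothetical mid-size FPGA effort: 5 engineers at a salary near the
# Glassdoor figures above, for 2 years.
ordinary = loaded_team_cost(headcount=5, salary=150_000, years=2)

# Same effort, but with one slot filled by a notional $1M/yr "star".
with_star = (loaded_team_cost(headcount=4, salary=150_000, years=2)
             + loaded_team_cost(headcount=1, salary=1_000_000, years=2))

print(f"5 ordinary engineers, 2 yr: ${ordinary:,.0f}")   # $3,000,000
print(f"4 engineers + 'star', 2 yr: ${with_star:,.0f}")  # $6,400,000

Either way you land in the "few million dollars" range before a single
board is built, which is the threshold Vincent cites below.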
On 3/16/13 8:01 PM, "Vincent Diepeveen" <diep at xs4all.nl> wrote:
>What budget do you have to build that FPGA?
>
>With a project-team budget under a few million dollars to design that
>FPGA, you won't easily beat a GPU with an FPGA for things like matrix
>calculations, and especially not for double-precision floating-point
>calculations, generally speaking.
>
>It would be interesting to know whether there are exceptions to the
>rule as of today... like Feng-hsiung Hsu (the Deep Blue chip
>designer), who single-handedly designed a chip.
>
>Every year that passes, the budget you need to beat the latest GPUs
>increases. Note I would want to extend the GPU-versus-FPGA discussion
>a tad and also include the Xeon Phi in this list, as well as the
>latest IBM incarnation; that isn't a GPU of course, yet it already
>runs, what is it, 18 cores @ 72 threads or something (I didn't check
>lately)?
>
>IBM, so to speak, is scaling up their BlueGenes in core count faster
>than the GPUs are currently advancing...
>
>Beating these specialized HPC/GPGPU processors with an FPGA seems to
>get tougher and tougher.
>
>Of course you can do it for specific calculations, especially prime
>numbers... but at what budget cost?
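
To put that budget point in rough numbers, here is a minimal sketch of
the break-even in Python. The GPU figures approximate a 2013-era Tesla
K20 (about 1.17 double-precision TFLOPS at roughly $3.5k street price);
the FPGA NRE, board cost, and sustained-TFLOPS figures are pure
assumptions for illustration, not measurements:

# Break-even sketch: fixed FPGA design cost (NRE) vs. just buying GPUs.
GPU_PRICE = 3_500           # USD per card (assumed street price)
GPU_DP_TFLOPS = 1.17        # K20 double-precision peak (approximate)

FPGA_NRE = 2_000_000        # design-team budget (assumed)
FPGA_BOARD_COST = 10_000    # per deployed board (assumed)
FPGA_DP_TFLOPS = 0.2        # sustained DP per board (assumed)

def gpu_tflops(budget):
    """Aggregate DP throughput if the whole budget buys GPU cards."""
    return (budget // GPU_PRICE) * GPU_DP_TFLOPS

def fpga_tflops(budget):
    """Aggregate DP throughput after paying the design NRE first."""
    remaining = budget - FPGA_NRE
    if remaining <= 0:
        return 0.0
    return (remaining // FPGA_BOARD_COST) * FPGA_DP_TFLOPS

for budget in (1_000_000, 3_000_000, 10_000_000):
    print(f"${budget // 1_000_000}M budget: "
          f"GPUs {gpu_tflops(budget):6.0f} TFLOPS, "
          f"FPGAs {fpga_tflops(budget):6.0f} TFLOPS")

With these made-up per-board numbers the GPU side wins at every budget
for general double-precision work; the FPGA only pays off on a specific
calculation (bit-level work, primes) where its per-board advantage is
large enough to amortize the NRE, which matches the pattern Vincent
describes above.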