[Beowulf] Moore's Law is dying
Robert G. Brown
rgb at phy.duke.edu
Tue Apr 14 15:38:57 PDT 2009
On Tue, 14 Apr 2009, Joe Landman wrote:
>> Why is sharing expensive in performance? It might take a little
>> overhead to set up and manage, but why is having multiple virtual
>> addresses map to the same physical memory expensive?
>
> Contention. Memory hot spots. Been there, done that. We are about to do
> this all over again (collectively).
Also, streaming vs. jump memory access, cache thrashing, etc. As I
said, nobody unrolls code all the way, because the extra jumps in
rolled-up code aren't horribly expensive and memory has historically
been a scarce resource. Build machines with a TB of RAM, however --
quite possibly within 15 years -- and increase memory bandwidth
anywhere near proportionally, and new optima may emerge that use much
more text/code memory than today's models (models mostly established
when memory was VERY expensive, and inherited from then).
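To put a toy face on that tradeoff, here's a trivial C sketch
(entirely hypothetical code, not taken from any real program): the
rolled loop is tiny but pays a test-and-branch per element, while the
unrolled version spends text/code space to cut the branch count by 4x.

    /* Toy space-vs-branches tradeoff: both functions compute the same
       sum; the unrolled one uses more code but branches 4x less. */
    #include <stdio.h>

    #define N 1024                      /* assumed size, divisible by 4 */

    static double sum_rolled(const double *a)
    {
        double s = 0.0;
        for (int i = 0; i < N; i++)     /* one test-and-branch per element */
            s += a[i];
        return s;
    }

    static double sum_unrolled(const double *a)
    {
        double s = 0.0;
        for (int i = 0; i < N; i += 4) {   /* one branch per four elements */
            s += a[i];
            s += a[i + 1];
            s += a[i + 2];
            s += a[i + 3];
        }
        return s;
    }

    int main(void)
    {
        double a[N];
        for (int i = 0; i < N; i++)
            a[i] = 1.0;
        printf("rolled   = %.0f\n", sum_rolled(a));    /* 1024 */
        printf("unrolled = %.0f\n", sum_unrolled(a));  /* 1024 */
        return 0;
    }

A compiler will happily do this for you at -O3, of course; the point
is only that the unrolled text is several times larger for the same
answer -- exactly the trade you stop caring about when code memory is
effectively free.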
This isn't a prediction or anything like it, BTW. I'm not really
arguing with Forrest's Law, only suggesting that it may be more like
Forrest's
Transient Rule of Thumb if and as things continue. I'm guessing that
there has been an exponential growth rule for the size of all loaded
text/code on any given class of system stretching back to antiquity
(say, DOS). Data too, but we're not fitting the entire universe of OS
and program into 4K to 64K segments any more either. And there is a
fairly profound growth law in the number of packages produced not by any
single person but by many, many hands. Yes, some of the code is shared,
efficiently or not. But as disks get to be "infinite" in size compared
to code, as operational memory gets to be "infinite" in size compared to
code, one motivation for sharing things goes away.
If it is indeed exponential, it's more a question of WHEN Forrest's Law
stops working, not if.
The same things are all true, by the way, of e.g. the fiction market, or
any of the book markets. I just got a Kindle (2) using (of course)
Other People's Money, and am facing several mutually contradictory
realities. There are tens of thousands of out-of-copyright books
(perhaps order of 100,000 total) on some mix of Project Gutenberg and
the various other free ebooks sites. There are order of 300,000 titles
currently available for the Kindle (although a lot of them are multiple
copies of Project Gutenberg books). There are perhaps 500,000 books
alleged to be in Google Books, although few of them are complete. There
are order of 2,000,000 to 2,500,000 books more or less in print,
available one way or another commercially. There are order of
10,000,000 formally published books "in existence" (say, within a factor
of two or three of this).
These numbers have been stable for decades (except the first few,
which are categories that have only recently come into existence) -- a
few tens of thousands of new books make the publishers' lists every
year, and just as many leave them. Given a mean compressed size of
ballpark 250-300 KB per book WITHOUT pictures (empirically true for the
books I've gotten already, even including some really long sagas), one
might expect to be able to put all of human literature into a few
Terabytes, and include all or most of the figures and pictures and
accompanying graphics well within 100 TB. That is to say, one could
store the essential information content of the Duke library (one of the
top 10 or so in the world) on a slightly fat desktop, and include all of
the pictures and illustrations on a midsized storage cluster. And in a
few more years, the desktop will be able to do it all!
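For the skeptical, the arithmetic in a few lines of C. The only
inputs are the ballpark numbers above -- ~275 KB/book (midpoint of
250-300 KB) and ~10,000,000 books -- plus the ~10 MB/book graphics
budget that the 100 TB figure implies:

    /* Back-of-envelope check on the "all of human literature" claim.
       All numbers are the rough estimates from the text, nothing more. */
    #include <stdio.h>

    int main(void)
    {
        double bytes_per_book = 275.0e3;  /* ~250-300 KB compressed, text only */
        double books          = 1.0e7;    /* ~10,000,000 published books */

        printf("text only: ~%.1f TB\n",
               books * bytes_per_book / 1.0e12);       /* ~2.8 TB */

        /* 100 TB spread over 10M books implies ~10 MB/book for figures
           and pictures -- derived from the totals above, not measured */
        printf("with pictures at ~10 MB/book: ~%.0f TB\n",
               books * 10.0e6 / 1.0e12);               /* 100 TB */
        return 0;
    }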
The Kindle has around 1.4 GB storage, not expandable (some alternatives
accept SD chips and can manage tens of GB). I thought this might be
restrictive, but that is maybe 1000-4000 books, depending on how
graphical one gets on a system that still sucks for graphics. I read
like the wind, and I have a personal library of paper books so large
that I have to keep half of it in storage -- it won't fit in my house.
That is to say, around 3000 books. That's an ENORMOUS personal
library. My Kindle could eat it without a trace, if all of the titles
were available and if they weren't for sale at absurd prices -- and it
is only a second-generation product.
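The same estimate, Kindle edition -- where the ~1 MB per graphics-heavy
book is just an assumed round number, not a measurement:

    /* How many books fit in ~1.4 GB at the compressed sizes above? */
    #include <stdio.h>

    int main(void)
    {
        double kindle_bytes = 1.4e9;   /* ~1.4 GB, not expandable */

        printf("text only (~300 KB/book): ~%.0f books\n",
               kindle_bytes / 300.0e3);   /* ~4667 */
        printf("graphical (~1 MB/book):   ~%.0f books\n",
               kindle_bytes / 1.0e6);     /* 1400 */
        return 0;
    }

Which is where the 1000-4000 range comes from, against a ~3000-volume
paper library that already fills half a house.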
So right NOW the Kindle has -- for me -- nearly infinite capacity -- it
seems difficult for me to imagine filling it at all, let alone filling
it multiple times, given how cheap I am and unwilling to pay $10/title
for ebooks. And yet I will, if I live long enough (fill it, not pay
$10/title :-). In fact, as the Kindle itself "grows", as Plastic Logic
and the Irex Iliad and the Sony and alternatives not yet invented come
online with larger screens, as the screens manage color, as the color
screens embrace full color graphics, I'm certain that by the time the
Kindle V comes around, with its 250 GB of flash memory and strong
saturation of an epublishing business that has to stay too cheap to be
worth cracking, my personal ebook will have tens of thousands, maybe
hundreds of thousands of titles. For one thing, the rate of production
of new books will have skyrocketed, as "anyone" can suddenly become an
author (this is already happening, on a much smaller scale than YouTube
but in the same general way). For another, all of the out-of-print books
will no longer be out of print -- they'll be in print as POD titles and
available as ebooks for prices like $0.25 or $1. Or (eventually) for
free, as the universe routes around copyright laws altogether and they
are eventually abandoned for some other completely new paradigm.
The point being that it is pretty easy to look at any given snapshot of
the infotech revolution and conclude that there is an "infinite"
capacity represented there already, that we'll never need any more. And
then tech pushes the capacity past that point, the market adjusts and
changes the way that it works to use the capacity (sometimes undergoing
a paradigm shift to do so, not just supply/demand adjustments) and three
years later people are moaning about how old/slow/small systems are and
how much they need new/fast/big replacements. And they're RIGHT. They
DO need them.
I've got a genuine IBM PC motherboard squirrelled away upstairs for
anyone who thinks otherwise.
rgb
Robert G. Brown http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567 Fax: 919-660-2525 Email: rgb at phy.duke.edu