# no 'commodity' OS is 'secure' Re: [Beowulf] Which distro for the cluster?

Robert G. Brown rgb at phy.duke.edu
Wed Jan 10 08:04:49 PST 2007

On Wed, 10 Jan 2007, Andrew Piskorski wrote:

> On Sun, Jan 07, 2007 at 03:49:50PM -0500, Robert G. Brown wrote:
>
>> I completely agree with this.  As I pointed out earlier in the thread,
>> companies such as banks make "conservative" seem downright radical when
>> it comes to OS upgrades.  They have to do a complete, thorough,
>> comprehensive security audit to change ANYTHING on their machines -- as
>> a requirement in federal law, IIRC.  To get them to take you seriously,
>> you MUST be prepared to support the OS they install on (once it is
>> successfully audited) forever -- until the hardware itself falls apart
>> into itty-bitty bits.
>
> And yet these same hyper-'secure' organizations are running Microsoft
> Windows, Linux, and/or Unix on these super important, super 'secure',
> mission-critical boxes?  Frankly, that's oxymoronic.  It sounds
> suspiciously like decision making driven by what the rules and
> paperwork says you're supposed to do (aka, CYA), and/or general
> myopia, rather than a sound assessment of what the right solution to
> the real problem actually is.

CYA doesn't begin to describe it -- try federal law.  And it isn't
really crazy to do things the way that they do them -- look at all the
phishing out there for bank information.  Ten years ago script kiddies
who cracked some Sun workstations on our campus used them as a jumping
off place to attack both banks and the FBI, and they were doing it just
for fun, not because they seriously expected success.  Those were also
the days when there was a real ubercracker in one of the campus's Unix
networks -- one who covered their tracks so precisely that the only way
you could tell they were there was with a completely passive box on the
same wire monitoring the actual packet traffic created during the times
they came in through their completely invisible back door.  No log
traces, no sign of binaries being tampered with -- every piece of
software that might have revealed their presence had been replaced.
(This was relatively easy with SunOS in the old days, as it was a very
slowly moving target and installed "identically" from tape to system in
most organizations.)

I have little doubt that some of those attacks on banks and possibly
other facilities succeeded (forcing FDIC payout or painful losses) and
were seriously hushed up.  Mainframers dominated the scene then, of
course, and I'm equally certain that much FUD was spread about to help
those COBOL coders continue working.  Some very hardnosed individuals
then decided that Unixoid systems could succeed, but only if they were
kept up to a much higher security standard than most sysadmins knew how
to accomplish.  Hence laws that are likely derived from the older
mainframe laws plus a measure of common sense from the banks themselves
(who just HATE to lose money, after all).

> We all know that Windows is (much) less secure than Linux, and Linux
> is presumably less secure than OpenBSD.  But if you take a step back
> and look at the bigger picture, OpenBSD and MS Windows are both in the
> same bin, and that bin is labeled, "inherently unreliable and insecure
> operating systems".
>
> OpenBSD calls itself "ultra-secure", which is like calling the most
> advanced World War II piston-engined fighter planes "ultra-fast".
> Yes, it's true, more or less - as long as you're only talking about
> other piston engined aircraft, and are content to ignore the existence
> of jets and rockets.
>
> It's not something I know much about, but I am told that much more
> reliable and secure operating systems do exist, and have been
> commercially successful in niche markets, both now and in the past.
> Niche markets like, say, the OS that runs your advanced pacemaker,
> some network routers, or aerospace systems.

Any OS can be made secure.  Even Windows.  It just requires a competent
sysadmin and audit team and a fairly rapid closed development loop
between the OS folks and the implementation/audit folks.  There are
Windows uberadmins and systems engineers out there too, don't forget,
and MS pays its coders lavishly and gets some of the best that there
are.  They just put their money and development effort where the profits
are.  Consumers are sysadmin idiots in ALL cases for ALL OS's because a
modern networking OS is a nontrivial thing to administer with lots of
moving and breakable parts and because they install software from dozens
of sources including ones with absolutely no line of responsibility or
trust.

There is a world of difference between a Windows server set up in a bank
environment, where they are running only a fully patched variant of
Windows that has been really thoroughly audited for holes, in a
completely minimal installation (no gorp as all gorp must be audited and
increases risk) with only certain very specific ports open and those
watchdogged and externally firewalled, running software that only MS has
written and debugged top to bottom, being administered by REAL MCSE's --
not the ones that pick up their degrees from an online training program,
but people with masters level CPS degrees AND MCSEs AND credentials from
multiple additional training courses AND ten years of experience in the
trenches.

In this sort of environment, Windows is remarkably stable (surprise
surprise) and not at all easy to crack because People Are Watching the
Software that is Watching the People that are Watching the Software that
is Watching the Computer...(iterate to some sort of convergence).
Problems that emerge are quickly and quietly fixed, and the whole thing
re-audited which is possible because of the minimal configuration thing.
Costs an ocean of money to do things this way, of course, but to a bank
or a government secret org or a major R&D company with secrets to
protect, it is worth it.

The real observation that you are making is that (as is often the case)
"worth it" isn't the same as "cost effective compared to alternatives".
I would guess that it is a hell of a lot easier to secure almost any
unixoid OS in a server configuration, where again one can secure even
things like RH 7.3 or AIX or MacOS IF you are willing to pay what it
takes to close the audit/debugging process and invest the human
resources to configure and run the thing intelligently.  A system (or
internal network) with a single port open to the outside world, with
guardian daemons and humans constantly watching the doors inside and
outside, where physical presence sitting at a local terminal with things
like magstripe cards and/or bioscans needed to authenticate, where those
very physical presences are required to pee into cups and take regular
polygraphs -- it isn't really that easy to crack from the outside, even
for the ubercracker.

Basically they have to find a hole in the daemon that manages the one
open port (whose source has been micro-audited for e.g. leaks and buffer
problems outside of the usual development stream and which may not even
be the same source as what is in the open distribution version) AND
figure out a way to slip inside without getting eaten by any of the
automatic or human cerberuses that guard the door.  The idea that this
occurs and folks succeed makes for a great film premise, of course, but
I'll bet that nearly every successful attempt at a core system protected
in depth like this is made EITHER with penetrations through HARDWARE or
FIRMWARE holes -- tapping that good old powerline or the like to snoop
keys -- or by insiders or with their knowing or unknowing collusion
(snitching their magstripe card, bugging their bedroom where they talk
in their sleep from all of the jolt cola they drink on the job:-).

> Now, I assume that using any such non-mainstream system is probably
> (so far, to date) significantly more painful, annoying, and thus
> expensive than just running Linux.  (And thus is unlikely to be
> appropriate for a Beowulf cluster.)
>
> But if you're a huge organization already throwing millions of dollars
> into horribly painful manual re-audits of even trivial updates to
> "commodity" operating systems for mission-critical "highly secure"
> applications, then I strongly suspect that you're already well into
> the same cost range where investing those $millions into the use of
> secure-by-design systems might well make much more sense.

Ah, a believer in rational decisioning, CBA, minimal TCO.  Don't you
see, man, that you're up against a whole world of people that don't,
actually, understand the rational process?  A world where 1/2 of its
members have IQ's under 100, and where 100 \pm 10 is usually a bit iffy
when it comes to being able to actually analyze things logically or
mathematically?  A world which is additionally so incredibly
cost-nonlinear that it may well BE cheaper to continue using a very
expensive WinXX network that everybody knows how to use and manage and
that very rarely breaks compared to the HUGE costs of conversion, a fact
that is appreciated by every salesperson of computer software or
services in the universe because it works for them (when a fish is
landed) and against them (when they are trying to land a fish that is
already in somebody else's net).  And then there is FUD, ignorance, pure
cupidity and kickback schemes (not kickSTART schemes, which are
different:-), training costs (everybody already knows how to use X
already, so even though it is an ancient and cumbersome legacy
application, you have to look at days, weeks, months of retraining and
loss of productivity throughout that period, where a lot of the folks
being trained are those ~100 IQ people that are NOT happy learning new
things).

Did I mention the immense inertia of "standard" mission-critical
software packages in monopolistically universal use that become in and
of themselves a criterion for that rational decisioning?  In >>state
law<< in many cases (yes, I'm ashamed to say that NC is one of many
states that test high school students on the use of >>MS Office<<
components, not generic office software tools).  How powerful is that?
Imagine if all NC driving tests required that you take them in a Ford,
and that people were forbidden by law to make cars with certain Ford
features, like the ability to burn Ford gasoline, which curiously enough
was the only gas to be found in 90% of the pumps?  Something like that,
of course, would spark torch-and-pitchfork activity because the
automobile industry is actually not a monopoly and companies actually
compete.  In the software industry it creates nary a ripple... because
who is there who will complain, rouse the rabble, hang Bill Gates in
effigy?

I tell you, the greatest and most cut-throat monopoly the world has ever
seen, in control of our INFORMATION flow -- western society must have a
death wish.

So, all that stands between that current reality and a future of
rational decisioning in software is a little-bitty paradigm shift.  The
current way of doing business is a clear attractor in anybody's economic
benefit space -- change is always a cost barrier and the designers of
software deliberately do everything in their considerable power to keep
that barrier as high as they possibly can.  To create a change, a new
attractor (CBA basin) has to emerge that is a) much lower than the
first; b) broad enough that there is good phase space overlap with the
operational needs of a large fraction of the customer population; c) a
"valley through the hills", real or perceived, to minimize that cost of
getting from there to here; d) advertising like all hell to get the word
out about the valley and the green hills available on the other side
where the rivers run in milk and cows grow on trees; e) companies and
government officials whose pension funds aren't all tied up in stock in
the company that "owns" the original less good basin; f) something to
trigger a stampede -- a stampede basically tramples down those
mountains real fast.

Arguably linux already satisfies a), is working (too blindly and slowly)
on b), utterly lacks c), is missing d) altogether barring a handful of
ads from IBM, is totally f&*!ed by e) (seriously!), and has yet to
produce an f), largely because even if some part of it were capable of
doing so, nobody knows about it (see b) and c) and d):-(.

Nothing a good business plan couldn't fix, actually.  Nothing that
nature won't fix on its own, eventually, because a) isn't going away and
is a LOT lower, and b) tends to grow over time, and these two alone will
probably eventually fix c) and maybe f) to the extent that d) and e)
don't matter as much.  Of course MS may manage to create f) all on their
own. One really, really serious security hit -- say a major bank that IS
knocked over to the tune of an unhidable multi-billion dollar loss
because of a flaw in its design -- followed by a change in government
banking policy and public perception -- might be all that it takes to
trigger at least a modest stampede.  Of course linux and the rest are
vulnerable to this sort of thing as well, and MS actually HAS a sales
force that makes FUD a fine art and amplifies even the tiniest incident
into massive perceived risk...

rgb

--
Robert G. Brown	                       http://www.phy.duke.edu/~rgb/
Duke University Dept. of Physics, Box 90305
Durham, N.C. 27708-0305
Phone: 1-919-660-2567  Fax: 919-660-2525     email:rgb at phy.duke.edu