[Bloat] [Cerowrt-devel] wired article about bleed and bloat and underfunded critical infrastructure
dpreed at reed.com
dpreed at reed.com
Mon Apr 14 16:22:40 PDT 2014
All great points.
Regarding the Orange Book for distributed/network systems - the saddest part of that effort was that it was declared "done" when the standards were published, even though the challenges of decentralized networks of autonomously managed computers was already upon us. The Orange Book was for individual computer systems that talked directly to end users and sat in physically secured locations, and did not apply to larger scale compositions of same. It did not apply to PCs in users' hands, either (even if not connected to a network). It did lay out its assumptions; but the temptation to believe its specifics applied when those assumptions weren't met clearly overrode engineering and managerial sense. For example, it was used to argue that Windows NT was "certified secure" according to the Orange Book. :-)
On Sunday, April 13, 2014 8:57pm, "Dave Taht" <dave.taht at gmail.com> said:
> On Fri, Apr 11, 2014 at 12:43 PM, <dpreed at reed.com> wrote:
> > I'm afraid it's not *just* underfunded. I reviewed the details of the code
> > involved and the fixes, and my conclusion is that even programmers of
> > security software have not learned how to think about design, testing, etc.
> > Especially the continuing use of C in a large shared process address space
> > for writing protocols that are by definition in the "security kernel"
> > (according to the original definition) of applications on which the public
> > depends.
> >
> >
> >
> > Ever since I was part of the Multics Security Project (which was part of the
> > effort that produced the Orange Book
> > http://csrc.nist.gov/publications/history/dod85.pdf)
>
> Which I incidentally have read... and fail to see how it applies well
> to networked systems.
>
> > in the 80's, we've
> > known that security-based code should not be exposed to user code and vice
> > versa. Yet the SSL libraries are linked in, in userspace, with the
> > application code.
>
> I note that I am glad that they are mostly dynamically linked in -
> something that wasn't the case for some other crypto libs - because
> finding applications
> that linked statically would be even more difficult.
>
> And I have seen some reports of people using heavily patched openssl
> doing smarter things with memory allocation - why weren't those patches
> pushed back into openssl?
>
> Well, because they were held private and not publicly reviewed... and
> don't appear to actually work, according to this:
>
> http://lekkertech.net/akamai.txt
>
> > Also, upgrades/changes to protocols related to security (which always should
> > have been in place on every end-to-end connection) should be reviewed *both
> > at the protocol design level* and also at the *implementation level* because
> > change creates risk. They should not be adopted blindly without serious
> > examination and pen-testing, yet this change just was casually thrown in in
> > a patch release.
>
> Yes, change creates risk. Change also breeds change. Without change
> there would be no progress.
>
> Should there be an "office of critical infrastructure" or an
> underwriters labratory examining and blessing each piece of software
> that runs as root or handles money? Should some governmental or
> intergovernmental group be putting a floor under (or a roof over) the
> people working on code deemed as critical infrastructure?
>
> heartbleed was not detected by a coverity scan either.
>
> > I suspect that even if it were well funded, the folks who deploy the
> > technology would be slapdash at best.
>
> I agree. Recently I was asked to come up with an "phone-home inside
> your business embedded device architecture" that would scale to
> millions of users.
>
> I don't want the responsibility, nor do I think any but hundreds of
> people working together could come up with something that would let me
> sleep well at night - yet the market demand is there for something,
> anything, that even barely works.
>
> If I don't do the work, someone less qualified will.
>
>
> > Remember the Y2K issue
>
> I do. I also remember the response to it.
>
> http://www.taht.net/~mtaht/uncle_bills_helicopter.html
>
> The response to heartbleed has been incredibly heartening as to the
> swiftness of repair - something that could not have happened in
> anything other than the open source world. I have friends, however,
> that just went days without sleep, fixing it.
>
> I've outlined my major concerns with TLS across our critical
> infrastructure going forward on my g+.
>
> > and the cost of
> > lazy thinking about dates. (I feel a little superior because in 1968 Multics
> > standardized on a 72-bit hardware microsecond-resolution hardware clock
> > because the designers actually thought about long-lived systems (actually
>
> I agree that was far-thinking. I too worry about Y2036 and Y2038, and
> do my best to make sure those aren't problems.
>
> it seems likely some software will last even longer than that.
>
> > only 56 bits of the original clock worked, but the hardware was not expected
> > to last until the remaining bits could be added)).
>
> Multics died. It would not have scaled to the internet. And crypto
> development and public deployment COULD have gone more hand in hand if
> it weren't basically illegal until 1994, and maybe before then, some
> reasonable security could have been embedded deep into more protocols.
>
> It would have been nice to have had a secured X11 protocol, or
> kerberos made globally deployable, or things like mosh, in the 80s. In
> terms of more recent events, I happen to have liked HIP.
>
> We don't know how to build secured network systems to this day, that
> can survive an exposure to hundreds of millions of potential
> attackers.
> >
> >
> > The open source movement, unfortunately, made a monoculture of the SSL
> > source code, so it's much more dangerous and the vulnerable attack surface
> > of deployments is enormous.
>
> No it didn't. Alternatives to openssl exist - gnutls, cyassl, and
> polarssl are also open source. Libraries that merely implement
> primitives well like nettle, gmp, and libsodium - all developed later
> - also exist.
>
> I am GLAD we don't have a monoculture in crypto.
>
> What happened was mostly inertia from openssl being the first even
> semi-legal library for crypto operations and huge demand for the
> functionality backed up with too little understanding of the risks.
>
>
> >
> >
> >
> > Rant off. The summary is that good engineering is not applied where it must
> > be for the public interest. That remains true even if the NSA actually
> > snuck this code into the SSL implementation.
> >
> >
> >
> > On Friday, April 11, 2014 2:22pm, "Dave Taht" <dave.taht at gmail.com>
> said:
> >
> >> http://www.wired.com/2014/04/heartbleedslesson/
> >>
> >> And Dan Kaminisky writes about "Code in the Age of Cholera"
> >>
> >> http://dankaminsky.com/2014/04/10/heartbleed/
> >>
> >>
> >>
> >> --
> >> Dave Täht
> >> _______________________________________________
> >> Cerowrt-devel mailing list
> >> Cerowrt-devel at lists.bufferbloat.net
> >> https://lists.bufferbloat.net/listinfo/cerowrt-devel
> >>
>
>
>
> --
> Dave Täht
>
> NSFW:
> https://w2.eff.org/Censorship/Internet_censorship_bills/russell_0296_indecent.article
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <https://lists.bufferbloat.net/pipermail/bloat/attachments/20140414/c2a0a9e0/attachment.html>
More information about the Bloat
mailing list