
I Had No Intention On Sleeping Tonight Anyway



Well, this makes one feel secure:

The U.S. nuclear weapons system still runs on a 1970s-era computing system that uses 8-inch floppy disks, according to a newly released report from the Government Accountability Office.

The GAO report found that the Pentagon’s Strategic Automated Command and Control System — which “coordinates the operational functions of the United States’ nuclear forces, such as intercontinental ballistic missiles, nuclear bombers, and tanker support aircrafts” — runs on an IBM Series/1 Computer, first introduced in 1976.

The system’s primary function is to “send and receive emergency action messages to nuclear forces,” the report adds, but “replacement parts for the system are difficult to find because they are now obsolete.”

I see no potential downside to this situation.

  • plarry

    The risk from malicious hackers is probably small.

    • Murc

      This may be a deliberate security choice, in fact.

      I say may. But “security through obsolescence” is a real thing in the cybersecurity profession; if nobody can access your systems because they’re compatible with absolutely nothing anymore and acquiring the hardware to do new builds is expensive and time-consuming and a huge pain in the ass, it dramatically lowers the chances of you being penetrated.

      That said I’m not sure the Pentagon is actually that smart.

      • Brownian

        Well, we can scratch Cylon Attack off our list of humanity-ending threats.

      • tsam

        This may be a deliberate security choice, in fact.

        Yeah, I’ve heard this before–that using prehistoric hardware and software reduces vulnerability. It’s probably true, but being so old that replacing parts that weren’t very reliable in the first place becomes a big issue sounds like a disaster waiting to happen.

      • Sly

        The 8″ floppy story has been around for a while (60 Minutes did a piece on it two years ago), and the Pentagon/NORAD spokesperson they get for comment always raises the security issues.

        Which is entirely valid. 10% of Iran’s uranium enrichment centrifuges were crippled in about a month because one Israeli operative plugged a thumb drive containing the Stuxnet worm into a connected computer. You’re not going to fit that kind of sophisticated malware onto a storage medium with a 250kb capacity (and that kind of sophistication would be needed even on an old system), and the system is isolated, so it’s impossible to get in from an outside connection.

        That said, I would be much more comfortable if ICBM technicians weren’t communicating with each other and with their commanding officers on the same phone system my grandmother used.

        • Captain Splendid

          I would be much more comfortable if ICBM technicians weren’t communicating with each other and with their commanding officers on the same phone system my grandmother used.

          I do. Simple, copper Plain Old Telephone Service is cheap and easy to set up and maintain, and requires very little power.

          • Captain Oblivious

            But it does require a fairly expensive switch of some kind that has to be programmed and have its lookup tables maintained.

            Also, I wonder how many network techs these days know how to use a wire-wrap tool.

        • Richard Hershberger

          That said, I would be much more comfortable if ICBM technicians weren’t communicating with each other and with their commanding officers on the same phone system my grandmother used.

          Why? Grandma’s phone was rock-solid reliable: far better than today. I just hope that those ICBM technicians wait until after five o’clock and keep their calls under three minutes…

    • mark

      Yes, as a computer guy I’m perversely comforted by thinking the nukes are sitting on a computer that is relatively simple, not some Windows server or even Linux, where just the OS has 50 million lines of code that no one person could ever understand.

      I’m not sure it’s a good idea, mind you, but I can think of worse ways to make a mistake.

      • Philip

        Yeah. The smaller and more well-tested the attack surface, the better. Plus, floppies are really good at data integrity, which is a useful thing in nuclear weapons systems!

        • postmodulator

          Thirded. As another IT guy I’d feel way more nervous about the nuclear arsenal being controlled by version 0.8 of something from Github.

          • tsam

            As the IT guy for my office (built and maintain a simple P2P LAN and Windows PCs), I can see how a tiny, command line only operating system would be a good way to secure a system–but then it’s connected to a network, I assume, so I’m not sure any of that matters.

            • Philip

              Most security problems that can be exploited via networking are because the things sitting behind the network connection are too complicated. It’s easier than you’d think to make a secure, bug-free networked program so long as you keep its scope very small.
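A toy sketch of what “very small scope” buys you (entirely hypothetical – this is not any real message format, just an illustration of the principle): a handler that accepts exactly one fixed-length, fixed-alphabet message and silently drops everything else, leaving almost nothing for malformed input to exploit.

```python
# Hypothetical illustration of a tiny attack surface: the handler accepts
# exactly one fixed-length, fixed-alphabet message format and rejects
# everything else -- no variable-length fields, no string interpretation,
# nothing for malformed input to trip over.

MSG_LEN = 16                         # fixed message length (example value)
ALPHABET = set(b"0123456789ABCDEF")  # only uppercase hex characters allowed

def accept_message(raw: bytes):
    """Return the message if it is exactly valid, else None."""
    if len(raw) != MSG_LEN:
        return None                  # wrong length: drop, don't try to recover
    if not all(b in ALPHABET for b in raw):
        return None                  # unexpected byte: drop
    return raw

assert accept_message(b"0123456789ABCDEF") == b"0123456789ABCDEF"
assert accept_message(b"0123") is None              # too short
assert accept_message(b"0123456789abcdef") is None  # lowercase rejected
```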

              • tsam

                Ah–well I’ve never dug into that end of things, but I see how a focused set of tasks would be more secure than what has come of the trend of making it easy for people to set up their own networks.

            • max

              It’s connected to a network – the special internal network purpose-built to transmit NCA action messages. In practice this is a pre-IPv4 system, so even if you could find the wires the military are using to talk to the ICBM silos, and then dug them up and physically wired into them, you’d still need to understand the classified networking protocol they use to communicate. And then you’d have to figure out the encryption they use, and the format and content of the action messages, AND you’d need to figure out the right authentication keys to forge a valid message.
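That last hurdle – the authentication keys – is the crux. As a generic illustration (my sketch, not the actual military protocol), a shared-secret message authentication code lets the receiver verify a message came from someone holding the key, so a wiretapper who can read and inject traffic still can’t forge a valid order.

```python
import hmac, hashlib

# Illustrative only: a generic shared-secret MAC scheme, NOT the real
# EAM protocol. Both key name and message text are invented for the example.
SECRET_KEY = b"example-key-known-only-to-sender-and-silo"

def sign(message: bytes) -> bytes:
    """Tag a message with an HMAC derived from the shared secret."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    # constant-time comparison avoids leaking information via timing
    return hmac.compare_digest(sign(message), tag)

order = b"EXERCISE ONLY: do not launch"
tag = sign(order)

assert verify(order, tag)                  # genuine message passes
assert not verify(b"forged order", tag)    # altered message fails
assert not verify(order, b"\x00" * 32)     # guessed tag fails
```

Without the key, an attacker can neither produce a valid tag nor usefully modify a signed message, which is why max’s list ends where it does.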

              That’s where Loomis’s post has it backwards – if those super-old-skool System/1s break, then NCA cannot communicate with a given silo crew, which as a consequence will not receive orders and thus will not turn the control keys to initiate launch.

              Cold War-style hawks and people worried about maintaining deterrence credibility should be unhappy that the system that has worked so well is now so old that the materials it was made of are physically breaking down and cannot be replaced. E.g., old (highly reliable!) 8″ floppy disks are hard to find.

              Since the system is just a super-sophisticated telegraph (silo crews do not need or want Facetime to relay action messages), it’ll have to be ‘upgraded and modernized’ with hardware whose robustness and reliability are equivalent to a ’70s IBM System/1 with 8″ floppies, made of parts that can actually be had.

              That’s going to cost a fuckton of money per unit. I am completely OK with that as long as it’s up to the standards those System/1s have upheld for 40-plus years… so we don’t have a broken system and accidental launches.

              [‘Better that than the fucking F-35.’]

          • NBarnes

            acts_like_fissionable 0.9.2 was much more reliable than 0.9.1

            • postmodulator


              – Source code reformatted from GNU convention to K&R.
              – Pulldown menu options reorganized to be more intuitive.
              – Should now build on NetBSD.
              – Fixed divide-by-zero bug in network code. (Sorry about that, Kiev.)

      • There are RTOS systems out there that are very secure and stable, but my feeling is we should be doing all we can to reduce the system to bare bones instead of modernizing it.

    • Todd

      Also a hedge against sentience.

    • Captain Oblivious

      I suspect (or at least hope) that the network these Series/1s are connected to is a dedicated internal one “air-gapped” from any other networks.

      I also doubt that these S/1s have anything resembling remote maintenance access. That would require programming or operating system components that didn’t exist at the time, to say nothing of suitable network access — the Internet was barely operational at the time and consisted mainly of university geeks sending text messages over 300 bps telephone connections.

      When I was working on S/1s, I had to actually get on a plane and fly to locations to install software updates.

      • Speaking of going on site to do upgrades, I’ll just leave this here:


      • NonyNony

        To your suspicion – back in the 1970s there is no way in Hades that a military general was going to give the okay to put the computers that control our nukes on the same networks as a bunch of academics doing DARPA research at universities. Because back in the 70s people weren’t stupid about opening up access to anything and everything just because Internet.

        (Seriously – the idea that banks put the networks they use to store transaction information on the same damn network that any criminal in the world can get access to is not something that would have happened in the 1970s, no matter how much your IT staff misused the word “firewall” to explain how protected you were going to be. But the Internet is magic, and so now all of those accounts sit on the same network protected only by a few computers intended to stop traffic from getting in. It’s insane, but that’s how we roll here in the future, I guess.)

    • Warren Terra

      The risk from malicious hackers is probably small.

      I like the qualifier. How are we set for defense against whimsical hackers? Indifferent hackers? Altruistic hackers, even?

      • sonamib

        Well, friendly “hackers” do exist. They test your defenses and inform you of any weaknesses they found.

  • Yeah, the only risk I can see is that it may become impossible to launch nuclear weapons. Bummer.

    • What if antiquated systems fail in ways that indicate a nuclear attack is imminent? What if antiquated systems meant to monitor foreign nuclear activity fail completely? How long would it take to rebuild that network? What mistakes could be introduced by rushing the job?

      • “The system’s primary function is to ‘send and receive emergency action messages to nuclear forces.’”

      • mark

        This isn’t what monitors other people, it’s what controls our nukes. Monitoring would be a distributed task between things like multiple satellites and seismic sensors and so on, not to mention human intel.

        I’m sure the monitoring is old hardware too but I doubt it’s a single point of failure. Not out of government cleverness but because it’d be really hard to design a computer system that did that much.

      • The Temporary Name

        It’s replacing the stuff I worry about.

  • c u n d gulag

    Then can I assume ENIAC and UNIVAC are still busy being used at the CIA?

  • burnspbesq

    Welcome to the wacky world of Federal IT procurement. There was a time, not too long ago, when the IRS ran a small manufacturing operation making replacement parts for its 1960s computers that it could no longer buy from the manufacturers.

    • It’s that bad at the state level too, at least in Rhode Island

    • I remember reading about NASA buying parts for the space shuttle from eBay.

    • postmodulator

      That was a driver of the Clinton email scandal, too — apparently part of the issue was that the State Department “approved cell phone” was an antique BlackBerry and HRC wasn’t having it.

      • tsam

        Obama seemed much less than thrilled when they handed him a Blackberry too.

    • Dr. Ronnie James, DO

      Just a few years ago, the main employee time card system for Los Angeles County was still using IBM paper punch cards.

      • Captain Oblivious

        A lot of banks are still using check sorters that were built in the 70s.

    • bender

      The IRS shouldn’t have to do that, but it would be a very good thing if our military had in house manufacturing capability for replacement parts.

    • Anna in PDX

      I left the US foreign service in 1998 and we were still using Wang computers for the classified cables.

  • Brownian

    Who is still manufacturing 8″ floppies?! Or are the launch codes passed around on diskettes sporting layers of scribbled-over labels?
    I’d hate to think the difference between survival and nuclear annihilation is a thin piece of plastic with a Dysan sticker proclaiming “Gerry’s resume” in sharpie.

    • Murc

      Lay off Gerry, man. The job market is tough for experts in antiquated systems architecture.

      • John not McCain

        We’re calling him Terry now.

        • wjts

          Damn it, Terry!

    • I’d hate to think the difference between survival and nuclear annihilation is a thin piece of plastic with a Dysan sticker proclaiming “Gerry’s resume” in sharpie.

      There is no chance of survival in a nuclear attack. It doesn’t take that many nucs to set off a nuclear winter, so whoever initiates a first strike would doom us all anyway, whether we can retaliate or not.

      • twbb

        A few nukes would absolutely not set off a nuclear winter. You’d need many, many of them, and while it would lead to massive upheaval and a lot of deaths, it would not end the human race.

      • Brownian

        There is no chance of survival in a nuclear attack.

        Maybe for you. If beer commercials are any indication, winter (nuclear or not) is when we Canadians really shine.

      • NonyNony

        As twbb says and as a child of the 70s/80s who stressed about this stuff A LOT and so did A LOT of research – a single nuke or even a couple of nukes is not an extinction level event for humanity. It would be a slaughter, but not an extinction level event.

        As far as extinction level events go, climate change is going to kill far more people and have a longer term effect on our planet’s livability in general than a few nukes accidentally going off would, even if they were in heavy population centers.

        (Now an actual war, on the other hand – that’s an extinction level event. Because of MAD, if two nuke-having countries ever get into a war where one decides to drop nukes, we’re screwed.)

    • cleek
    • postmodulator

      Who is still manufacturing 8″ floppies?!

      If the military treated them like they treated vacuum tubes, they have a warehouse full of them somewhere.

      Trivia time! Until the late 80s the only purchasers of vacuum tubes in this country were the US military and makers of guitar amps. Tubes wear out, so any device that uses them eventually needs replacements. I think it was in 1990 that the military decided they could keep their vacuum tube computers running for a century even after a nuclear war, so they quit buying them. All the American manufacturers of vacuum tubes closed up shop, because the amplifier business was a minuscule fraction of the DoD business. For a couple of years we all worried that we’d have to quit using tube amps. (Ninety-five percent of serious guitar players refuse to use anything else.) Russian and Chinese manufacturers picked up the slack, but it is widely acknowledged that the quality is not as good — there are companies that just buy up tubes, test them, and resell the ones certified as good at a substantial markup.

      • Captain Oblivious

        This brings up another point about the Series/1. It had printed circuit boards, but in those days the transistors and capacitors were mostly still not miniaturized. These kinds of components are relatively simple to manufacture and are still available. Clip the failing part off the board and solder on a new one.

        The difficulty is in figuring out what the failing component is.

  • Crusty

    I don’t really know what any of this means, but it is possible that it may be preferable for there to be little to no involvement with computers in launching the nukes. A failure whereby nukes are launched when they are not intended to be launched is obviously terrible. A “failure” where nukes don’t launch automatically but need a little more human* oversight might be a good thing – or, as the kids say, a feature, not a bug.

    *It had better be the right human.

  • Captain Oblivious

    My programming career started out on IBM Series/1 computers. They were rugged little fuckers. I am not at all surprised (and less worried than I suspect most people are) that they’re still running. I knew first-hand or heard from reliable sources of Series/1s that had been sprayed with water, choked with coal dust, dropped out of trucks, or knocked over, and after cleaning them up a bit and reseating the cards and cables, started back up just fine.

    At one place I worked, the Series/1 was in a room with no A/C where the temps would get up around 120 F in the summer. For a 1970s-era computer to keep chugging along in such heat was pretty amazing.

    On the downside, when they did fail, they could be a pain in the ass to diagnose. About all you could do was swap out cards until you found the failing one.

    My main concerns now would be (a) finding programmers who really know the Series/1 — most of them are going to be retired or close to it by now, and there were very few of us to begin with, although I suspect the DoD code is rock stable by now and needs no maintenance — and (b) replacement parts.

    The first Series/1 I worked on had 128K of memory and no hard drive and no operating system. You selected the floppy containing the program you wanted and booted off that.

    The first hard drive we got was a single-sided 15″ 9Mb drive that weighed ~110 pounds. The capacitors were as big as paper towel rolls, and the read-write head was the size of your fist. It cost several thousand dollars in 1970-ish money. We were ecstatic to have it — no more booting off floppies.

    ETA: There was a special super-duper-rugged version developed specifically for the military.

    • Michael Cain

      The software probably hasn’t been modified in years. Which, broadly speaking, means it’s about as bug-free as it’s likely to get, and has many fewer bugs than new software running on contemporary hardware would have.

      • Captain Oblivious

        And it was probably pretty simple to begin with (for assembly-language code). Reading between the lines, I’m guessing all these boxes do is act as message relays/network interfaces.

        The #1 use of Series/1s was to provide communication interfaces between computers, or peripherals and computers, that otherwise could not talk to each other.

      • Bruce B.

        My father worked at NASA on ranging systems for space probes, and explained it that way to me back when – that the oldest hardware and software for which they can reliably get parts and that passes endurance tests is almost always the best choice, because it’s so well understood.

    • Philip

      Yeah. Fwiw the consensus in my corner of the infosec world was that this is actually really reassuring, because god knows none of us trust anything modern.

      • Ramon A. Clef

        That was the upshot of discussion in my office, as well. What they’re going to replace the old system with worries us much more than what they’re replacing.

    • Back in my days as a nuc submariner, taking a tour of the reactor monitoring and control systems was like taking a history tour of electronics. We had everything from mag-amps to 8080 microprocessors and micro-electronic circuit boards, and I was quite fine with that! Much easier to troubleshoot and fix.

    • Sly

      My main concerns now would be (a) finding programmers who really know the Series/1 — most of them are going to be retired or close to it by now, and there were very few of us to begin with, although I suspect the DoD code is rock stable by now and needs no maintenance.

      Even if it wasn’t rock-solid, NORAD isn’t a private tech business, so their ideal candidate isn’t a 22-year-old with 12 years of experience who doesn’t need any training.

  • junker

    In an earlier thread someone was complaining about Obama going to Japan to talk about nuclear weapons while also calling for an increase in the budget for them… well, this is why. Not because he wants to make it easier to launch nukes but because our ridiculously outdated nuclear infrastructure is horrifyingly dangerous.

    It also blows my mind that, considering how military-crazy most of the federal government is, they can’t get the money to keep up on this.

    • Captain Oblivious

      There’s something to be said for if it ain’t broke, don’t break it.

      As I said in a previous post, my main concern would be finding replacement parts.

      Replacing these units means developing, testing, and installing something new. This in turn means introducing the potential for bugs and new failure points.

      Also, presumably these S/1s have been kept for a reason. As I also stated, the #1 use of these boxes was as interfaces, not data processors/storage units. It’s possible that the equipment they are interfacing with is also very old.

      • bender

        Not only a potential for new bugs and failure points, but a potential for sabotage.

        Our domestic manufacturing capacity is orders of magnitude less robust than it was in the 1970s. Some of the parts for the new system are going to be imported. Also, even though the Vietnam War was going on in the early 1970s, I believe that it would have been more difficult for a foreign power to find engineers and technicians willing to turn traitor for money then than now.

    • In an earlier thread someone was complaining about Obama going to Japan to talk about nuclear weapons while also calling for an increase in the budget for them

      I think that was me :-)

      And I still stand by that comment. We’d be better off reducing the stockpile to 100 nucs instead of spending a trillion bucks upgrading what we have.

  • N__B

    Yuugest disks! And floppiest!

    • Hey! There’s nothing floppy about Mr. Trump!!

      • Mae West

        Is that a floppy in your pocket, Donnie, or are you just sad to see me?

        • tsam

          HAHA! Mae West was a true American hero.

    • tsam

      MTV used to air this British comedy called The Young Ones back in the 80s. Rik Mayall was the only name I could remember from it, but my brother and I loved this show. I told you that story so I could tell you this one–it needed time-era context, also an indication that I’m kind of weird. I don’t know if you guys know that about me or not.

      Anyway, the hippy character (of the hippy, punk rocker, angry wavo type, and I don’t know what Mike was) used the term “floppy disks” as a swear word.

      This concludes the most interesting post of all the posts contained within this specific post. Thank you and good day.

      • Mike was the conservative/Tory

        • Davis X. Machina

          Mike was an embryo barrow boy — non-posh self-made businessman, often on the shady side of the stock and forex markets, full of US-based marketing jargon, but only one step removed from being a costermonger. Thatcherite Britain idolized them.

          • tsam

            British version of our YUPPIES from the 80s, then?

            • Davis X. Machina

              More class-conscious. Yuppies — at least according to the stereotype I remember — were upper-middle class, and well educated. Barrow boys were more aggressively and obviously on the make.

              The kid in the original ‘Wall Street‘ film is close…

      • Keaaukane

        Vivian! Where’d you get the howitzer?

        • tsam

          I forgot SPG the hamster(?)!

          “Will you all shut up? I’m trying to be ill!”

          • gorillagogo

            I’ve got to stop sniffing this ajax!

      • TribalistMeathead

        Considering I know next to fuckall about Great Britain in the 1980s, I sure do love The Young Ones.

      • margin of error

        All this reminded me that Rik Mayall, who died two years ago, also played the unforgettable Lord Flashheart in Black Adder II (and again in IV).

        “I’ve got a plan, and it’s as hot as my pants! Woof!”

    • CrunchyFrog

      I’ll bet most of the readers here have never seen an 8″ floppy disk, and when reading the article pictured something else. The first PCs in 1981 used the 5.25 inch version, which were still floppy, and if you had a desktop computer built before 1996 or so it probably still had a 5.25 floppy drive for compatibility reasons with older backup disks and software. The most common disks starting in the late 1980s were the 3.5″ plastic ones that for reasons unknown were still called floppy but didn’t flex like the earlier versions. Iomega created a proprietary zip drive which was popular in the late 90s and early 2000s for backups, but the invention of the USB stick killed that product line off.

      Those 8″ floppies probably are the state-of-the-art (for the late 1970s) DSDD format that could hold up to 1.2 MB unformatted. Before 1976 the most the disks could hold was 250 kb. This underscores just how relatively simple the software in this computer must have been.
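For what it’s worth, those capacity figures check out arithmetically. The classic IBM single-sided, single-density 8″ layout (77 tracks of 26 sectors at 128 bytes each) gives almost exactly the 250 kb figure cited upthread, and doubling both sides and sector size lands near a megabyte formatted:

```python
# Capacity arithmetic for the classic IBM 8" floppy formats (illustrative;
# exact sector layouts varied by drive and controller).

# Single-sided, single-density (IBM 3740): 77 tracks x 26 sectors x 128 bytes
sssd = 77 * 26 * 128
assert sssd == 256_256          # ~250 KB, the pre-1976 figure cited upthread

# Double-sided, double-density: 2 sides x 77 tracks x 26 sectors x 256 bytes
dsdd = 2 * 77 * 26 * 256
assert dsdd == 1_025_024        # ~1 MB formatted; ~1.2 MB unformatted
```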

      • Captain Oblivious

        I suspect the only thing stored on the floppy is a standalone bootable app written in S/1 assembly language.

        The Series/1 was a 16-bit machine with a 16-bit memory bus, so without doing some tricky jumping through hoops, the largest app could be only 64K. It was possible to write larger apps, but I doubt that was the case here.
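The 64K ceiling is just address arithmetic – a 16-bit address can name only 2^16 distinct byte locations:

```python
# Why a 16-bit machine caps a single program at 64K: with 16 address bits
# there are only 2**16 addressable byte locations.
address_bits = 16
max_bytes = 2 ** address_bits
assert max_bytes == 65_536      # = 64 KiB
# Anything larger needs bank switching or overlays -- the "tricky
# jumping through hoops" referred to above.
```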

      • N__B

        I’ve seen them. Last time was in 1981 or so.

      • jmauro

        The 3.5″ disk media was the same material as the 8″ and 5.25″ disks. The casing was different, to better protect the media, since at higher densities the disks weren’t too tolerant of bending, dust, and such. Because it was the same material/media, it kept the name floppy disk – if you disassembled the disk, the media was still very bendable.

        Also it used the same interface as the 5.25″ floppy drives so all the BIOS commands were the same between the two.

        • Captain Oblivious

          You could actually bend the case of a 3.5″ floppy.

          I found this out during the 1989 Loma Prieta quake (the “World Series” quake). I had left a floppy half-inserted into my Fat Mac on my desk at home. When I got home, the Mac was tipped over on its face, with the floppy still in it and bent at about a 30-degree angle.

          The case actually straightened itself out in a few days.

          And the Mac was fine.

    • Just_Dropping_By

      If Trump is 8″, then that’s pretty yuuuge, even if it is floppy! (And, having made that joke, now I’m going to throw up.)

      • vic rattlehead

        I think it probably looks more like this. Why do you think Ronald T. Dump got so upset?

        Warning: NSFL http://www.gq.com/story/donald-trump-nude-micropenis-portrait-is-free

        • tsam

          If there’s a new way
          I’ll be the first in line
          But it better work this time…

          Nice nym, btw…

          • vic rattlehead

            Peace sells. But who’s buying?

            I have a tattered old t-shirt *somewhere* around here…

  • rwelty

    i kind of wonder what the system is written in. it’s not going to be a simple hardware swap; the software environment is also completely obsolete, and a ground-up rewrite will be in order.

    • i kind of wonder what the system is written in.

      Visual ADA.

    • Vance Maverick

      Yeah, the existing code will probably need to be ground up.

      Maintaining legacy technology is a hard problem. At the same time (or rather, because of that) I would expect government agencies to plan for gradual turnover and upgrade in every system.

      • CornFed

        Hahaha! The idea of government agencies being able to plan for anything beyond the current budget horizon is laughable. The idea of any entity with a complex procurement process planning for gradual turnover and upgrade cycles for equipment with an innovation cycle shorter than a procurement lifetime is just asinine.

      • mikeSchilling

        Yeah, the existing code will probably need to be ground up.

        Or at least shredded.

    • Captain Oblivious

      Series/1 assembly language.

      • MD Rackham

        As I recall, the Series/1 had RPG II as another option.

        • Captain Oblivious

          Also COBOL I, some version of FORTRAN, EDL (a proprietary language for the Series/1 EDS OS), and eventually C. However, almost everything that had to run fast and occupy a small memory footprint was written in AL or EDL.

          Also, the compilers for these other languages were hugely expensive and sloooooooooooooooow. Programmers who knew S/1 AL or EDL avoided using higher-level languages like the plague. The higher-level languages were largely confined to business apps, and those often didn’t run very well.

          The bank I worked at briefly had an ATM network server that was mostly written in COBOL. Because it was written in COBOL, and therefore had a huge memory footprint from all the crap it had to link in, it could handle a maximum of 42 ATMs. (They eventually replaced the Series/1s with a Tandem, which turned out to be an expensive pile of shit.)

        • …sweet. I work with (and can program a very little) RPG IV every day.

        • Captain Oblivious

          Adding: I believe (at least on the EDS OS) it was not a full implementation. You had to wrap it in COBOL, and pretty much only the report-printing part was supported.

          That may not have been the case on the other OS (RPS). I didn’t do much work on RPS so I don’t know.

  • Bruce Vail

    Of course the purpose of the GAO report is to provide justification for spending additional $billions$ on upgrading the nuclear arsenal.

    • Murc

      You say that like it’s a bad thing. Our nuclear arsenal desperately needs upgrading. Some of those silos are in terrible shape and there is bullshit going on like “the doors no longer work properly and are kept propped open with rocks.”

      • Bruce Vail

        Right you are. Spending money upgrading the arsenal would absolutely be justified — if we reduced the overall size of it at the same time.

        • Murc

          Y’know, I honestly have no idea what the Obama Administration has been doing re: nuclear disarmament. I mean. I know about the Iran deal, of course. I more meant about how they are on reducing our own ludicrous nuclear footprint and coming to bilateral deals with Russia, China, India, etc.

          Just one of those things I haven’t kept track of.

          • Pseudonym

            I assume New START is still operative?

          • bender

            The Obama administration has done a bunch of effective work in the continuing project of securing fissionable material from countries that have decided not to have nuclear weapons any more, and turning it into reactor fuel.

            As far as reducing our own stockpiles, I don’t know but it doesn’t seem to have been a priority.

  • Morse Code for J

    Every radar display in my air traffic control facility is powered by a 486, basically.

    The media and Congress repeatedly bash the FAA for its “World War II-era radar system,” because we combine primary target returns and secondary transponder returns for airplane identification. But it works, and is up 99.99994% of the time. Old tech is usually robust tech.
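To put that uptime figure in perspective, a quick back-of-envelope calculation (my arithmetic, not an FAA number): 99.99994% availability works out to under twenty seconds of downtime per year.

```python
# What "up 99.99994% of the time" means in practice (back-of-envelope).
SECONDS_PER_YEAR = 365 * 24 * 3600       # 31,536,000

downtime_fraction = 1 - 0.9999994        # 6e-7 of the time unavailable
downtime_seconds = downtime_fraction * SECONDS_PER_YEAR

# roughly 19 seconds of downtime per year
assert 18 < downtime_seconds < 20
```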

    • Captain Oblivious

      I think younger people, even tekkie types, who weren’t around in the Golden Heyday of Floppy Drives, don’t realize that as computers have become smaller and more powerful, reliability issues have gotten worse, not better.

      The hardware these days is definitely more reliable on the whole, especially data storage systems.

      However, the massive increase in computing power has led to massively larger and more complicated apps, and this in turn means more bugs and failure points.

      The problem with handing an organization like the DoD or FAA a 4.2 GHz quad-core 16GB PC is that they’ll start dreaming up all the bells-and-whistles shit they think they can do with it.

      • cleek

        the new version will be a cluster of microservices communicating via REST interfaces with an HTML5 front end, and a Hive datastore. it will take 5,000,000 lines of Java and Python and will require a dedicated fiber connection to GitHub in order to process the constant stream of updates to the 200 different open source frameworks it uses.

        it will be considered obsolete in 8 months.

        • postmodulator

          That’s the scariest fucking thing I’ve seen this morning, and I just watched It Follows.

          • Captain Oblivious

            The scariest part is if they do replace it, the code will be written in Ada.

            • postmodulator

              Damn, I almost made an Ada joke there and I thought it was too obscure for the room.

            • Colin Day

              Naive question: What’s wrong with ADA?

              • Redwood Rhiadra

                Naive question: What’s wrong with ADA?

                It’s written in Wingdings (ok, not really wingdings, but all Greek letters and math symbols not available in a conventional font).

            • mds

              Ah, yes, it’s too bad they didn’t beat INRIA to the punch and call it CAML, to reflect how it was designed by committee.

              (That said, would Ada really be scarier than Java?)

        • it will be considered obsolete in 8 months.

          More work for the laid-off coalminers, then!

        • tsam

          You’re joking, right? Please tell me you’re joking…

  • Rob in CT

    I’m actually not worried by this at all. If anything, I’m reassured.

    LOTS of things run on “1970s technology” by the way.

    Old tech is usually robust tech.

    Yeah, this.

    • Linnaeus

      Slightly OT, but I was watching a talk by the economist Robert Gordon (in which he was arguing against the notion of a Second Machine Age that will bring about unparalleled prosperity) and he pointed out that we actually haven’t improved much, if at all, on a lot of the main technologies that we use today.

      • Rob in CT

        This is the guy who has been arguing for the idea of a technological advancement (and thus economic growth at least in developed countries) slowdown, right?

        • Linnaeus

          Yes, that’s him. In opposition to folks like Erik Brynjolfsson and Andrew McAfee.

      • bender

        The rate of technological advancement has been slowing for about a century. It peaked around 1890. Think of all the different useful inventions that came out of Edison’s lab.

        If you walked around New York City or Chicago in 1916, you would understand how the city worked, and most of the tech that makes cities work in 2016 was already present (electricity, elevators, underground sewers and water pipes, subways and streetcars, etc.). If you time traveled one more century back, to 1816 London, you would be practically back in the Early Modern Period.

        There have been a few new technologies developed since the 1930s, such as everything that’s followed from cracking the DNA code, but the breakneck pace of change of the late nineteenth century is over.

        People have finally begun to notice this, and that’s why Where’s My Flying Car? jokes are getting old hat.

    • Philip

      LOTS of things run on “1970s technology” by the way.

      Banks keep trying to convince schools to bring back COBOL in their curriculums because they’re still running on it, and everybody who can write it keeps retiring.

      • Rob in CT

        A Big Insurance Company which shall remain unnamed also, too (and, AFAIK, much of the rest of the industry).

        Customer-facing stuff is all pretty web stuff. But it connects back to 1970s-era systems.

      • Vance Maverick

        What’s depressing here is not that an old language is being kept on life support, but that banks can’t expect new grads to be able to learn a programming language on the job.

        • Murc

          This seems like an unwarranted slur against CS grads. Most of them would be thrilled to learn a new skill on the job.

          What they would be less thrilled about is spending a lot of their own time learning a mostly obsolete language in order to maybe get a job that doesn’t have a lot of cross-industry applicability.

          • Vance Maverick

            I hope you’re right. But old or new, part of programming work is inevitably learning skills that won’t transfer — sometimes because they’re already obsolete, but also because they’re going to be obsolete soon, or the tools were made in-house.

            • Murc

              Yeah, but there’s a difference between “this job, which I am being paid for, is having me learn skills that won’t transfer” and “I already have many skills that are applicable to many other jobs; why should I learn mostly obsolete skills that won’t transfer to maybe get a very niche job?”

              I’d learn COBOL too if my job were paying me to do so. I’ve no interest in learning it on my own time, with my own money; I could spend the time learning much more monetizable programming languages instead.

          • Philip

            +1. Also, because of the crap that is CS hiring, learning COBOL (or working at a big non-tech-oriented company in general) can be actively detrimental to your career because ignorant people at the “cool” companies will make assumptions. So it’s not just learning a potentially-non-transferable skill, it’s spending time learning a skill you might have to actively avoid discussing if you’re looking for a new job down the road.

            • skate

              Meanwhile my office is advertising for a FORTRAN coder, and the ad indicates prior experience with that language not required.

      • Murc

        Banks keep trying to convince schools to bring back COBOL in their curriculums because they’re still running on it, and everybody who can write it keeps retiring.

        Maybe they could pay for internal training. I bet they could find the money somewhere. You know. Only fill their swimming pools half full of Dom Perignon. Switch to domestic instead of imported hookers.

        • wca

          Maybe they could pay for internal training.

          That’s just crazy talk.

      • Captain Oblivious

        COBOL is an excellent programming language for business apps. I don’t know why people think otherwise.

        It’s not hard to learn. The real problem here is that banks are cheap-asses that don’t want to pay to train programmers. (I know — I worked at a big bank for a while). But if you know another programming language, you can pick up COBOL pretty quickly.

        I had to learn COBOL in two days once. I took the IBM COBOL manual home for the weekend, read it, and came back on Monday and wrote, compiled, and debugged a five-up label printing app in one day. This was with each compile taking about 40 minutes to run.

        • CrunchyFrog

          The main problem with COBOL is/was inefficiency. A lot of that is inherent in the language design, but a lot of it came from COBOL programmers not understanding how the computer would implement what they wrote.

          I remember, for example, being asked to diagnose why a certain application was bringing the computer to a standstill. I found that the programmer – a woman who had learned COBOL after 25 years as a grade school English teacher – was sourcing in every data declaration in the shared libraries “just in case” she might use them, and thus was forcing massive memory allocations. And because she built her programs to stop and start frequently, all that allocation and deallocation happened over and over, eating computer cycles and causing lots of memory thrashing. Well, *she* didn’t know – no one had told her. She was treating the program like an English essay.

          That was the most extreme example I know, but in years of diagnosing COBOL performance problems, the biggest cause was programmers not knowing what the COBOL text – logical though it was – actually caused the computer to do.

          • Captain Oblivious

            Another common failing of the poorly-trained COBOL programmers was overuse of subroutines and the PERFORM statement. On IBM mainframes in particular, this could bring the system to its knees with page swaps/faults.

            My favorite beginning COBOL programmer story is of the intern who was given the task of writing an app to read in a bunch of 16-digit credit card numbers stored as EBCDIC strings and convert them to BCD. IBM COBOL allowed you to do arithmetic on EBCDIC and BCD. Her solution was to set the BCD value to 0, then keep subtracting 1 from the EBCDIC value and adding 1 to the BCD value until the former was 0.

            One of the mainframe systems programmers estimated the average card number would take about 40-45 days to convert.
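            The estimate is plausible because the intern's method is effectively unary counting: the work grows with the value of the number, not its digit count. A rough Python sketch of the two approaches (the function names are my own, purely for illustration; this is not COBOL):

```python
def packed_bcd(digits: str) -> bytes:
    """Direct conversion: pack two decimal digits per byte, e.g. "1234" -> 0x12 0x34."""
    if len(digits) % 2:
        digits = "0" + digits  # pad to an even number of digits
    return bytes((int(digits[i]) << 4) | int(digits[i + 1])
                 for i in range(0, len(digits), 2))

def unary_bcd(digits: str) -> tuple[int, int]:
    """The intern's approach: decrement the source and increment the target
    until the source hits zero. Returns (result, loop iterations)."""
    n, bcd, steps = int(digits), 0, 0
    while n > 0:
        n -= 1
        bcd += 1
        steps += 1
    return bcd, steps

# A 16-digit card number means on the order of 10**16 loop iterations,
# which is why the runtime estimate came out in days, not microseconds.
```

            The direct packing touches each digit once; the unary version runs once per unit of the value, so the gap between the two is roughly fifteen orders of magnitude.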

            • CrunchyFrog

              That’s a cute story (and very believable for anyone who’s been there), but of course that could happen in any language. More likely to happen with a COBOL programmer, since they were less technical.

              • Captain Oblivious

                I have a lot more stories.

                I worked in a couple large shops where the “cold mirror” test was the primary interview technique for screening entry-level programmers.

            • It does have a kind of horrifying ingenuity and simplicity though.

              • Captain Oblivious

                MOVE EBCDIC_CARD_NUMBER TO BCD_CARD_NUMBER would have been a lot simpler.

            • Murc

              My favorite beginning COBOL programmer story is of the intern who was given the task of writing an app to read in a bunch of 16-digit credit card numbers stored as EBCDIC strings and convert them to BCD. IBM COBOL allowed you to do arithmetic on EBCDIC and BCD. Her solution was to set the BCD value to 0, then keep subtracting 1 from the EBCDIC value and adding 1 to the BCD value until the former was 0.

              Is it wrong that I read that and thought “this person will probably actually go far as a programmer once they learn some more syntax and how to swallow their pride and ask others for help; that solution is a logical and rather ingenious way to get around her shortcomings with the language.”

              • Captain Oblivious

                I don’t know what happened to her in the long run (different department in the company — she worked in mainframe app programming, I worked in network and data entry systems programming, not even the same building), but she was still working there when I left a couple years later.

          • the programmer – a woman who had learned COBOL after 25 years as a grade school English teacher –

            On the other hand, the only COBOL programmer I’m aware of knowing, also a woman, quit her programming job about 25 years ago, after her—still flourishing—job as a romance novelist took off.

        • postmodulator

          But if you know another programming language, you can pick up COBOL pretty quickly.

          This is true of a hell of a lot of languages. Unfortunately HR departments aren’t aware of it. Since they’re the only people who actually need to be aware of it, this is sort of a bad thing.

          • “You coded in version 2.1b? Sorry, we’re looking for someone with 2.1c experience. Preferably three years worth.”

            (note that version 2.1c released six months ago)

            • Philip

              A few years ago I saw “must have 7 years experience in Go.” At the time, Go had only existed for 5 years. The rumor has always been these listings are attempts to do an end-run around H1B visa rules.

              • postmodulator

                I have witnessed such a listing being used as an end-run around H1B visa rules. Not a literally impossible level of required experience, but an extremely unlikely one.

                And it’s been going on long enough that I remember seeing a “five years of Java required” listing in 1998.

              • CrunchyFrog

                It’s more than a rumor. At my previous company we hadn’t hired anyone from the outside in years (well, below the VP level anyway) – just constant downsizing. However, we always had 3-4 jobs posted in order to renew existing H1Bs.

                • postmodulator

                  At my previous company we hadn’t hired anyone from the outside in years (well, below the VP level anyway)

                  The problem with late capitalism, short enough to fit in a Tweet.

              • Redwood Rhiadra

                My boss had to get his H1B renewed a few years back, and that’s exactly what they did – “must have 15 years experience in product X” was a requirement in the job listing, which was only possible if you happened to be one of the original developers before it was published.

            • Captain Oblivious

              I was told I didn’t get a C programming job once because my ten years or so of C experience was “out of date”.

              Not C++. Not C#. Plain old C.

              • mds

                In fairness, they probably urgently needed someone who was demonstrably up to speed on C11. Imagine the chaos if their new hire still used gets.</sarcasm>

                • Captain Oblivious

                  And here I thought it was because I had no experience with char16_t.

                • Pseudonym

                  char16_t? That can’t even represent emoji!

          • Philip

            One of the most infuriating practices in programmer hiring is filtering resumes on listed languages. If it’s C or C++ or similar, I can maybe see it because safe manual memory management is so alien if you aren’t used to it. But “oh, you don’t know Java and this is a Java shop” is absurd.

            • Captain Oblivious

              The other thing they do is list a bunch of “productivity” tools (quotes intended) and third-party frameworks that the “ideal” candidate will possess an intimate, working knowledge of.

              Often the chances of any human being on this earth even knowing what most of these things are, let alone knowing how to use them, are ~0.

            • Captain Oblivious

              When I was hiring programmers, I was mostly looking for talent. It was nice to get the experience, too, but there were plenty of interviews where it became obvious that I was dealing with somebody who knew a lot of programming languages and had no clue how to write code.

          • CrunchyFrog

            Blame Sun and UNIX. Before then a programmer was a programmer and anyone could learn anything. In fact, a new job usually meant learning a new OS and all that came with it.

            Sun sold the world on the idea of programmers as interchangeable parts, never needing training, as long as they knew C.

            • Captain Oblivious

              My first programming job was at a large company run by programmers. It was a sweatshop, and the company had contractual commitments to keep its network up 99.9% of the time, which put a lot of stress on the technical staff, but at least the management appreciated that not everybody could do what we were doing.

              I left to take what I thought would be a better, less stressful job. The management there were not programmers. They couldn’t understand that no, you just can’t hire someone who’s taken a 90-day evening programming class at the local community college and expect them to maintain a million-line financial services app.

              This was long before Sun.

              I think the real problem is that most companies are run by complete fucking idiots who got to the top by fucking over everybody else and/or sleeping with the right people and/or being born into the family that owns the business. (E.g., T-Rump).

              I’ve generally found that the stupider the person I’m dealing with, the less respect they have for people who actually know how to do something that requires a high degree of skill or knowledge.

              And this, more than anything, has led to the belief in many quarters that programming can’t really be all that hard, can it?

              • mds

                I think the real problem is that most companies are run by complete fucking idiots

                Yeah, that’s pretty much the real problem in all kinds of ways.

      • cleek

        the USPS uses COBOL for a lot of its stuff, so developers who want to use their data have to translate the USPS’s sample apps from COBOL.

        i spent 6 months doing that not too long ago.

        • Captain Oblivious

          I would think the hardest part of that is getting used to COBOL’s data declarations.

          The rest is pretty much plain English.

          • cleek

            the string handling in COBOL is unusual but very powerful, and translating that to C++ took a lot of effort: fixed length strings, with all kinds of neat slicing and splicing functionality.
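            For anyone who hasn't seen it: COBOL's “reference modification” takes 1-based (start:length) slices of fixed-length, space-padded fields. A toy Python emulation of those semantics (the class and method names are my own invention, purely to illustrate):

```python
class PicX:
    """Rough stand-in for a COBOL PIC X(n) field: fixed length, space padded."""

    def __init__(self, length: int, value: str = ""):
        self.length = length
        self.data = value[:length].ljust(length)  # MOVE truncates or pads

    def ref_mod(self, start: int, size: int) -> str:
        """FIELD(start:size) - COBOL reference modification is 1-based."""
        return self.data[start - 1:start - 1 + size]

    def splice(self, start: int, size: int, value: str) -> None:
        """MOVE value TO FIELD(start:size): overwrite a slice in place,
        padding or truncating the source to fit the slice."""
        i = start - 1
        self.data = self.data[:i] + value[:size].ljust(size) + self.data[i + size:]
```

            Reproducing those fixed-buffer semantics on top of C++'s variable-length std::string is exactly the sort of translation work being described.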

      • mikeSchilling

        The new thing is object-oriented COBOL, also known as ADD COBOL, 1 GIVING COBOL.

        • Actually I think it was fully 20 years ago that I saw my first (and, thanks to quick action, my last) advertisement for OO-COBOL.

      • IS

        The company I used to work for used mainly MUMPS. It knew it wasn’t getting anyone with experience in it, especially for entry-level positions. So it actually had pretty good training for both the language and all the in-house stuff.

  • Perhaps nuclear war itself is obsolete, no?

    • tsam

      When I saw the earlier post that led with a tweet from Sanders saying that climate change is the biggest threat to our planet, it got me thinking that it’s rather amazing, in my lifetime, to have seen nuclear war supplanted by climate change as a world-ending threat.

      Nuclear war may be obsolete, but it’s certainly still a scary threat. The slow suffocation in a thinning atmosphere and rising oceans, mixed with the inability to produce enough food for everyone is 10 times as scary.

  • Linnaeus

    I’m thinking that we should replace the current computer technology that we use to run our nuclear arsenal with something more advanced. Something that “thinks”. We could call it…let’s see…War Operation Plan Response – WOPR. Sounds good to me!

  • N__B

    Given the beautiful photo in the OP, let me just say here how happy I am that “In your guts, you know he’s nuts” has been recycled after a 52-year hiatus.

  • Crusty

    Y’all need to be watching The Americans on FX. It’s got everything: nuclear threats, bio-weapons threats, honey traps, floppy disks, Keri Russell, Matthew Rhys, Frank Langella, martial arts, necklacing, mail robots. Season finale is next Wednesday, but there’s time to catch up over the summer.

    • Linnaeus

      That’s a fantastic show.

  • Dennis Orphen

    The rule of synchronicity almost dictates that I’m going to find a box of NOS 8″ floppies at a yard sale this morning.

  • AcademicLurker

    The next generation system should be based on Android. Nothing says “piece of mind” like a nuclear weapons launch system that can interface with Twitter…

    • weirdnoise

      Nothing says “piece of mind”


    • Ramon A. Clef


    • Hogan

      We can call the next generation of strategic bombers “Angry Bird.”

      • Pseudonym

        I like it, right up there with Planey McPlaneface and the B-21 Ultimatum.

    • mikeSchilling

      “Send the emergency override! Now!”

      “Working on it … Shit. We’re all dead.”


      “After I encrypt it, it’s 141 characters.”

  • Rob in CT

    Wildly OT, but I just have to mention it.

    Bumper stickers on a car I passed on the way to work today:

    “Feminists for Life” and “Team Trump”

    How does the person who put those stickers on that car manage to tie their shoes in the morning? Hayzus Christo!

    • Captain Oblivious

      They wear slip-ons.

    • NonyNony

      A “Feminist for Life” is a pro-life right-wing woman trying to “reclaim” feminism for conservatism. Which is what “everyone knows” feminism was when it started back with the First Wave feminists.

      It’s the same kind of Orwellian wordplay they like to do with everything, so I wouldn’t be at all surprised to see a Trump sticker next to it.

      • rea

        “Feminist for Life” is a pro-life right-wing woman

        Or rather, an anti-choice rightwing woman. Or maybe just a chicken who loves Col. Sanders

  • The fact that they don’t use Windows is probably why we aren’t flecks of radioactive dust, but the DoD intends to upgrade its system by the end of FY 2017, so who knows.

  • Woodrowfan

    at least it’s not trying to upgrade to Windows 10 every half hour.

    • tsam

      But Windows 10 is awesome tho.

      • Linnaeus

        I suppose, though my work laptop very much did not like my attempt to upgrade from 7 to 10.

        • tsam

          One advantage I have is building all my own computers. So I don’t have any proprietary motherboards–and I always have drivers around for them or know where to find them in a big fat hurry. I stick with Intel and Intel chipsets, and that keeps me stable and quick to fix problems.

          • Pseudonym

            Which is fine if you’re one of us approximately five remaining people who use desktops at home. When it comes to laptops I stick with Apple; for rackmounts, SuperMicro.

      • Pseudonym

        Windows is, ironically, a bit like the Giants: they seem to get it right every other release.

        • Captain Oblivious

          That was also true of Word for the first six iterations or so.

          Also, Minecraft, which is now an MS product, seems to have fallen into that pattern. 1.8 was stable and fast enough, 1.9 was a slow, glitchy mess, and 1.10 (in pre-release as of yesterday) looks a lot better (although still some lighting glitches).

          • NonyNony

            Microsoft would be a LOT better off if they made every other release a “public beta.” Because in practice that’s the way they seem to do their releases.

        • tsam

          Definitely. Nothing compares to the Vista release though. That new shell just blew up the whole world.

          By and large though, the problems with Windows are BTTKAC

    • Hogan

      MS just went ahead and upgraded my laptop at home. I guess it’s an opt-out kind of thing.

      I remember when people gave commands to computers, instead of the other way around.

      • Captain Oblivious

        I’m somewhat sympathetic to MS trying to get people off W7. I don’t agree with the bullying tactics — it should be enough that they’re giving it to you for free. And, seeing as it’s free, there’s no reason for the vast majority of W7 users to not upgrade.

        • NonyNony

          I was basically forced into upgrading from Windows 8 to Windows 10 by my tax software this year. It wouldn’t install on Windows 8, and eventually I gave up and gave Microsoft the go-ahead to install Windows 10.

          I have to say – it’s the first version of the OS since XP that feels like they bothered to actually work on making it usable and stable instead of “glitzing” up the interface and making superficial changes. I’ve been mostly happy with it.

      • tsam

        Happened to a couple of my work machines–of COURSE it was the two people for whom any kind of change essentially turns them into catatonic lumps.

  • Jeff Ryan

    In the spirit of the posts preceding, you have not represented the DoD rationale at all well. Hell, when I saw the original story’s headline, I thought “Great!”

    I wouldn’t object to the use of the Enigma machine if it made the process more secure.

    Now, though, we must all fear that the launch codes will be accessible from “The Cloud.”

    Yeah, that’ll work.

  • sonamib

    Like many others here, I find this state of affairs quite reassuring. Didn’t we use to fear that the machines controlling our nukes would take over and doom us all? You know, like Skynet in the Terminator series? Well, there’s no chance of that ever happening if the launch system relies on computers from the 1970s.

    On a related, OT note, there are some corners of the Internet that freak out at the idea of a sentient AI. Because a greater-than-human intelligence will end up enslaving us all. But… how would that even happen? As we all know, it’s not sufficient or even necessary to be intelligent to rule the world. Quite a lot of stupid people have held huge amounts of power. And dolphins are quite smart, but they don’t rule the seas with an iron fist (iron fin?).

    The argument seems to be :

    1. Sentient AI
    2. ??????

    Can anyone fill in the details of number 2? Because I don’t see how some sentient program in some computer or data center could possibly rule the world.

    • Murc

      You don’t understand, sonamib. The Omnics are takin’ our jerbs.

      • sonamib

        Automation is a concern if it means that the number of available jobs plummets and there’s no safety net for the unemployed.

        And to riff on your comment, I wonder what would happen if the robots that replaced us at our jobs became sentient. Would we become bigoted against them? Would the robots consider striking for better working conditions? So many questions.

        • Murc

          Would we become bigoted against them?

          Yes. Absolutely. We can become bigoted against anything and really fast. In fact, I would submit that not only would there be widespread bigotry against synthetic intelligences, but that many oppressed groups would actually gleefully join in that oppression.

          • sonamib

            And the best way for the bigotry to start is to believe that the machines have a secret plan to take over the world and enslave us all! We could even recycle some anti-Semitic tropes!

            • Murc

              The Protocols of the Elders of Cylon.

              • postmodulator

                I have a delivery of one Internets for a Mr. Murc, can you sign for this?

    • Linnaeus

      As long as we have the Turing police to protect us, we should be okay.

    • Based on the Frankenstein principle, any AI we could create would inevitably start out or become implacably hostile and an existential threat to humanity as a whole.

      • Hogan

        Because based on humanity, we would inevitably treat it like shit.

        • postmodulator

          Well, think about who works in IT in this country. Think how many people teaching the new AI ethics would call “Atlas Shrugged” their favorite book.

        • sonamib

          As long as we keep them oppressed, there won’t be any problems for us! Let’s hope they don’t pull some kind of October Revolution, though.

    • tsam


  • Here are a few links on this not-new story that may provide perspective.

    • sonamib

      The last link on that list was very interesting. It’s really tragic that the floppy disk industry is all but dead, since so much critical equipment apparently relies on them. This quote is spot on:

      But when it comes to mission-critical hardware that literally controls a potential nuclear holocaust, “tried and true” carries more weight than “new and improved.”

      Very few machines are replaced every ten years, and yet the standard for media storage changes a lot faster than that.

  • sean_p

    It actually kind of baffles me that people are more worried about an old computer system with well-known characteristics, bugs, etc., than about something brand new. A newer system is not automatically better or more secure than an older one.

    Also: as someone who works in the defense industry, I can tell you that the costs to verify safety, security, suitability, reliability, maintainability, and availability (especially for a nuclear application) can be so astronomical that it makes sense to retain obsolete systems long past the point where it would otherwise seem advisable to retain them.

    • NonyNony

      I think the real cause for concern is the lack of replacement parts. Honestly, if they could get replacement parts this would be a complete non-issue – there’s no real need to upgrade this stuff to something more advanced. It does the job it needs to do, and that’s sufficient – it isn’t like people need to play Minecraft on these boxes or anything.

      • it isn’t like people need to play Minecraft on these boxes or anything.

        Project Plowshare.

  • Gwen

    Do you really want modernization? Because this is what modernization sounds like —

    “nuclearwar.gov, brought to you by the same people who built healthcare.gov”

    — now THAT’s a terrifying thought!
