The mid-1990s were an economic boom time for the aging population of programmers versed in what many people classed as now-obsolete COBOL programs written in the 1960s and 1970s. These computer cold-war warriors were rallied into action as the dreaded year 2000 approached. It was feared that on the stroke of midnight on December 31, 1999, the world was going to experience a meltdown of catastrophic proportions. The financial world would be crippled, and major infrastructure such as electricity, gas, and telephone services might cease to function. This was modern man’s equivalent of the Mayan calendar’s dire prediction about December 21, 2012. We named the problem the Y2K bug. But the problem was not a software ‘bug’; it would be more accurate to say that it was an intentional design choice, made in the 1960s and 1970s, concerning the way that date information was stored by computers.
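That design choice was to store only the last two digits of the year to save precious memory. A minimal sketch (in Python rather than COBOL, with hypothetical names) shows why the arithmetic breaks at the century rollover:

```python
# Years stored as two digits, as was common in 1960s-70s systems:
# the century is implied, so "99" means 1999 and "00" means 2000,
# but simple subtraction still assumes both fall in the same century.

def years_elapsed(start_yy, current_yy):
    """Elapsed years when only the last two digits of each year are kept."""
    return current_yy - start_yy

# An account opened in 1985, checked in 1999: works as expected.
print(years_elapsed(85, 99))   # 14

# The same account checked in 2000 (stored as 00): the result goes negative.
print(years_elapsed(85, 0))    # -85
```

Fixing this meant either widening every stored date to four digits or "windowing" the two-digit values onto the correct century, which is what kept those COBOL veterans so busy.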

Well, here we are, ten years after the Armageddon that wasn’t, and humanity survived the clock tick into the year 2000. There was no meltdown, other than the financial impact on the companies fixing the problem. On the plus side, it provided a boon to many retirement communities that welcomed the well-fed, well-heeled, and newly retired COBOL programmers into their fold.

But Y2K showed us just how dependent we are on the computer.

Computer hardware is amazingly reliable. One of my favorite sayings is that if a computer is going to break, it will do so in its first week of operation. If it lasts a week, it will last decades!

A much greater issue lies in the software that controls whatever device you have. There are very few programs that do not have a bug or two lurking in them. Without question, the most reliable software on the planet is not on the planet but in space. NASA goes to great lengths to ensure problem-free programs, but even they occasionally find a bug. An unforeseen set of circumstances arises, and a program crashes. It is pretty impractical to fly to Mars to hit the Reset button on a rover, so they actually design for this situation.
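One common form that "designing for this situation" takes is a watchdog timer: if the main program stops checking in, a recovery routine runs automatically instead of waiting for a human to press Reset. A hypothetical sketch (all names are my own, not NASA's code):

```python
import threading
import time

class Watchdog:
    """Sketch of a watchdog timer: the control loop must 'pet' the
    watchdog regularly; if it goes quiet past the timeout, the
    recovery callback fires on its own."""

    def __init__(self, timeout, on_expire):
        self.timeout = timeout      # seconds of silence before recovery
        self.on_expire = on_expire  # what to do when the system hangs
        self._timer = None

    def pet(self):
        # Cancel any pending recovery and restart the countdown.
        if self._timer:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout, self.on_expire)
        self._timer.daemon = True
        self._timer.start()

    def stop(self):
        if self._timer:
            self._timer.cancel()

events = []
dog = Watchdog(0.2, lambda: events.append("recovery"))
dog.pet()
time.sleep(0.5)   # simulate the control loop hanging
print(events)     # the watchdog fired: ['recovery']
dog.stop()
```

Real flight software uses hardware watchdogs rather than library timers, but the principle is the same: assume the crash will happen and plan the reboot in advance.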

At the other end of the reliability scale is the lowly PC that sits on your desk. How many times has your computer ‘frozen’ or crashed, showing the ‘Blue Screen of Death’? We view this as an irritation, reboot, and go back to playing Farmville, or whatever we were doing.

Far more sinister is the potential for a crash in a computer that runs a vital service. No finer example of this can be found than in the vast recall of cars built by Toyota. Farmville freezing is irksome; your car suddenly accelerating while the brakes fail is a potentially deadly situation.

It is interesting that while federal requirements force manufacturers to meet physical safety standards, the software is largely unregulated.

My wife bought a new TV for the bedroom. It works great 99.9% of the time, but very occasionally when we switch it on, the channel information shows us watching channel 202.00 rather than 202. It is a bug in the programming. It is not dangerous, it is not life-threatening, but it serves to show that even the computer program handling the simple task of controlling a TV is flawed.
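I can only guess at the cause, but this kind of glitch is typical of what happens when a whole number is handled as a floating-point value somewhere and then formatted naively. A purely illustrative sketch (the function names are mine, not the TV firmware's):

```python
# Hypothetical illustration of a "202.00 instead of 202" display bug:
# the channel number passes through a float and gets fixed-point formatting.

def show_channel_buggy(raw):
    channel = float(raw)       # stored as a float instead of an integer
    return f"{channel:.2f}"    # naive formatting keeps two decimal places

def show_channel_fixed(raw):
    return str(int(float(raw)))  # normalize back to a whole channel number

print(show_channel_buggy("202"))   # 202.00  -- the on-screen glitch
print(show_channel_fixed("202"))   # 202
```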

I am by no means a Luddite; I see the enormous benefits that the computer brings to the table, and it has revolutionized many products small and large. However, I also see the downside of flawed software in mission-critical applications.

Should there be regulation of software involved in ‘mission critical’ applications?

Simon Barrett
