Draining The Swamp – An Addiction to Junk Software

The large-scale attack on computer systems, including many British NHS systems, has generated considerable publicity, not least because NHS patients have suffered greatly, with operations cancelled and medical facilities closed. We do not yet know how many have died as a direct or indirect result of the loss of NHS facilities in consequence of the hack. Surprise has been expressed in the media, but this is a situation warned about more than forty years ago, with regular warnings following in the years since. The only real surprise is that this form of attack has not been taking place regularly for many years, but then this is not the first attack, just the first to receive widespread publicity.

A very large part of the huge risk is the result of an addiction to junk software and a desire to adopt new functionality with very little thought for its benefits and threats.

Since the first electronic computers began life in Britain during World War Two, there have been specialists who have regularly warned about the growing risks that electronic data processing, storage, and communications were presenting. In the early days there was security by obscurity, in that very few people even knew about the existence of electronic computers. There was also physical protection of machines that did not communicate outside the buildings in which they operated.

When electronic computing emerged into public visibility, there was still no communication outside the sites on which the machines were located. Data preparation at remote sites produced punched paper and electronic media that were taken by secure transport to the computer and checked before they got anywhere near the machine.

Military and intelligence computers were protected by traditional physical means: security vetting of everyone who would come into authorised contact with the machines, and concentric rings of fences and armed guards. That worked very well, and security breaches were infrequent, spotted quickly and dealt with. Even the first commercial computers were little more vulnerable, but there were still potential dangers that were discussed. For the first time, the warnings started to be ignored.

The mainframe computers that provided most computing capability were built as bespoke systems. The first machines were production systems only in as much as the hardware was a series of standard modules assembled to meet a need and a budget. The software was built for each machine and was, to a greater or lesser extent, unique. Access continued to be restricted. That all began to change significantly as remote communications capability was added and microprocessors enabled computers to be built more cheaply, becoming widely available to businesses and then to private individuals.

Once computers could be built at a price that made them commodity items, two aspects increased risk. The first aspect was that commodity marketing was introduced and very complex systems were sold simply, glossing over most of the reality of the products. Equally, many customers rejoiced in their total lack of technical understanding. The second aspect was that bespoke software creation was no longer practical for most of the computers being built. The idea of standard software packages was good, but the practice was deeply flawed.

Software packages started life with one, or a very small number, of developers working in primitive conditions where any form of planning and documentation was considered a threat to the programmer’s divine right to creativity. Software was frequently written by young people working from a bedroom or garage. Most software died quickly, but some packages became popular and typically went through a series of ownerships. As the owners became large corporations, they introduced some methodologies and documentation but, for maximum profit, only documented later changes and additions, leaving many lines of code that worked without anyone knowing how they worked or exactly what they really did. As the software passed through a series of corporate acquisitions, new methodologies were employed and, again, left most of the code either documented under a previous methodology or completely undocumented. When a user complained about a problem, commercial pressures usually meant some form of temporary fix was adopted, or the problem was deliberately left until the next major version of the product was released. Of course, the next version frequently retained the problems of the previous version and added new ones. That meant that most popular operating systems and applications software were full of holes that were never filled. It created a rich environment for the first ‘hackers’ to roam around in.

The first ‘hackers’ were not hostile forces but specialists with a desire to understand and improve software. That was positive, because those with deep pockets could employ a ‘hacker’ to identify and remove vulnerabilities in the commercial software packages used with their computers. Unfortunately, it also created an environment where ‘crackers’ could roam around without any positive intentions. Eventually, it created a growing army of hostiles, now popularly described as ‘hackers’ by the media, and the rapid growth of the virus-writing community. That process has been under way for more than forty years and was speeded up by the development of the Internet and the dramatic reduction in the cost of computing. Today, crackers can cheaply build the equivalent of a supercomputer by clustering older hardware. That can give them more computing power than those who try to counter them, at a fraction of the cost.

As computers have come to depend on junk software that is full of bugs, the Internet has grown, from its original objectives of more than forty years ago, into a truly global networking environment that can be accessed by anyone from anywhere. It has also developed a commercial component since the early 1990s, as international links took it out of its North American domain. Now that we have networked virtually every computer via the Internet, it has become a vast unregulated environment that is as lawless as the old Wild West and offers, in almost equal measure, good and evil, benefit and great risk. As with the computers themselves, the network contains a mass of old code and known problems that no one has ever had any great inclination to fix. It is a cracker’s paradise.

Now that the cost of computer and network crime is mushrooming, running to many billions of dollars and threatening millions of people’s lives, there is an urgent need to address the enormous risks, but the question is: “Where do we start?” Start we must but, at the same time, we need to stop repeating old mistakes.

Right now, temporary fixes will have to be implemented in the time-honoured manner of the information and communications industry. They will of course fail to solve the problems, but they should reduce the level of threat. Inevitably, some computers will have to be taken off networks so that they can be strongly protected. This has been done in the past for very sensitive government systems, and the most sensitive systems have to be completely separated from any form of networking or remote connection. There is also the option to employ ‘air-gapping’, where a computer or private protected local network produces data that is manually carried to another area and loaded into a computer that connects to external networks, including public networks. That method has proven effective, although it adds cost and slows communication.
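As a rough illustration of the discipline an air gap imposes, data leaving the protected side can be fingerprinted before it is carried across, and the fingerprint checked again before the file is trusted on the connected side. The short sketch below assumes Python and manually carried removable media; the script name and arguments are invented for illustration, not taken from any particular system.

```python
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Usage: python airgap_check.py <file> <expected_digest>
    # The digest is recorded on the protected side; it is re-checked here on
    # the connected side before the carried file is trusted.
    path, expected = sys.argv[1], sys.argv[2]
    if sha256_of(path) != expected.lower():
        sys.exit(f"REJECT: digest mismatch for {path}")
    print(f"OK: {path} matches the digest recorded on the protected side")
```

The point is not the particular tool but the habit: nothing crosses the gap without being checked at both ends.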

There are now many technical measures that can be employed to reduce risk dramatically. There is also the option to change the way computers, associated equipment and software are produced. This has been recommended for decades, but the basic problem is that there are so many junk software addicts that it will take a great deal of effort to break their addiction.

For the discerning few, some companies are taking Linux functionality and building a completely new operating system, using a maths language within the methodology. This becomes a premium product and commands a premium price, until enough companies adopt the same approach and it becomes a commodity product at a commodity price. However, price and cost are two very different things. A high-priced product that works effectively, and prevents crackers from causing the kind of mess that has just been created by one hack, actually costs significantly less than a cheap, popular product that is full of bugs and invites attack by the curious cracker and by the professional, very hostile cracker who is motivated politically or financially.
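The ‘maths language’ approach amounts to formal specification and proof: a property of the code is stated and then proved to hold for every possible input, rather than being sampled by whatever tests someone remembered to write. The toy Lean fragment below is purely illustrative and is not drawn from any real operating-system project.

```lean
-- Purely illustrative: a tiny function together with a proof that a stated
-- property holds for every possible input, not just the inputs a test tried.
def satAdd (cap a b : Nat) : Nat :=
  if a + b ≤ cap then a + b else cap

-- Whatever a and b are, the result never exceeds the agreed cap.
theorem satAdd_le_cap (cap a b : Nat) : satAdd cap a b ≤ cap := by
  unfold satAdd
  split <;> omega
```

Scaling that idea from a three-line function to a whole kernel is expensive, which is why such systems command the premium price described above.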

Junk software addicts are like those addicted to tobacco, alcohol and/or drugs. Originally, they were not addicts. It took cynical, profit-addicted exploiters to offer them addictive products and then keep them hooked. Now that they are addicted, a way of breaking the addiction is required, but it also requires them to understand their bad habits and want to change. The worst addicts are in government and large corporations, and this presents an institutional problem that has to be addressed. The basic challenge is that these super-addicts have a long tradition of believing that they are superior to suppliers and completely in control of the process, when in reality they are easily manipulated and end up buying poor products at grossly inflated prices. Once they realise their mistake, these junkies then plead for more funds to buy a replacement that ends up being at least as useless and even more costly.

So the long-term solution is to drain the swamp: retrain employees to understand and practise procurement efficiently, and fire those government and corporate employees who prove incapable of retraining. Only then will the real solution prove practical.

For those who have difficulty grasping the size of this enormous problem area, the NHS is an excellent example of where and how it has all gone wrong. Any supplier wishing to change a bureaucrat’s mind only has to compliment him or her on a “Courageous Decision”. This is guaranteed to make the bureaucrat abandon the course the supplier wishes to replace. The direct proof of NHS management’s and politicians’ collective failure in procurement is classically demonstrated by the £12 billion spent on a massive new IT system that then had to be abandoned, with yet more money spent to buy out of the contracts. It is no surprise that no one resigned and no one was sacked for this epic failure – but then it was only taxpayers’ money, so no problem, let’s just whine that the NHS is underfunded and carry on wasting megabucks.

Over the coming months a great deal of money will be wasted buying largely ineffective security fixes as the standard knee-jerk reaction of bureaucrats. They will largely ignore the low-cost, highly effective options and repeat all of the old mistakes. As with the infamous Millennium Bug, the rush to spend money for nothing will be overwhelming. New armies of hourly-billing consultants will become very rich.

Somehow, we have to find the will to do the job right. Part of that is to treat procurement as a statement of functional requirement that is then developed into a supply requirement that can be met and implemented, with all risks identified and costed. We can also take practical steps to ensure that there is always a copy of data that cannot be reached by crackers because it is not held on networked computers. We can consider using computers that are not networked where there is no benefit to outweigh the risks. Then we can ask embarrassing questions of suppliers about their IoT products. If we do not want to be spied on in our homes, we should not buy televisions that provide this facility to suppliers and broadcasters. Even those happy to be spied on by suppliers must not forget that a cracker can also use this unannounced functionality, which is common to many TV sets. Unless we take actions like this, it will not be long before a cracker holds a house hostage and sets it on fire by hacking all the smartphone-accessible controls for heating, intruder detection, fire detection and appliances.
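The offline copy mentioned above is the one defence that still works after every networked machine has been compromised, because ransomware cannot encrypt a drive that is locked in a cupboard. The sketch below is a minimal illustration of that routine; the paths and the size check are assumptions made for the example, not a prescription.

```python
import datetime
import pathlib
import shutil
import sys

# Assumed paths, purely for illustration: live data on the networked machine
# and a removable drive that is mounted by hand only while the copy is taken.
SOURCE = pathlib.Path("/srv/records")
REMOVABLE = pathlib.Path("/media/offline")

def tree_size(root: pathlib.Path) -> int:
    """Total size in bytes of all regular files under root."""
    return sum(p.stat().st_size for p in root.rglob("*") if p.is_file())

def take_offline_copy() -> pathlib.Path:
    dest = REMOVABLE / datetime.date.today().isoformat()
    shutil.copytree(SOURCE, dest)  # fails loudly if today's copy already exists
    # A basic sanity check before the drive is detached and locked away;
    # a fuller version would hash every file rather than compare sizes.
    if tree_size(dest) != tree_size(SOURCE):
        sys.exit("REJECT: offline copy does not match the source")
    return dest

if __name__ == "__main__":
    print(f"Offline copy written and checked at {take_offline_copy()}")
```

Once the drive is detached, nothing a cracker does over the network can touch that copy.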

The other elephant in the room is that politicians have failed to bring in legislation to protect privacy and intellectual property, both of which are now vulnerable to data aggregation.

Data aggregation is one of the massive risks that is never talked about and is not understood by the mass of citizens. Hostiles can acquire many pieces of sensitive information and bring them together to provide a picture that makes the individual very vulnerable. This is not something new: governments have used it to provide themselves with information that the law prohibits them from acquiring in other ways without a judge’s warrant, or prohibits altogether. It is legally possible because politicians have failed to introduce legislation to govern it. There may be justification for using data aggregation to detect and monitor a terrorist cell, but even then it should be done in an accountable manner. However, governments also use this approach to acquire information that they can sell to anyone prepared to pay for it.
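A toy example makes the mechanism concrete. Every record below is invented; the point is only that two data sets which look harmless on their own, once joined on a shared quasi-identifier such as postcode and birth year, reveal something neither discloses by itself.

```python
# Invented records: a retailer's loyalty-card list (has names) and an
# "anonymised" clinic attendance release (has no names).
loyalty_card = [
    {"postcode": "EX1 4QJ", "birth_year": 1962, "name": "J. Smith"},
    {"postcode": "YO8 9XW", "birth_year": 1987, "name": "A. Patel"},
]
clinic_visits = [
    {"postcode": "EX1 4QJ", "birth_year": 1962, "clinic": "oncology"},
    {"postcode": "YO8 9XW", "birth_year": 1987, "clinic": "antenatal"},
]

# Joining on the shared quasi-identifier re-attaches names to the
# supposedly anonymous medical records.
for visit in clinic_visits:
    for person in loyalty_card:
        if (person["postcode"], person["birth_year"]) == (
            visit["postcode"],
            visit["birth_year"],
        ):
            print(f'{person["name"]} visited the {visit["clinic"]} clinic')
```

Scale that join up to millions of records drawn from many sources and the picture that makes the individual very vulnerable builds itself.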

So many actions needed and so little time, but that should never be used as an excuse for doing nothing to correct the many past mistakes.