Wednesday, 17 April 2013

Will There Be An Industrial Revolution In Computing?

The year 1991 saw the successful conclusion of one of the most remarkable enterprises in the history of invention. A team of engineers at the London Science Museum managed to complete the construction of a cogwheel computer that had first been designed, but never built, 170 years earlier by the Victorian scientist and mathematician, Charles Babbage (1791-1871).

During his lifetime, Babbage insisted that his cogwheel brain, which he named the Difference Engine, was capable of being constructed and would have worked if it had been. His efforts to build it, however, met largely with ridicule, and Babbage died a disappointed man in 1871.

Why couldn’t Babbage build a Difference Engine in his lifetime? Today, with the benefit of hindsight, we know that the biggest problem which stymied him was the lack of a precision metal industry. It wasn’t that the overall design of his machine was faulty, or that cogwheels can’t be computer components (they can, though there’s no denying that electronic components do the job a whole lot better).

Modern Build Of Babbage's Difference Engine
The successful modern initiative to build a Difference Engine deliberately made use of components that were no more precisely engineered than Babbage himself could have produced for individual cogwheels back in the nineteenth century. However, the 4,000 components - most of them cogwheels needing to be fashioned to be as much alike as possible - were manufactured for the modern build using modern industrial manufacturing processes, which ensured consistently high-quality components to a standard pattern. The success was the result of the difference between the small-scale ‘craftsman’ approach to building precision components that Babbage was obliged to adopt, and the modern ‘industrial’ method used by the Science Museum’s engineers.

Charles Babbage
Today, despite his ultimate failure, Charles Babbage is regarded as the father of computing. His reputation continues to increase in direct proportion to the increasing importance of the computer in our high-tech age. It was somehow highly appropriate that in the year 2000, Microsoft’s former technology director, Nathan Myhrvold, took personal delivery of a full-scale modern-built Difference Engine. This was the second such machine ever built. Myhrvold had sponsored the construction of a second machine so that it could become the ultimate conversation piece in his home.

In the arcane world of cogwheel computing, the 170-year battle between the craft approach and the industrial approach to making components for Babbage Engines was decisively won by the industrial approach. Today, in the enormously important and influential business of computer software development - a business which, in a very real sense, has made the modern world possible - the craftsman-like way of doing things has prevailed for a long time. But the day of the craftsman-like approach may be coming to an end, to be replaced by an industrial way of designing software that is faster, less risky, less expensive and more efficient.
Spinning Mule At Beginning Of Industrial Revolution

The whole point of an ‘industry’ is that it brings a concerted and methodical approach to the task of creating something or putting raw materials through a particular process to achieve a final result. A key definition of ‘industry’ in the Shorter Oxford English Dictionary is ‘systematic work or labour’.

Cogwheels in Babbage's Difference Engine
But in so many cases, software development is not systematic. Instead, it is a strangely haphazard matter, involving methods, approaches and outputs that bear far more resemblance to the brilliant but ill-equipped genius Babbage painstakingly struggling with a few hired craftsmen to manufacture precisely-made cogwheels, than anything resembling a modern industry. No wonder the track record of success of modern software development is so, well, unsuccessful.

But what about people who might say: ‘All right, anybody can criticise the modern software development business, but the fact of the matter is that enough projects have been completed, and have delivered enough of the functionality that was required, for the world we live in to be an enormously different place from what it was even just twenty years ago, let alone in Babbage’s time’?

This is, admittedly, a fair point, and certainly there is no denying that despite the shortcomings of the software development business, the products of software development obviously do provide significant value to the organisations that commission them, and the users who benefit from them.

Even as I write, software is being used around the world to control, monitor and make safe literally millions of processes that not even such a far-sighted technological prophet as Babbage could have come close to foreseeing. Drilling taking place ten miles below the ground, civilian and military aircraft flying ten miles above the ground or much higher, and all the myriad intricate visible and invisible processes that make the modern world what it is, are all enormously dependent on the power, precision and reliability of computer software.

But does this mean that we should be complacent about the effectiveness of modern software development techniques? Certainly not. Instead, all it means is that despite the ever-present practical, logistical and financial problems relating to software development today, computer software remains so much valued that, by and large, the organisations paying for software development projects are willing to suffer large risks and losses in order to obtain the benefits which software development can offer them.

Ten or even fifteen years ago, the risk factors attached to software development were much the same as they are today. Indeed, I remember that back in the early 1990s the figure of 16 percent of problem-free software development projects was bandied about much as it is nowadays.

Can things ever get better?

The general sense of dissatisfaction with the way software is usually developed has been felt keenly since at least the mid-1980s. From time to time there have been false dawns of new ways of doing things; new ways that were supposed to constitute revolutionary improvements in how software was designed. However, in most cases the false dawns barely survived to lunchtime.

Object-orientated programming (OOP), for example, was supposed to introduce a new way of managing software development by basing programs around ‘objects’ - bundles of features associated with a particular application - that could in principle be re-used as and when required to facilitate rapid software development. But whilst OOP is widely used, it didn’t fulfil its wider promise; it was more a question of oops!, because developers didn’t find the objects as helpful as their original authors had hoped. Most developers concluded that the level of modification and customisation necessary was simply too great for the objects to be useful, so they wound up building their software from scratch instead, albeit using OOP techniques.
A Reusable Class In UML
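The problem can be illustrated with a sketch (the class names are hypothetical, and Python is used purely for brevity): a ‘reusable’ object tends to offer generic behaviour that a real project has to override so heavily that little of the original is actually re-used.

```python
class Account:
    """A 'reusable' banking object, of the kind OOP reuse promised."""
    def __init__(self, balance=0.0):
        self.balance = balance

    def withdraw(self, amount):
        self.balance -= amount


class UKCurrentAccount(Account):
    """A real project's version: overdraft limits, local rules and so on
    mean the key behaviour ends up rewritten rather than re-used."""
    def __init__(self, balance=0.0, overdraft_limit=500.0):
        super().__init__(balance)
        self.overdraft_limit = overdraft_limit

    def withdraw(self, amount):
        # The generic withdraw() is effectively replaced, not re-used
        if self.balance - amount < -self.overdraft_limit:
            raise ValueError("overdraft limit exceeded")
        self.balance -= amount


acct = UKCurrentAccount(balance=100.0)
acct.withdraw(300.0)   # allowed: stays within the overdraft limit
print(acct.balance)    # -200.0
```

The subclass keeps the OOP machinery but replaces the substance, which is exactly the pattern most developers reported.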

More progress has been made in recent years with new kinds of tools which, rather than seeking to revolutionise software development, aim instead to make developers more productive. A particularly useful tool here has been the application development framework (ADF). These are widely available software packages that facilitate rapid customisation of software for a particular application to the precise requirements of the organisation in question. But even ADFs are really still a set of generic building blocks, designed to be used across a wide variety of applications.

Today, the reason why software development is such a laborious, risky and fraught matter is that the way in which software is designed means that the opportunity to re-use code (i.e. pre-written programs) is still extremely limited. Software development mainly takes place using generic programming languages that render every project essentially a bespoke one. There has been little infrastructure, and there have been few markets, to encourage developers to supply potentially re-usable programming components. But maybe re-use of pre-packaged functionality wasn’t the way to do it after all, and a different perspective was required.

That different perspective was hoped to be found in a concept known as the domain-specific language (DSL). A DSL is a special kind of language designed to be used for a particular application or solution. It is not a new concept to computer scientists, but it is one that has only really caught on over the past ten years.

As one might imagine, a DSL will be all the more useful the more tightly focused the particular application or solution to which it caters, because this will tend to increase the likelihood that another programmer, building a solution to a similar problem, will be able to use the same features of the language to specify what it is he or she wants the computer to do.

In fact, some DSLs have relatively wide domain remits (e.g. retail banking) while others have narrow remits (e.g. operation of a retail banking ATM network). Incidentally, Microsoft has recently developed a special kind of DSL whose particular domain is helping developers to design other DSLs! It is now possible for organisations (or their suppliers) to define languages aimed specifically at providing solutions for problems within their business domain. I doubt even Babbage would have thought of that.
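A flavour of the idea can be given with a toy ‘internal’ DSL embedded in Python, covering the narrow ATM domain mentioned above. Every name here is invented for illustration; real DSLs in this space are often external languages with their own parsers and tooling.

```python
class AtmRules:
    """A toy internal DSL for expressing ATM withdrawal rules."""
    def __init__(self):
        self._max = 0
        self._multiple = 1

    def max_withdrawal(self, amount):
        self._max = amount
        return self   # returning self lets rules chain, reading like a sentence

    def dispense_in_multiples_of(self, note):
        self._multiple = note
        return self

    def allows(self, amount):
        return amount <= self._max and amount % self._multiple == 0


# The 'program' reads close to the domain expert's own vocabulary:
rules = AtmRules().max_withdrawal(300).dispense_in_multiples_of(20)
print(rules.allows(260))   # True
print(rules.allows(250))   # False: not a multiple of 20
```

The point is that another programmer solving a similar ATM problem could express his or her requirements in the same domain vocabulary, rather than in generic code.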

DSLs create the possibility of a software ‘factory’. What is a software factory? A software factory is not some form of business or a special type of organisation. In essence, it is a set of tools - a development environment, if you will - that is optimised for building solutions in a particular business domain. Hence, a software factory could be used by a traditional software supplier (assuming it adopts the new tools and techniques) or by an organisation’s own internal IT department developing applications for that organisation.

A software factory provides solutions not through using pre-written code or following best practice for a particular type of solution (although these can be part of it), but through the use of a programming language - a DSL - designed specifically for expressing solutions to particular types of problems.

Software factories are most emphatically not about selling software packages that provide a one-size-fits-all solution to a particular problem. Instead, software factories acknowledge that everyone may want a different type of product but, to use an analogy, a factory optimised to build motor cars is not suited to building houses.
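One way to sketch how a factory differs from a one-size-fits-all package (all names and figures here are illustrative, not taken from any real product) is as a generator that turns a declarative domain description - the kind of thing a DSL would capture - into a working piece of software:

```python
def loan_payment_factory(spec):
    """Build a monthly-repayment calculator from a loan-product description."""
    monthly_rate = spec["annual_rate"] / 12
    n_payments = spec["term_years"] * 12

    def monthly_payment(principal):
        # Standard annuity formula for a repayment loan
        return principal * monthly_rate / (1 - (1 + monthly_rate) ** -n_payments)

    return monthly_payment


# Two different 'products' from the same factory: individual solutions,
# but produced by one common, systematic process.
standard = loan_payment_factory({"annual_rate": 0.06, "term_years": 25})
short_term = loan_payment_factory({"annual_rate": 0.05, "term_years": 10})

print(round(standard(100_000), 2))
print(round(short_term(100_000), 2))
```

Each caller gets a bespoke product, yet nothing is hand-crafted: the variation lives entirely in the domain description, not in the code.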

1871 Machine Enabling Reproducible Screws
Despite allowing individual solutions to be provided, the software factory offers reliable ways to develop applications rapidly, relatively inexpensively and with lower risk than is seen with generic programming languages and tools. One can make a direct comparison here with the introduction, in the mid-nineteenth century, of standardised and precisely-milled mechanical components by the great mechanical engineer Joseph Whitworth, whose components and machine tools won international renown for their supreme accuracy in a repeatable format. The impact was felt not only in Whitworth becoming the sole supplier of these items, but also in his gospel of precision encouraging everyone to think about solving particular problems using tools built to common standards.

The ultimate aim for software factories is to remove much of the small-scale ‘crafts’ element from routine software development and confine such craftsmanship to the really sharp end of the most demanding software development initiatives. This will allow software development to become less expensive, lower risk and far more reliable while still giving organisations all the scope they need to establish and maintain a competitive advantage from their software.

A software factory, in other words, would offer the best of all possible worlds. However, it is a solution that is still struggling to find its place. I suspect we are yet to see the full potential of software factories realised. Or maybe they are impractical for other reasons: why learn to work in a specific factory when you could have a generic skill that can be taken to any employer?

I think what will happen over time is that this approach (or something similar) will begin to attract more business-oriented people, and they will start to undertake work in their business domain that was previously only undertaken by specialist programmers. Maybe we will start to see a wider acceptance that, with the right tools, anyone can be creative when developing computer systems.