Does IT matter?

First published May 2007 in the Internet Newsletter for Lawyers.

“IT Doesn’t Matter” is the title of a controversial 2003 article in the Harvard Business Review by technology writer Nicholas Carr; his 2004 follow-up book, Does IT Matter?, expands on the theme. The nub of his argument is that IT has become a commodity input and that for most companies there is little strategic advantage to be gained from aggressive IT investment; rather, companies should spend more frugally and think pragmatically. (See his Rough Type blog.)

IT has lost its mystique: it has become normal; it has even started to disappear. Let’s start at the beginning and see how this came about and what this means to us now in practical terms.

What is “technology”?

Technologist Bran Ferren memorably defined technology as “stuff that doesn’t work yet”. That’s not just a clever definition: it is particularly apt in helping us understand what has happened to “technology” and how much it now matters.

Time was when the wheel was technology. But as the wheel was refined and its use proliferated, a wheel came to be regarded as … well … just a wheel: the technology disappeared.

The I(T) revolution

What of IT? By information technology (or, more fully, information and communications technology – ICT) we mean all the technology, both hardware and software, used to store, process and transport information in digital form. The invention of the microprocessor in 1971 marked the beginning of what we now call the information revolution. (Note, even here, that the word “technology” is absent.) The advent of the personal computer, bringing computing power first to office and then to home desktops at affordable prices, led to rapid development of information technologies. They were technologies because they didn’t (quite) work. To use the hardware and software effectively we needed a degree of technological competence and a dogged persistence with trial and error, thick paper manuals and training courses. To get things to work and to “talk” to each other we had, if we were lucky, IT departments that would deign to help us in our hours of need (and there were quite a lot of those). “IT” (both the concept and the department) assumed an importance that some of us feared and many of us resented.

By the early 1990s we had sophisticated operating systems and office applications that did work rather well. The vast majority of workers used Windows and predominantly two applications – word processing and spreadsheets – which had matured and standardized (thanks, or not, to Bill Gates) to the point where the technology had all but disappeared. But beyond those areas, numerous proprietary systems vied for our attention, and communications (the C in ICT) remained a headache. Technology still mattered. The internet has changed all that.

The new business infrastructure

The internet has played a critical role in accelerating the commoditization of IT, encouraging standardization and, in many cases, increasing the penalties of using proprietary, closed systems. At the same time, by facilitating effortless and instantaneous communication in globally standardized ways, the internet has placed zero distance between everyone and everything. The network has become the computer.

Recent developments demonstrate just how profoundly the internet is transforming business. Just as the great infrastructure technologies of the past – the rail, shipping and telegraph networks in the 19th century and the electricity and highway networks in the 20th – brought not just operating efficiencies but broad changes in the market, so the internet in the 21st century is transforming and creating businesses and markets. Lawyers are not immune from this!

Of course IT matters. But it is only by becoming a shared and standardized infrastructure that it can deliver its greatest benefits. Ultimately it may disappear as we simply plug into “the grid”.