The history of the personal computer – sixty years of desktop progress

Gareth Halfacree

Sunday 13 July 2014

Personal computers have reached a ubiquity, at home and in business alike, that could barely be imagined just a decade or two ago. Between laptops, desktops and smartphones, it’s now common for a single person to own several powerful devices. That wasn’t always the case, and the personal-computing revolution is an excellent case study in disruptive technology.

The birth of modern computing

Like many inventions whose time has come, the personal computer has no one direct inventor. Rather, the concept came about as the result of work done by numerous individuals, starting with the world’s first electronic general-purpose computer, ENIAC. Built for the US military in 1946, ENIAC took up over 160 square metres of space and required 150 kW of power to calculate artillery firing tables.

The same technology soon drew commercial interest, too. Catering company J. Lyons and Co. was the first business to put an electronic computer to commercial use, building a machine modelled on Cambridge University’s EDSAC and dubbed the Lyons Electronic Office, or LEO. Running its first business job in 1951, LEO was – like EDSAC, and ENIAC before it – anything but personal: the single system served the whole company, handling everything from valuation and payroll through to inventory and delivery schedules.

The first salvo

It would take another three decades before the personal computer could really get off the ground.

“Several things all came together in the US in 1977,” explains Kevin Murrell, trustee of The National Museum of Computing and author of Early Home Computers. “Affordable and available 8-bit microprocessors, and sufficient memory to run high-level languages. I would lump together three important machines – the Commodore PET, RadioShack TRS-80 and Apple II – as leading the real personal-computer revolution.”

These machines all had something in common: not only were they compact enough to sit on a desk, they were affordable. Where a business would previously have a single computer that all employees had to share, it became possible for each employee to have their own machine. For those with enough disposable income, buying a computer for home use came within reach for the first time. It didn’t take long for the industry incumbents to notice, either: IBM famously popularised the term ‘personal computer’ with the 1981 launch of the IBM Personal Computer model 5150, a direct response to the success of companies like Apple, Commodore and RadioShack.

The home-computing revolution

It was in the UK, however, that home use of personal computers really took off. The launch of the sub-£100 ZX80 by Cambridge-based Sinclair made computing truly affordable, and the £69.95 ZX81 enticed still more users. A former Sinclair employee, Chris Curry, went on to co-found Acorn Computers to compete with his former paymaster, famously winning the contract to develop a machine for the BBC’s Computer Literacy Project – the BBC Micro.

For many growing up in the ’80s, these machines were purchased for one thing: games. Cheap colour systems that connected to the home TV could play games sold on tape for pocket-money prices. For some, however, curiosity about how those games worked would lead to a radical new career choice.

“It was amazing to me that you could buy a game and then somehow influence how it worked,” recalls Tony Evans, technical lead at a multinational technology and consulting corporation, whose interest in the topic was ignited by the Sinclair ZX Spectrum.

“Programming with a tape drive and pencil taught me a lot about patience and perseverance, but also gave me that buzz you get when you see something you made working,” says Andy Clark, senior technical consultant at Allocate Software. “Without that experience, I would have likely gone into mechanical engineering rather than moving into IT.”

“If I’d encountered computers later in life, I’d probably have found them entertaining,” agrees Adam Short, a software developer of a similar vintage. “But I doubt I would have wanted to make my living with them.”

The industry-standard architecture

Home computers of the ’80s typically suffered from incompatibility with one another, and a market crash in the mid-’80s left many companies, Sinclair and Acorn included, fighting for survival. The market was ripe for newcomers.

The launch of the IBM Personal Computer in 1981 changed the home- and business-computing landscape forever, but in a way IBM had never intended. The machine was built largely from off-the-shelf components; its key proprietary part was the Basic Input/Output System (BIOS), firmware that acted as an interface between the hardware and the software running on it.

Rival companies soon cloned this BIOS, some through clean-room reverse engineering and others – who were swiftly sued – by copying the source code IBM had included in its technical documentation. This allowed the creation of so-called “IBM compatibles”, or clones: machines with higher specifications or lower retail prices than IBM’s own Personal Computer that nevertheless ran its software.
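To see why a compatible BIOS mattered, consider the following sketch in C – a hypothetical illustration with invented names, not IBM’s actual firmware interface. Software written against a fixed table of services never touches the hardware directly, so any vendor supplying a working implementation of the same table can run the same programs:

    #include <stdio.h>

    /* Hypothetical sketch of a BIOS-style service table: applications
     * call these routines rather than driving the hardware directly. */
    struct bios_services {
        void (*put_char)(char c);   /* write one character to the display */
    };

    /* One vendor's machine-specific implementation; a real BIOS would
     * program the video hardware here, but stdio stands in for it. */
    static void clone_put_char(char c) { putchar(c); }

    static struct bios_services clone_bios = { clone_put_char };

    /* An application written purely against the service table. */
    static void print_string(const struct bios_services *bios, const char *s) {
        while (*s)
            bios->put_char(*s++);
    }

    int main(void) {
        /* Swap in any compatible implementation; the program is unchanged. */
        print_string(&clone_bios, "Hello from a clone!\n");
        return 0;
    }

Clean-room cloning worked to the same end: one team documented what each IBM service did, and a second team that had never seen IBM’s code reimplemented the behaviour from that specification alone.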

IBM attempted to fight back by introducing proprietary technologies such as Micro Channel Architecture (MCA), but the clones won out, and the original PC’s expansion bus – retroactively dubbed the Industry Standard Architecture (ISA) – remained the norm.

Today, we take it for granted that any given piece of hardware or software will work with any given personal computer – with a few notable exceptions. That ecosystem can be traced directly back to the clones, and while the phrase “IBM compatible” has fallen out of favour, few can deny that the legacy of the IBM PC can still be felt – right down to the system BIOS and its nomenclature.