The evolution of the personal computer market
Apple gets it right
The first real personal computer was the Apple II, created by an engineering genius called Steve Wozniak and a marketing genius called Steve Jobs. Initially working out of his garage, in 1977 Wozniak produced a ready-assembled computer with colour graphics, a major step forward at the time. But while sales were good in the hobbyist market, you couldn’t do anything very useful with the Apple II, and it struggled to break out of the nerd market.
The Apple II, like all computers of its time, did not fit the selection pressures of the mainstream, and it might have sunk without trace without the accidental intervention of an MBA student called Dan Bricklin.
Dan had been told by his professors about the way that production in factories was planned using large blackboards that were divided into grids, with one cell containing the number of widgets, one the number of grommets, one the number of workers needed to assemble them (and so on) and the final cell containing the number of finished goods. When the factory manager wanted to change one cell, he would have to erase and update all the dependent cells as well, but he could see his production schedule spread out in front of him on a single blackboard. Yep, it’s a stone-age spreadsheet. Bricklin produced an electronic version of the blackboards for the Apple II, and VisiCalc became the computing industry’s first ‘killer app’.
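The blackboard mechanism described above is, at heart, what every spreadsheet still does: some cells hold numbers, others hold formulas over named cells, and changing one input triggers a recalculation of every dependent cell. A minimal sketch in Python (the cell names and formulas here are invented for illustration, not taken from VisiCalc):

```python
# Input cells: the numbers the factory manager chalks up directly.
cells = {
    "widgets": 100,
    "grommets": 40,
    "workers": 8,
}

# Formula cells: each recomputes from the current values of other cells.
# Listed in dependency order, so later formulas may use earlier results.
formulas = {
    "kits": lambda c: min(c["widgets"], c["grommets"]),
    "finished_goods": lambda c: min(c["kits"], c["workers"] * 10),
}

def recalculate(cells, formulas):
    """Re-evaluate every formula cell in order -- the electronic
    version of erasing and rewriting the dependent blackboard cells."""
    for name, formula in formulas.items():
        cells[name] = formula(cells)
    return cells

recalculate(cells, formulas)
print(cells["finished_goods"])  # 40: limited by the grommet supply

# Change one input; a single recalculation updates everything downstream.
cells["grommets"] = 120
recalculate(cells, formulas)
print(cells["finished_goods"])  # 80: now limited by the workforce
```

A real spreadsheet builds the dependency order automatically from the formulas rather than relying on a hand-written ordering, but the principle is the same.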
Armed with VisiCalc, Jobs now started marketing the Apple II as a serious business tool. Thousands of middle managers, tired of supplicating the IT priesthood in an attempt to get access to their own data, took their Apples into work so that they could create simple business models on their desks using VisiCalc. While Bricklin’s spreadsheet was rather clunky by today’s standards it was the tool that got these personal computers (or, ‘lot-less-impersonal-than-a-mainframe computers’, as they would be more accurately described) into our lives. By using memes that appealed to the mainstream, Apple made embarrassingly large amounts of money, and the microcomputer industry was born.
Apple screws it up
In the face of emerging competition (the Commodore PET) and rumours of an IBM microcomputer, the boys from Apple rushed out a new model. The Apple III was launched in 1980 and Jobs went straight for the business market – the wrong side of the chasm for an immature product. Not only was the Apple III several times the price of its predecessor, the build quality was so awful that the company sent out a technical bulletin telling customers to drop broken machines from ‘about 12 to 18 inches onto a hard surface’ in an attempt to re-seat the chips on the motherboard. The business market is unforgiving of this kind of sloppiness and the machine tanked.
The Apple III was thankfully euthanised by the launch (or rather, the announcement that there was to be a launch) of the IBM PC. IBM had a stodgy-but-serious brand that was popular with both IT and purchasing stakeholders, and the business market decided to wait to see what the IBM PC was like. In fact, it was pretty good – well designed, thoroughly tested and not too sophisticated. As a result, IBM was able to borrow the Apple II’s ‘small but useful’ memes and conflate them with the IBM ‘quality’ and ‘serious business’ memes.
IBM screws it up
Of course IBM, used to design and production cycles lasting years, soon lost technical and cost leadership of the industry to the clone makers like Compaq and Toshiba, who produced compatible machines faster and cheaper than IBM. Within five years, PCs were everywhere, and the IT world was divided into mainframe systems and desktop systems, with Apple out of the market. Since PCs were becoming cheaper than the dumb green-screen terminals that used to hang off its mainframes, IBM had to do something else to protect its revenues. Their response was the PS/2, which had a much more advanced architecture and (more importantly to IBM) a proprietary operating system. If they had aimed it at the nerds it might have succeeded, but instead they repeated Apple’s mistake and aggressively marketed this new product as a business machine when it really wasn’t ready. This machine created the standard for a lot of the ports on the back of your computer but was pretty much ignored by everyone except IBM.
Apple gets it right again
And what of Apple? It’s a matter of debate whether Apple is consistently brilliant or just consistently lucky, but Jobs and Wozniak changed the market again. They produced a machine called the Lisa in 1983, based on memes that Xerox had designed a decade earlier and had failed to promote. Apple’s Lisa was the first mass-produced computer to be driven by pictures and windows (instead of lines of text), it was the first desktop computer with a mouse, and it was the first desktop computer able to run more than one program at once. It was even portable, if you had a wheelbarrow. The Lisa would have been unbeatable except that it cost far too much (again) and ran like a pig on stilts: it was both a technical milestone and a commercial failure – but the errors made in the Lisa design were corrected in the Apple Macintosh, a machine at least a decade ahead of its time when it was released in 1984.
The Macintosh worked, it was faster and cheaper than the Lisa, and it spread like wildfire. The business community were able to produce on paper exactly what they saw on the screen, thanks to the invention of the laser printer and a font description language called PostScript. This ‘what you see is what you get’ (WYSIWYG) approach was vastly attractive to business users struggling with DOS-based systems. If you were born after about 1970, you’ll just have to trust me when I tell you that the Mac’s WYSIWYG word processor and graphics package were a revolution in personal computing. Apple rightly saw this as a market-changer and launched big – they took all of the advertising pages in one edition of Newsweek, and played on the fear and loathing that most people felt for their IT providers with the iconic and memetic ‘Big Brother’ advert during the 1984 Super Bowl.
Apple screws it up again
But the Macintosh was still expensive, it looked like a toaster rather than a serious tool, and it couldn’t run the established base of DOS applications. With a squarer box, volume discounts and sober marketing it might have wiped the IBM clones out. But it didn’t – Apple aimed it at both sides of the chasm at once, marketing it for quirky individualists but pricing it for the business market. It gained a dominant share in the graphics, publishing and arts industries where WYSIWYG really mattered and, arguably, where the quirky individualists tend to group.
However, Apple’s easy-to-use graphical interface was a real threat to Microsoft, and the boys from Seattle experimented with Mac-killers for several years before releasing, in 1990, a look-alike for the PC called Windows 3.0. While really just a pretty loader for DOS applications, it ran on the cheap PCs that the mainstream used. ‘Cheap’ is a core selection criterion for the home market, and this gave Microsoft an overwhelming market dominance (more than 90%) that has continued into Windows 95, 98, XP, Vista, Windows 7, 8.n and 10. Some of these operating systems have been awful, others merely bad.
Thanks to technical compromises in the design of Windows, memes like ‘spyware’, ‘worm’, ‘Trojan horse’, ‘crash’, ‘reboot’, ‘CTRL-ALT-DEL’ and ‘virus’ have entered all our vocabularies.
But is there an alternative to these systems? A lot of people – admittedly people you wouldn’t want to invite round to dinner – rave about systems such as Unix and Linux. Thirty years of testing and a unique (and memetically transmitted) not-for-profit development method mean that these old minicomputer operating systems are very, very stable: applications may crash, but the operating system almost never does. Linux also costs next-to-nothing, it’s efficient, and it is relatively secure. Unfortunately it is not amateur-friendly and it has never made it into the mainstream market except as a platform for servers, where the high skill levels of the users mean that the horrid interface doesn’t matter.
However, if you buy a modern Macintosh you will be forking out a great deal of money for an elegant front-end (called OS X) that sits on top of Unix. As a result, you get an operating environment that is fast, elegant, easy to use, never falls over, and never gets viruses. But the superior quality of the operating system and the elegance of Apple’s hardware come at a considerable price, and Apple have still made little headway in the business market, where price is a primary selection criterion.
Apple may have invented the market for MP3 players, and reinvented the market for phones and tablet computers, but they blew it when it came to the PC market. Everyone (at least in the Western world) who was going to buy a PC has now bought one, and a large majority of those are Windows machines. We are no longer in a game where virulence matters, but one where sales numbers are dominated by switching percentages.
The PC manufacturers are spending a great deal of marketing money trying to effect tiny changes in market share for their completely undifferentiated products. Those who use Unix-based systems (including OS X) are so loyal to them that this almost represents a trapping state, and the evangelism meme seemingly packaged into every Macintosh suggests that they will eventually win out. But Windows is so dominant, and the product lifecycle in this market is so short, that ‘eventually’ is too late.
So what is the future for the personal computing market? I’m not sure, as Apple’s combination of luck and genius should never be underestimated. However, the techno-ecological niche for the PC is steadily shrinking, and I suspect that the battle has already been lost to the convergence of tablets and telephones… but that, dear reader, is a story for another day.