A History of Innovation, With a Few Blind Spots

Walter Isaacson, at his home in Washington, has written a book on how collaboration brought about advances in technology. Credit: Vanessa Vick for The New York Times

Walter Isaacson has written a highly engaging and wildly ambitious history of “the digital revolution.” The author’s book, “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution” (Simon & Schuster), covers more than 200 years of engineering and programming breakthroughs, starting with Joseph-Marie Jacquard’s use of punch cards in 1801 to create the first automated loom. The more recent parallel developments in computing power and computer networks spurred the actual revolution of Mr. Isaacson’s title. By chronicling the core building blocks that made these advances possible, “The Innovators” hopes not only to remind us of our debts, but also to teach us about the nature of innovation itself.

It is a necessary story, well told. Mr. Isaacson has a journalist’s eye for the personal and interpersonal dynamics that propelled, and impeded, the protagonists. A successful innovation typically requires vision, execution and marketing. Strengths in these domains rarely reside in a single individual.

His history has a wide range of personality types relating to these various skill sets, but what many have in common is a belief in the primacy of their own contribution. The result is that “The Innovators” is filled with tales of perceived slights and betrayals, long-held grudges and, of course, litigation.

The story of John Mauchly, the flamboyant professor who led the development of the first modern computer, ENIAC, at Penn’s Moore School in 1945, is indicative. In the 1970s, an otherwise obscure Iowa State scientist succeeded in challenging the ENIAC patent. An embittered former professor, the scientist convinced a judge that Mauchly had stolen crucial concepts during a weekend visit to his Iowa home in 1941. When it came to the successor to ENIAC, however, it was Mauchly’s turn to cry foul. The Princeton mathematician John von Neumann consulted on the project but was accused of “stealing ideas and trying to pretend that work done at the Moore School was work he had done.”

Matters have only gotten worse since. Mr. Isaacson notes the 2011 milestone reached when Apple and Google “spent more on lawsuits and payments” for patents than on new product development.


Given this track record of bad feelings, Mr. Isaacson’s big finding on the subject of innovation comes as something of a surprise: “First and foremost,” he assures us, “creativity is a collaborative process.”

Mr. Isaacson is fond of the touchy-feely term “collaborative innovation” to describe how technological advances are typically achieved. “Innovation,” he insists, “comes from teams more often than from the light bulb moments of lone genius.” And to be sure, in these pages important discoveries generally do seem to involve a synthesis of disparate insights that had not previously been connected.

The problem with his theory, however, is that his narrative is actually replete with light bulb moments. These are often experienced in the shower or while asleep, but never, that I can recall, in one of the team “brainstorming” sessions promoted by Mr. Isaacson. Furthermore, based on the evidence here, the flash of insight is as likely to relate to a distant memory, an old article or conversation that made an impression, as to the ideas of current co-workers.

Mr. Isaacson tries to get around this problem by simply redefining collaboration so that the term loses all practical meaning. “Collaboration,” he writes, is “not merely among contemporaries, but between generations.” It is fascinating to learn how the pioneering computer programmer Grace Hopper was inspired by a paper written by Ada Lovelace (Lord Byron’s daughter) 100 years before. But does it really mean anything to say that they were collaborators? The essence of collaboration is the joint pursuit of a shared goal. This is not possible inter-generationally.

The idea that human knowledge is best promoted by kumbaya-like sing-alongs among teams “that brought together people with a wide array of specialties” has a sentimental attractiveness. But experience in business and government suggests that interdisciplinary undertakings as often as not yield lowest-common-denominator pabulum. The notion that somehow in scientific domains such interactions will have talismanic results is not credible. Here Mr. Isaacson has an important insight: Collaborative teams need to have a shared objective, usually provided by a passionate and visionary leader. “The Innovators” is filled with examples of once-prolific organizations like Bell Labs and the Moore School that disintegrated once they lost the leaders who defined and enforced common institutional aims.

Mr. Isaacson’s feel for the personalities involved is unfortunately not matched with a corresponding understanding of the basic economics of the businesses created. This weakness manifests itself in ways small and large. Apple’s closed, integrated system, for instance, is contrasted with Microsoft’s licensing of its software to all manufacturers.

“Apple’s approach,” Mr. Isaacson writes, “led to more beautiful products, a higher profit margin, and a more sublime user experience.” In fact, Microsoft’s margins have been consistently higher than Apple’s ever since Microsoft went public in 1986 (three times higher on average). Indeed, Steven P. Jobs’s crucial business decision that saved the then-unprofitable Apple when he returned in 1997 was to announce a partnership with Microsoft to support Office for Mac.

More significantly, there is no balance to the later section dealing with the combined impact of the explosion in computing power and the emergence of the Internet. Rather than explore the full range of innovations that have transformed the economy and society as a whole, Mr. Isaacson focuses narrowly on applications that relate to journalistic enterprises. For instance, how the emergence of cloud computing has enabled insurgents like Salesforce.com to challenge incumbent giants like Oracle is left unexplored. And within the consumer Internet, content dwarfs commerce in his telling. The rise of Amazon, eBay and Priceline, whose combined market value now exceeds a quarter-trillion dollars, is mentioned in passing or not at all. By contrast, many pages are devoted to the history of Blogger, the tiny web self-publishing business bought by Google for scrap value in 2003, a few years after the company had to lay off its staff.

The treatment of Google itself reflects Mr. Isaacson’s blind spots. The history of the collaborative partnership between its founders, Sergey Brin and Larry Page, is well presented. But the story abruptly stops after our heroes develop the PageRank algorithm that revolutionized search. Left unsaid is the fact that Google would be a tiny fraction of its current size if it had simply sold banner ads on search results pages as planned.

To this day, the vast majority of Google’s $60 billion in revenue comes from AdWords, its pay-per-click ad auction technology. The basic idea underlying AdWords, however, was developed by Bill Gross at IdeaLab, not Google. Litigation and a settlement followed.

That many of Mr. Isaacson’s sweeping statements about the nature of technological innovation should provoke objections is not a dispositive criticism of “The Innovators.” Indeed, the fact that the basis of those objections can be found in the nuanced and detailed narrative of the crucial moments in computing history is the book’s core strength. “The Innovators” provides the tools necessary for a thoughtful national discussion about how we as a society can best spur innovation. If it incites such a dialogue, that would be a true innovation.
