The Innovators, by Walter Isaacson
Like Isaacson’s biography “Steve Jobs,” this book was wonderfully written. It was enjoyable, easy to read, and informative. I highly recommend it to anybody who is interested in understanding the history of computers or the internet. Rather than focusing on the technical details of these digital technologies, Isaacson focuses on the people who drove the innovations. I think that it’s easy for us to take modern computers and technology for granted and view them simply as “magical black boxes.” However, it’s good to remember that computers, smartphones, and Google are all relatively new technologies.
Digital revolution timeline
Computer -> vacuum tubes & electromechanical relays -> stored-memory -> transistors -> microchips -> video games -> internet -> personal computers -> email -> world wide web -> wikis -> Google
Ada Lovelace published the first computer program in her “Notes” (1843), which discussed how to calculate Bernoulli numbers on Babbage’s Analytical Engine. Ada combined poetic imagination with scientific thinking. However, she became addicted to gambling and opiates. She defied authority and was a bit of an outcast among other women. In general, the geniuses all had a rebellious side.
Alan Turing wrestled with the idea that the human mind was indistinguishable from a deterministic machine. He believed that computers should be able to perform any action specified by “a table of instructions” (1935). In other words, computers should be general-purpose machines, not constrained to a single function, that could complete any logical operation. In 1937, he published “On Computable Numbers, with an Application to the Entscheidungsproblem.” He was gay, and in 1954, two years after being convicted for homosexual acts, he committed suicide using a cyanide-laced apple.
ENIAC (Electronic Numerical Integrator and Computer) was proposed by Mauchly and Eckert (1943) and completed in 1945. Since hardware was valued over software in the early days of computing, the programmers were primarily women with math degrees. WWII was responsible for funding Mauchly and Eckert’s machine. Mauchly was a showman and a physics teacher, and he liked good food, good liquor, women, the intelligent, and the unusual. Eckert was an engineering wiz with a passion for perfection. They served as the perfect counterbalance for one another – Eckert would micromanage and drive for perfection, whereas Mauchly would make jokes and make people feel appreciated. ENIAC used a decimal system, as opposed to a binary system, for its calculations. It contained 17,468 vacuum tubes, weighed 30 tons, measured a hundred feet long by eight feet high, and consumed massive amounts of energy. But it was 100x faster than any other computer at the time. ENIAC was programmed by plugging and unplugging large cables. Although it was a general-purpose computer, it did not have modern software capabilities. ENIAC served as the foundation for all modern computers.
Grace Hopper earned a PhD in math from Yale in 1934. She valued clear communication, and was consequently charged with writing the user manual for the Mark I, the world’s most programmable computer at the time. However, the Mark I used electromechanical relays and was very slow compared to the ENIAC, which used electronic vacuum tubes. Hopper was a pioneer in developing subroutines, and her team popularized the term “bug” after they found a moth smashed in one of the electromechanical relays.
John Von Neumann was a child genius who could memorize pages of the phone book, divide eight-digit numbers in his head, recall pages of novels and articles verbatim, and speak many different languages. He was always well-dressed, loved throwing parties, and drove recklessly. He recognized the importance of stored programs and read-write memory. He was known for aggregating ideas and leading the development of EDVAC, which improved upon ENIAC with a binary computing system and stored programs.
William Shockley was born with a ferocious temper, and although he was smart, he scored only 120 on an IQ test, not high enough to be considered a genius. His most notable skill was the ability to visualize how quantum theory explained the movement of electrons. He developed the idea of a solid-state replacement for vacuum tubes, which led to the development of the transistor. Under Shockley’s distant guidance, Brattain and Bardeen built the first transistor. However, Shockley was jealous that he did not receive credit for the invention, so he devised his own transistor design in an attempt to “one-up” Brattain and Bardeen: the n-p-n junction transistor. The two n-layers were doped to have an excess of electrons, and the middle p-layer was doped to have an excess of “holes.” Applying a small voltage to the p-layer controlled the flow of electrons between the top and bottom n-layers, letting the device amplify or switch a current. Shockley moved on to found Shockley Semiconductor in Palo Alto, California, thereby seeding present-day “Silicon Valley.” Shockley is a textbook case of bad leadership – he was secretive, authoritarian, and paranoid. This caused his employees to rebel and form their own company.
Robert Noyce was recruited by Shockley, but under Shockley’s poor management, Noyce joined Gordon Moore and six others, together deemed “the Traitorous Eight,” to form Fairchild Semiconductor (1957), which received $1.5 million in funding from Sherman Fairchild. Noyce was excellent at everything he did, good looking, and well-liked. He was a champion diver, an excellent math and physics student, played oboe in the band, sang in the chorus, and developed circuits for the model airplane club. At Fairchild Semiconductor, he developed the idea of building a microchip on a single piece of solid silicon.
Jack Kilby was gentle, taciturn, and curious. At Texas Instruments, he built the first working integrated circuit on a piece of germanium (1958), at roughly the same time that Noyce developed the idea of the silicon microchip at Fairchild Semiconductor.
Venture capitalism
Eventually, Noyce was ready to leave Fairchild Semiconductor. He and Moore received funding from Arthur Rock to found Intel (1968). Arthur Rock’s investment was the foundation of modern venture capitalism, that is, taking equity stakes in entrepreneurs who are unable to get a bank loan. In 1971, Intel revealed the first microprocessor, the Intel 4004, and it was wildly successful.
Steve Russell
The value of microchips was first realized by geeks and hackers who loved pranks and games. Most notable among them was Steve Russell, who developed the first computer video game, Spacewar, while studying at MIT. Spacewar was created as an open-source game, and it received contributions from many different programmers. After Spacewar came Atari and Pong. Due to Pong’s simplicity, it became massively popular.
Joseph Carl Robnett Licklider (“Lick”) was an only child who was enthusiastic about model planes and cars. He was also passionate about art; he would go to museums and analyze each brushstroke of a painting. Lick’s outstanding ability was to combine technology with psychology and human factors, and he had an eye for recognizing talent. His vision was that computers should be tied together in a network and speak a common language. He was the pioneer who envisioned the internet.
The internet was formed through collaboration among the military, universities, and private corporations. At ARPA, many engineers worked together on a method for networking computers, which became known as ARPANET. The idea was that control should be completely distributed rather than centralized, and that every node in the network should be able to send messages to, and interpret messages from, every other node. To satisfy this requirement, messages were broken into small packets, each of which could take its own route to the destination. Paul Baran had outlined the idea in “On Distributed Communications” (1964). With the TCP/IP protocol for sending and interpreting packets within the network, the concept of the internet was born. Although the internet was officially created in the 1970s, it was restricted to military and academic organizations until personal computers were developed by Bill Gates and Steve Jobs.
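The core idea of packet switching can be sketched in a few lines of Python: a message is split into small, numbered packets that may arrive in any order and get reassembled at the destination. This is only an illustration; the packet size and field names here are made up and are not part of any real protocol.

```python
# Sketch of packet switching: split a message into numbered packets,
# let them arrive out of order, and reassemble them by sequence number.
PACKET_SIZE = 8  # illustrative; real network frames are far larger

def packetize(message: str) -> list[dict]:
    """Break a message into small packets, each tagged with its position."""
    chunks = [message[i:i + PACKET_SIZE] for i in range(0, len(message), PACKET_SIZE)]
    return [{"seq": n, "total": len(chunks), "data": c} for n, c in enumerate(chunks)]

def reassemble(packets: list[dict]) -> str:
    """Restore the original message regardless of arrival order."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return "".join(p["data"] for p in ordered)

packets = packetize("packets can travel over any route")
packets.reverse()  # simulate out-of-order arrival
print(reassemble(packets))  # the original message comes back intact
```

Because each packet carries its own sequence number, no central controller is needed and any route through the network works, which is exactly the distributed-control requirement described above.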
Bill Gates was an extremely intelligent, intense, and rebellious child who came from an affluent family. Instead of being obsessed with computer hardware, he believed in the importance of software. A true nerd, he became obsessed with coding and befriended Paul Allen, who had similar passions. One of their first projects was to write a traffic-data analyzer, Traf-O-Data, on Intel’s 8008 microprocessor. Gates also had a love for extreme sports. When he was not programming, which he seemed to do unceasingly, he would take occasional breaks to go extreme waterskiing or drive his red Mustang dangerously fast. Most notably, while attending Harvard, Gates and Allen wrote the first high-level programming language for a microprocessor. Their BASIC interpreter ran on the Altair’s Intel 8080 microprocessor, and it required less than 4K of memory. One of the challenges of completing the BASIC interpreter was handling floating-point numbers and scientific notation. This same problem was one that Apple’s Steve Wozniak was unwilling to solve, due to its complexity. However, Gates met Monte Davidoff, who had a solution; this was one of the benefits of attending a school full of geeks and nerds, like Harvard. After two years at Harvard, Gates dropped out. Using the foundation of BASIC, Gates founded Microsoft with Allen (1975).
Steve Jobs was also an extremely intelligent child, and he knew it. He embraced hippy culture, unusual diet fads, and personal computers. He partnered with Steve Wozniak to found Apple. For a more comprehensive analysis, see the biography “Steve Jobs,” written by the same author, Walter Isaacson. Unlike Gates, who believed software should be licensed to run on anyone’s hardware (but never given away for free), Jobs wanted to control all aspects of the product, keeping both hardware and software fully proprietary.
Linus Torvalds grew up in Finland. He was exceptional at math and physics, but he had few social skills. Torvalds enjoyed programming in assembly language, the human-readable form of machine code and the closest language to the computer’s hardware. Hiding in his dark basement, he developed the Linux kernel (1991). Torvalds also believed that software should be open-source, and he embraced Stallman’s GNU General Public License (GPL; “GNU” itself stands for “GNU’s Not Unix”), which granted everyone permission to run the program, copy the program, modify the program, and distribute modified versions – but not permission to add restrictions of their own. Although Torvalds agreed with the concept of open-source, he also believed that developers should be allowed to get paid for their work, unlike Stallman, who advocated for free software.
AOL
The first uses of the internet were to send messages to other people. The first email was sent over ARPANET in the early 1970s. The entrepreneurial William von Meister, the practical Steve Case, and the disciplined Jim Kimsey together founded America Online, which became known as AOL. AOL was primarily a social networking platform, but it also offered news, sports, and weather. After AOL, the final barrier to making the internet accessible to everyone was government regulation. Up to 1993, the internet and the personal computer were developed in parallel. Anybody could buy a personal computer, but only universities and military organizations had access to the internet. Under the leadership and legislation of Al Gore, most notably the National Information Infrastructure Act of 1993, the internet was made widely available to the general public. September 1993 became known as the Eternal September.
Tim Berners-Lee invented the World Wide Web. He was born in 1955, the same year as Steve Jobs and Bill Gates, and he recognized that computers were very good at crunching numbers but not very good at making connections between random ideas. As a child, he was fascinated by the book “Enquire Within Upon Everything.” To link documents and information on computers, he invented hypertext links, as well as a convention for naming each linked document, called Uniform Resource Locators (URLs). Additionally, Berners-Lee invented the Hypertext Transfer Protocol (HTTP) for exchanging hypertext, and the Hypertext Markup Language (HTML) for creating web pages. In 1991, Berners-Lee made the first public announcement of the World Wide Web. It quickly became popular and filled with blogs and wikis.
Larry Page loved music and computers. He went to Stanford for graduate school, where his PhD thesis explored how to mathematically categorize and rank all of the web pages on the internet. Page worked with his similarly brilliant friend, Sergey Brin, to develop a method of tabulating links and ranking their relative importance. They called their search engine “Google” (1998).
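The link-ranking idea Page and Brin developed became known as PageRank: a page is important if important pages link to it. A minimal power-iteration sketch of that idea, using a made-up four-page link graph (the page names and parameter values are illustrative only), might look like:

```python
# Minimal PageRank sketch: iterate "a page's score is the sum of score
# flowing in from the pages that link to it," with a damping factor.
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores
    for _ in range(iters):
        # every page keeps a small baseline score (the "random surfer" jump)
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            # each page splits its current score evenly among its links
            for target in outgoing:
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# Hypothetical four-page web in which every page links to "home".
web = {"home": ["about"], "about": ["home"], "blog": ["home"], "shop": ["home"]}
scores = pagerank(web)
print(max(scores, key=scores.get))  # "home" ranks highest
```

The key design insight is that the ranking emerges from the link structure alone; no human has to judge which pages matter, which is what let the method scale to the whole web.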
A few last thoughts
Influential institutions: Bell Labs, University of Pennsylvania, MIT, IBM, RAND Corporation, Intel, Xerox PARC, ARPA, Stanford
Innovation requires 3 parts
A great idea – you don’t have to worry about somebody stealing a truly innovative idea, because truly innovative ideas go against the grain
Great engineering to bring the great idea to reality
Great business acumen to bring the great engineering to the public
The greatest innovators combined a love for technology with a love for the humanities. They combined science and engineering with art. They were passionate about fine art, music, and poetry. Most of them were extremely intelligent, had a rebellious side, and enjoyed tinkering with electronics and software. They also formed partnerships with similarly talented individuals who had complementary skills.
paean: a joyous song or hymn of praise
ecumenical: of, relating to, or representing the whole of a body of churches
lagniappe: a small gift given to a customer by a merchant at the time of purchase
ergodicity: relating to a process in which every sample is equally representative of the whole
esprit de corps: the common spirit existing in the members of a group and inspiring enthusiasm, devotion, and strong regard for the honor of the group
lapidary: a cutter, polisher, or engraver of precious stones other than diamonds