- Gordon Bell (H'10) took risks throughout his career, and computer science is better because of it
Five times a year, computer science professionals gather at TTI/Vanguard conferences to discuss issues such as cybersecurity and cloud computing. At one such conference in 1998, Nicholas Negroponte, founder of the Media Lab at the Massachusetts Institute of Technology, gave a presentation about the Internet. He predicted that by 2001, one billion people would be using the Internet.
Sitting in the audience was C. Gordon Bell of Microsoft Research's Silicon Valley lab and a fellow member of the Vanguard board. As he listened to Negroponte, Bell began running numbers through his head. Something didn't add up.
So he called Negroponte's bluff--Bell didn't think there would be a billion users until 2003. And he went further--he publicly bet him $1,000 that he was wrong.
The two exchanged email to hammer out the details, with Bell sending Negroponte a graph predicting the number of Internet users based on his 1995 figures. (Bell loves graphs and charts. He organizes his life with the help of links, charts and PowerPoint presentations. Ask him a simple question and you're likely to receive a detailed table in return.) In 2001, with Bell poised to win the bet, Negroponte upped the ante--he bet another $5,000 that a billion people would be on the Internet by 2002. Once again, Bell won, though he says he still hasn't collected.
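The arithmetic behind such a graph is simple compound growth: start from a base-year user count, apply an annual growth rate, and find the first year the projection crosses a billion. Here is a minimal Python sketch of that calculation; the base figure and growth rate below are purely illustrative assumptions, since the article doesn't give the numbers Bell actually used.

```python
import math

def year_crossing(base_year, base_users, annual_growth, target):
    """Return the first year a compounding projection reaches target users."""
    n = math.ceil(math.log(target / base_users) / math.log(annual_growth))
    return base_year + n

# Hypothetical inputs: ~16 million users at the end of 1995,
# growing roughly 70% per year (illustrative figures only).
print(year_crossing(1995, 16e6, 1.7, 1e9))  # → 2003
```

With these made-up inputs the projection happens to cross a billion in 2003, the year Bell picked; change either assumption and the crossing year moves, which is exactly why the two men could disagree.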
A true gambler doesn't lose track of his bets, and Bell is a gambler who knows a good bet when he sees it. While others take chances on boxing matches, horse races or Super Bowls, Bell bets on computers. He seems prescient when it comes to computer trends--so prescient that his "Bell's Law of Computer Classes," formulated in 1972, still holds true. Bell's Law predicts that every 10 years a new class of computers will emerge.
Every decade since, a transformative new technology has indeed appeared. The 1970s had minicomputers; the 1980s had personal computers; the 1990s had Internet-scale computers; in the 2000s cell phones and PDAs acquired broad computing capabilities; and the 2010s have cloud computing. Bell has now extended his law to cell phones and mobile devices--he optimistically predicts (but won't bet) that within five years, cell phones will store up to a terabyte of data, allowing the handheld devices to contain nearly everything a person sees and hears.
- Online-Only: Gordon Bell's May 16, 2010 commencement address to SCS graduates.
- And there's more by and about Gordon Bell from the CMU archives in our Web Extra.
Throughout his 50-year career, Bell's ability to predict computing trends has enabled him to pioneer the technologies that have propelled computer science forward. This clairvoyance has helped establish him as the father of computer architecture. In recognition of his lasting contributions to computer science and the world, Carnegie Mellon awarded Bell an honorary doctorate in science and technology at May's commencement ceremony.
"The man is smart--just down to his toes, he is smart," says William Wulf (H'99), AT&T Professor of Computer Science at the University of Virginia, who was a member of Carnegie Mellon's computer science faculty with Bell in the early 1970s.
One of Bell's more recent collaborators, Jim Gemmell of Microsoft Research, calls Bell a "geek" in the best sense of the word. "He loves the technology, he loves the practical, useful side and he gets very excited," says Gemmell, who also appreciates Bell's personal knowledge of the industry. "Here is the guy who (took the industry) from mainframes to minicomputers," Gemmell says. "He has a perspective on how things fade and how things emerge."
Bell grew up in Kirksville, Mo., where at age 6 he started working for his father's electrical contracting company. Before he was 10, Bell was crawling into tiny spaces under homes to run wiring and had built a working electric motor from scraps of tin. He knew he wanted to be an engineer; when his family took vacations, Bell asked to visit power stations with turbines. After high school, he enrolled at MIT, earning a bachelor's degree in engineering and then his master's degree in 1957.
"This was only 10 years after the first major computer had been built and I was right (there) at the beginning," he says. "As far as I was concerned, they were the most interesting artifact out there."
After Bell completed a Fulbright Scholarship in Australia, Ken Olsen and Harlan Anderson recruited him to their fledgling company, Digital Equipment Corp., better known as DEC, in 1960. There, Bell began designing computer hardware, creating input-output devices for the newly launched PDP-1 (for "Programmed Data Processor") and later developing the PDP-4 and the PDP-6. The PDPs were the first commercially successful mid-sized computers, intended to bring automation and data processing to businesses that couldn't afford larger mainframes produced by bigger companies such as IBM. In fact, the PDP-6 turned out to be DEC's first "big" machine--among the first commercially available timesharing systems, it used 36-bit "words" and 18-bit memory addressing--features then generally confined to much larger machines.
But while he enjoyed building computers, after six years Bell felt he needed a new challenge. "At the time, I really wanted to think more about where computing was going," especially in regard to computer architecture, Bell says.
Ivan Sutherland (E'59, H'03), a colleague at DEC, told Bell about Carnegie Tech and introduced him to Alan Perlis (S'42)--head of the newly formed Department of Computer Science--and Allen Newell (GSIA'57), then the Institute Professor of Systems and Communications Sciences. Bell liked what he saw--particularly the chance to help shape the new department.
As late as 1966, when Bell arrived at Carnegie Tech as an associate professor of computer science and electrical engineering, the faculty was still creating the CS curriculum. He remembers sitting on "half a dozen" national committees that were creating the requirements for CS degrees. "Looking back, we certainly were developing the foundation of what we were going to teach and how we were going to teach digital systems design," he says.
At Carnegie Tech, Bell collaborated with Newell on the groundbreaking book Computer Structures: Readings and Examples, published in 1971. Wulf calls the book "one of the first influential texts on computer architecture," adding, "that in and of itself has had an enormous impact on computer science."
In the fall of 1968, not long after Carnegie Tech merged with Mellon Institute to form Carnegie-Mellon University, Wulf joined the CS faculty. His office was adjacent to Bell's, and the two learned they shared similar interests in improving and streamlining computer architecture. They'd soon be able to apply their research to a real-life problem, close to home. Artificial intelligence--in particular, speech and vision processing--had emerged as major fields of inquiry at CMU, but the university's existing computers weren't capable of handling the necessary calculations or the volume of data. A brand-new Digital PDP-10 helped, and the university originally planned to acquire several and network them together, but a lack of funding was holding that back.
Instead of trying for more PDP-10s, CMU took advantage of the interest in system architecture among Bell, Wulf and other faculty members, and made a radical proposal to the Defense Department's Advanced Research Projects Agency. CMU would acquire 16 processing units from the newly introduced (but somewhat less powerful) PDP-11 minicomputers and link them together with a crossbar switch, similar to those used in telephone networks. With $1.4 million in ARPA funding, they created a precursor to the multi-core, parallel-processing computers that now, 40 years later, are becoming ubiquitous. The new computer was C.mmp, and the operating system kernel was called Hydra.
No one in 1971 had yet built any machine using more than two processors, says Wulf, who led the development along with Bell, Newell, Raj Reddy, Bill Broadley and Ron Rutledge. Although Wulf was responsible for Hydra--the software side--he credits Bell with the concept for C.mmp.
"There is no question that it was sparked by Gordon's essential notion that there could be this 16-processor computer," Wulf says. "He really thought in very innovative ways. The notion of building a 16-processor computer was just not something that anyone else had thought about, and it was his idea to do it."
Dan Siewiorek, now Buhl University Professor of Electrical and Computer Engineering and Computer Science at CMU, was a Stanford University grad student at the time, using Bell's textbook on computer architecture. Siewiorek met Bell for the first time when Bell visited Stanford, just about the time that the C.mmp project was getting underway. By 1972, Siewiorek had followed Bell to Carnegie-Mellon. He eventually led the development of C.mmp's successor, Cm*, which linked 50 processors.
Siewiorek, who recently stepped down as director of the Human-Computer Interaction Institute, found in Bell a fascinating thinker and innovator. "Gordon would be thinking about several things at the same time and all the thoughts would come out intertwined," he says. "I would replay his thoughts over and over in my mind until I could figure out his unique perspectives."
But by the time C.mmp was coming online in 1974, Bell--though still involved in the project--was back at Digital, working on further development of the 16-bit PDP-11. The original concept for the PDP-11 came from the mind of Gordon Bell, too, Wulf says. At the heart of the PDP-11 was a universal bus called, naturally, Unibus. Instead of requiring dedicated input-output ports for every device, the PDP-11 placed all devices on the shared bus and mapped them to memory addresses. And instead of dedicated input and output instructions, it used an orthogonal instruction set, allowing programmers to use a simple "move" command to put any piece of data into any part of memory, including addresses that actually referred to a device. In this way, data could even be transferred directly along the Unibus from an input to an output. It was easy to learn, easy to operate, and even easy to put together. Programmers and users loved it.
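The memory-mapped I/O idea described above can be sketched in a few lines. This is an illustrative Python model, not a PDP-11 emulator; the addresses and the toy output device are hypothetical, but it shows how one uniform "move" works whether the destination is memory or a device register.

```python
# Sketch of Unibus-style memory-mapped I/O: device registers occupy ordinary
# addresses, so a single MOVE handles memory-to-memory, memory-to-device,
# and device-to-device transfers alike.

class Bus:
    def __init__(self, size=0x10000):
        self.memory = [0] * size
        self.devices = {}           # address -> (read_fn, write_fn)

    def map_device(self, addr, read_fn, write_fn):
        self.devices[addr] = (read_fn, write_fn)

    def read(self, addr):
        if addr in self.devices:
            return self.devices[addr][0]()
        return self.memory[addr]

    def write(self, addr, value):
        if addr in self.devices:
            self.devices[addr][1](value)
        else:
            self.memory[addr] = value

    def move(self, src, dst):
        """One uniform instruction: copy a word from any address to any."""
        self.write(dst, self.read(src))

# A toy output device mapped at a hypothetical address, like a console register.
output = []
bus = Bus()
bus.map_device(0xFF00, read_fn=lambda: 0, write_fn=output.append)

bus.write(0x0100, 65)       # store a word in plain memory
bus.move(0x0100, 0xFF00)    # the same MOVE sends it to the device
print(output)               # → [65]
```

The payoff of the design is visible in the last two lines: the program never needs a special "output" instruction, because writing to the device's address is the output.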
"The design of the PDP-11 arose from a lecture (Bell) gave at MIT," Wulf says. "It was a radically different way to access memory and it was brilliant. He just has this knack of seeing things in a different way that's very constructive." In 1970, the PDP-11 became the second machine type (after the PDP-7) on which the creators of Unix ran their new operating system. The wide acceptance of Bell's PDP-11 no doubt contributed to Unix's longevity.
As advanced as the PDP-11 seemed, Bell was already looking ahead. Soon after arriving back at Digital as vice president of engineering, he wrote a one-page memo outlining why the company needed to develop what became known as "VAX," for "Virtual Address Extension." Like the PDP-11, VAX deployed the Unibus and an orthogonal instruction set, but VAX would use 32-bit addressing (implying huge amounts of memory for the day). Bell enlisted some of his former CMU graduate students to work on the computer, including Bill Strecker (E'66, '67, '71), who designed the architecture. The first machine--installed at Carnegie-Mellon in 1977--could execute 1 million instructions per second and had clustering capabilities that still rival those of modern operating systems. The VAX series became Digital's most successful product line ever.
As VAX revolutionized the minicomputer industry, Bell's reputation grew, and more and more organizations came to him for his expertise. In the early 1980s, along with Xerox and Intel, he helped make Ethernet the standard for local area networks. Though Ethernet was initially developed at Xerox, Bell notes it shares many of the features of the Unibus. "A Unibus could be up to 15 meters long," he says. "If you think about it, Ethernet is just a 2.5-kilometer Unibus."
In 1987, Bell joined a task force studying ways to open ARPANET--a closed network of computers mainly linking universities and government agencies--to a wider array of researchers in fields beyond computer science. In a paper published in IEEE Spectrum, Bell said scientific progress required a high-speed worldwide data network, and he called on the U.S. government to provide the initial funding and access to the existing ARPANET protocols. Soon after, the first commercial users began linking to the ARPANET, and the Internet was off and running.
Though Bell has shown remarkable foresight when it comes to computer architecture, it turns out that some things escape his predictions. He had no idea that ARPANET would transform into the Internet as it exists today.
"That was one that took me by surprise," he says. "Nobody had a view that, 'Oh my God, we were going to have all these computers and anyone anywhere could access these computers and there was going to be a plethora of new stuff.'"
Bell's energy seems boundless and contagious. He holds himself and his colleagues to high standards. He'll thump on a table and raise his voice if someone presents a poorly conceived plan. Yet his desire to excel encourages others to do their best. When Bell moved to Microsoft Research in 1995 to work on telepresence technologies, the company hired Gemmell to be--as Gemmell puts it--his "wingman." Gemmell didn't know much about Bell. He found out fast. "It was amazing to come to work and learn what an amazing guy he was," he says.
Like Siewiorek, Gemmell has come to appreciate Gordon Bell's unique thought process and way of speaking. When Bell talks, ideas crash together so that a sentence starts on one topic and ends on a completely different subject. "He gets excited easily when we talk about ideas," Gemmell says. "I remember one person asking, 'Is he starting to get old? He can't complete a sentence before he goes onto the next idea.'" Gemmell explained that no, Bell has never finished his sentences--he's read Bell articles from 40 years ago, he says, and even then Bell didn't always finish his thoughts.
Wulf says that non-linear thought process is one reason for Bell's success. "He really generates ideas that are not part of the conventional wisdom and not in the main stream of thought," Wulf says.
At Microsoft, Bell and Gemmell first worked on gaze problems in teleconferencing. When people participated in telepresentations, they couldn't make eye contact with the people watching them, which made them less effective. As they discussed the challenges they were facing, Bell mentioned that if people had digital presences in their workplaces, they should have "digital file cabinets," too.
That eventually prompted Bell to recall Vannevar Bush's legendary article from the Atlantic Monthly, "As We May Think," which suggested that at some point in the future, people would record everything they did or saw in an electronic filing system that Bush called a "Memex." With large amounts of storage now available at low cost, Bell decided a "Memex" was finally a practical reality. He could digitally index his life. He began scanning documents, and when the task became too daunting, he hired a full-time assistant. Gemmell developed software to organize and store data for the project, which Bell dubbed MyLifeBits.
"Our assignment was for Gordon to live it and for me to build the software," Gemmell says.
Bell didn't simply scan his work files--he scanned everything in his life from his birth certificate to his health records to his vacation pictures to the covers of his favorite record albums. He collects screenshots of every Web page he visits. Bell doesn't possess anything that he doesn't try to digitize. For a while, he wore a camera with an infrared sensor that detected when Bell was talking to another person and then took the person's picture. Bell also records phone conversations for future reference. Bell is always looking for ways to improve his life through technology. Now, since his life is digitized, he can never truly forget any piece of information--he can search through his records as fast as other people can Google their own names.
It's typical Gordon Bell innovation, Wulf says. "That is the kind of thing no one else is doing or thinking about," he says.
MyLifeBits differs from blogging or posting tell-all status updates on Facebook or Twitter, Bell says. "'Life-logging' is not life 'blogging,'" he says. "My biggest concern is privacy. I don't recommend people putting a lot of their life on the Internet." But as massive amounts of storage become cheaper and more readily available, Bell suspects more people will digitize their lives.
Digital health records kept throughout a person's life would be an enormous help to doctors as that person ages, Bell says. He's convinced that "life-logging" will eventually become commonplace. He might bet on it--if he finds any takers. Last year, Bell and Gemmell wrote about their research in the book Total Recall: How the E-Memory Revolution Will Change Everything.
When it comes to betting, Bell has an amazing track record. As with everything else, Bell has kept meticulous track of his wagers, and has a chart indicating that he's only ever lost twice. But Bell has confided at least one mistake to Gemmell. When Bell returned to MIT after his Fulbright in Australia, he spent a year working on speech recognition technology. He considered pursuing a PhD in speech research, but abandoned it because he thought it wouldn't be a reality for "another 20 years." Besides, Bell says, as an engineer, he wanted to build things people would actually use.
Voice-controlled systems remain rudimentary, and Bell recently told Gemmell, "I was wrong, because it was more like 50 years out."
Otherwise, it's not a good idea to bet against Gordon Bell. He's got the charts to prove it.
Jason Togyer | 412-268-8721 | firstname.lastname@example.org