20 Most Important People

Although computers are technology, they are created by people. And the people who create them are not just one-dimensional nerds--in fact, their breadth fuels their innovation. These 20 people have made the greatest impact on microcomputing.

DAN BRICKLIN

Can you imagine doing business without the spreadsheet? Dan Bricklin can't. But, then, he invented it. He got the idea while sitting in a class at the Harvard Business School. As he watched the professor fill in spreadsheets on the chalkboard, he thought, Wouldn't it be nice if you could do that electronically? Bricklin designed the interface, and his partner, Bob Frankston, wrote most of the code. They released VisiCalc in 1979, an act that fomented the desktop revolution. At last, there was something useful to do on a microcomputer.

Did he know at the time how important spreadsheets would be to computing? "Well, you always believe that your product's going to be wonderful and make major changes, but you can't always depend on that. I thought it would be very useful for business, and I tried to design it to be as useful [in] as many different areas as possible."

What about today's sophisticated spreadsheet features? "For any given user, there are things that are superfluous, and for any given user, there are things that are missing. For my needs, just being able to recalculate is 90 percent of the way there. In that case, almost everything is sufficient."
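What does recalculation buy you? A toy model makes it concrete. The sketch below is hypothetical and in Python (VisiCalc itself was written in 6502 assembly); the Sheet class and cell names are ours, purely for illustration.

    class Sheet:
        def __init__(self):
            self.cells = {}  # cell name -> number, or formula (a callable)

        def set(self, name, value):
            self.cells[name] = value

        def get(self, name):
            value = self.cells[name]
            # Formulas are evaluated on demand, so every lookup reflects
            # the current inputs -- that's the recalculation.
            return value(self) if callable(value) else value

    sheet = Sheet()
    sheet.set("A1", 100)
    sheet.set("A2", 250)
    sheet.set("A3", lambda s: s.get("A1") + s.get("A2"))  # =A1+A2
    print(sheet.get("A3"))  # 350
    sheet.set("A1", 500)    # change one input...
    print(sheet.get("A3"))  # ...and the total follows: 750

Change a number and everything that depends on it updates, with no erasing and refiguring -- the electronic chalkboard Bricklin imagined at Harvard.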

As important as VisiCalc was, the decision not to seek patent protection helped spawn an entire industry. With a patent, Bricklin could have controlled the market for 17 years. Great for him; lousy for us. "Seeing the advances that did come about from people trying different things and [being] willing to make compromises that we may not have been willing to make, I don't think the industry would have moved as far as it has."

Lotus bought the VisiCalc rights in 1985. Bricklin has gone on to design other successful, though more specialized, products, but none has revolutionized computing like the spreadsheet.

BILL GATES

Here's one man who needs no introduction. Back in 1975, Bill Gates and a high school buddy, Paul Allen, wrote a version of BASIC that ran in 4 KB on the MITS Altair 8800 computer. Soon they founded Microsoft and were creating versions of BASIC and other languages for various platforms. Their Big Break came in 1980, when IBM contracted with them to supply the Disk Operating System, or DOS, for its new PC. Through an incredible act of charity or stupidity, IBM gave Microsoft the rights to sell versions of DOS to other manufacturers.

Today, Gates is worth more money than the other 19 people on our list put together (and they include several multimillionaires). But we're here to talk technology, not tax shelters. Gates, who is about to launch Microsoft Network, recalls that he and Allen long ago believed on-line services would be the killer application: "We thought they would catch on in the 1970s and the 1980s. We always thought that would be the defining application and it would get the things in people's homes, which definitely turns out to be true but 15 years later than we expected."

Why the delay? "What you can do with 300 baud is tricky . . . Then there was the small problem of a business model, how to deliver an essentially free service to people and get advertising to pay the freight. Finally, PCs lacked critical mass: Unless you get an immense number of people using it, it's of no value . . . We were naive to think that would spark a critical mass."

Any final thoughts? "The last big revolution in communications to have this kind of impact was the telephone. It was a two-way device, and it shrunk the world. The world became a different place."

STEVE JOBS

Unless you've been stuck on a streetcar named Mobius these past 20 years, you know the saga of Steve Jobs: College dropout, garage-shop inventor, cofounder of Apple Computer, ousted in 1985 at age 30, cofounder of Next Computer. During the Apple II's heyday, he stood in the shadow of his partner, the technically superior Steve Wozniak. But his marketing moxie, his love affair with the microphone, and his unrelenting vision for the Macintosh, released in 1984 with a revolutionary GUI, catapulted Jobs into the limelight.

As the name brashly implies, Jobs hoped the Next would be the next killer machine. But with an $11,000 price tag, even a high-tech Billy Graham couldn't win many converts. "We knew we'd either be the last hardware company that made it or the first that didn't, and we were the first that didn't." He's repositioned Next and now wants to be the main man in object technology. "I went to Xerox PARC in 1979, and I saw the Alto. There was a crude graphical user interface on it . . . within 10 minutes it was obvious that all computers would work this way someday. Objects are the same way. Once you understand what objects are, you realize that all software will be written using objects, object technology."

What does this innovator think of today's interfaces? "The Mac has been dead in the water since 1985 in terms of its user interface. And Windows is still a sort of caricature of the Mac. Windows 95 doesn't really get it. The user interface is not very good."

Never short on bombast, Jobs likens today's GUI situation to TV. "You think it's a conspiracy [by] the networks to put bad shows on TV. But the shows are bad because that's what people want. It's not like Windows users don't have any power. I think they are happy with Windows, and that's an incredibly depressing thought."

ROBERT NOYCE

Can you imagine saying Germanium Gulch instead of Silicon Valley? Thank Mother Nature and Robert Noyce for sparing us from that mouth mangler. Here's why.

In late 1958, a young engineer at Texas Instruments named Jack Kilby placed two circuits on a single piece of germanium, hand-wired the interconnects and--presto--created the first IC. Within months, Noyce and company at Fairchild Semiconductor used a planar process they had developed to connect the components on their version of the IC. In so doing, they discovered that the IC's conductivity was better and more controllable when silicon was used instead of germanium. To this day, Kilby and Noyce are both credited as the independent co-inventors of the IC.

Within three years, Fairchild and TI were producing affordable chips in volume using Noyce's process, a manufacturing technique that has undergone minor improvements but remains basically unchanged to this day. ICs were first used in a commercial product--a hearing aid--in 1963. By the mid-1960s, they were used widely throughout the electronics industry. Noyce went on to cofound Intel Corp. in 1968 and served as president and chairman of the board.

In mid-1988, after the U.S. chip industry had been losing market share to offshore competitors for years, Noyce was named CEO of Sematech. The government-industry consortium was established to conduct advanced computer chip R&D on behalf of its members and to advance U.S. competitiveness. It succeeded. Noyce, the son of an Iowa minister, was widely regarded as a gentleman and a scholar. He died at the relatively young age of 62 in 1990.

As an aside, a few years after inventing the IC at Texas Instruments, Kilby helped sound the death knell for the time-honored slide rule as a member of the TI team that invented the first pocket calculator. Kilby still works as a consultant.

DENNIS RITCHIE

It took some chicanery to overcome one of the biggest hurdles to the development of Unix. And we're not talking about some kind of sleight-of-hand code writing.

The story begins with Multics, a joint venture of Bell Telephone Labs, General Electric, and MIT to create an OS for a large computer that would handle up to a thousand simultaneous users. When BTL pulled out of that project in 1969, Dennis Ritchie and his codevelopers, including Ken Thompson, were left eager to build an OS of their own but without a computer to build it on (they were really expensive in those days). They finally suggested to BTL that it buy a PDP-11/20 for a text-preparation project. BTL regarded text preparation as something useful and spat out the seed money for the $100,000-plus machine.

"There was a scam going on," Ritchie once recalled. "We'd promised [to develop] a word processing system, not an operating system. But by the time the full computer had arrived in the summer of 1970, work was moving at full steam on both." And thus was born Unix. The text-processing system was a success, and the patent department at BTL became the first commercial Unix user in the bargain.

Unix, for which Ritchie deserves much of the credit, was one of the major advances in computing, giving the user features and functions that were previously unthinkable. It was not only a great advance but a great simplification, demonstrating that a relatively small OS could be portable, machine independent, and affordable. The advent of the workstation and the growth of networking have cinched Unix's place in computing. Since the late 1970s, Unix has had a profound impact on DOS, the Mac OS, Windows NT, and many others.

Ritchie and Thompson wrote the first Unix Programmer's Manual in 1971. Ritchie developed C, and in the early 1970s, he and Brian Kernighan coauthored The C Programming Language. Ritchie, now in his mid-50s, still works at AT&T research labs, where he is developing OSes, including Plan 9 from Bell Labs.

MARC ANDREESSEN

Less than two years ago, while his classmates were still nursing graduation hangovers, Marc Andreessen, at the age of 22, cofounded Netscape Communications. The other founder is Dr. James H. Clark, founder and former chairman of Silicon Graphics, Inc. The youngest member of our top 20, Andreessen is the latest wunderkind to compile. What Steve Jobs was to the desktop, Andreessen is to the Internet. His Netscape Navigator (née Mosaic) for PCs, Macs, and Unix machines already accounts for more than half of all Web browsing. He led the development of the prototype while he was an undergraduate at the University of Illinois. Unlike some of the other wunderkinder (whose names we won't mention), Andreessen graduated from college.

BILL ATKINSON

If you knew the Lisa like Bill Atkinson knew the Lisa, then you knew a lot more about the Lisa than most of us wanted to know. But from this scarlet woman, named for Steve Jobs's daughter, came a GUI. Atkinson was the chief wizard behind its graphics engine. The Lisa begat the Mac, and the rest is history. Today, as cofounder of Apple spin-off General Magic, Atkinson wants to create technology that he hopes will be welcomed into people's lives, rather than be a source of stress -- technology like Magic Cap. We also fondly recall that he was the chief designer of HyperCard, the software construction kit that put Mac programming tools into the hands of millions of Mac users.

TIM BERNERS-LEE

If the snobs who whine about the Internet's exploding popularity ever form a vigilante posse, the first man they'll hang is Tim Berners-Lee. He's the guy behind the World Wide Web, which he developed at CERN (the European Council for Nuclear Research) in Geneva, Switzerland, so that physicists could swap data easily. Berners-Lee developed the URL, HTML, and HTTP standards, from which he wove the Web. Since launching the Web in 1991, he has often endorsed the idea of people using it for profitable transactions. He's now at MIT, where he directs the World Wide Web Consortium, which deals with Web security and other issues. He deserves a Nobel prize of some sort.
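For the curious, here's roughly what a Web fetch looks like on the wire. This is a hypothetical sketch in Python, not anything of Berners-Lee's; the host is a stand-in, and we use the simple HTTP/1.0 dialect.

    import socket

    HOST = "example.com"  # a stand-in; any Web server would do
    sock = socket.create_connection((HOST, 80))
    # An HTTP request is plain text: a request line, headers, a blank line.
    sock.sendall(f"GET / HTTP/1.0\r\nHost: {HOST}\r\n\r\n".encode("ascii"))

    response = b""
    while chunk := sock.recv(4096):  # read until the server closes
        response += chunk
    sock.close()
    print(response.decode("latin-1").split("\r\n")[0])  # e.g. "HTTP/1.0 200 OK"

The URL names the resource, HTTP fetches it, and HTML tells the browser how to display it -- three small standards that mesh into a Web.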

DOUG ENGELBART

Got patent envy? You'll have a hard time matching this pioneer, who holds 20, most of which are on basic features in microcomputing. Imagine microcomputing without windows; or word processing; or hypermedia, E-mail, and groupware; or the Internet. Imagine microcomputing without Doug Engelbart, now 70, who for years was a fixture at Stanford Research Institute. Engelbart had a vision that computers could be more than giant adding machines; they could be tools for human beings. A few years ago, he founded the Bootstrap Institute, dedicated to getting companies to collaborate on innovation. Comparisons with Thomas Edison do not seem farfetched, which reminds us: He's best known for the first mouse -- a wooden rodent invented in 1963.

GRACE MURRAY HOPPER

As a child, Grace Murray Hopper liked to take apart alarm clocks. One of the first women to earn a doctorate in math from Yale, she joined the Navy in World War II and was assigned to its computational center at Harvard. Amazing Grace later developed the first compiler for Remington Rand's UNIVAC in the early 1950s and led the charge to create COBOL. The Navy recalled her in 1967, and she remained on active duty until 1986. She died in 1992 at the age of 85 with the rank of rear admiral. Anyone who met her could not help but be awestruck by this diminutive firestorm of a human being. One pictures her stuck in purgatory, refusing to enter Heaven until St. Peter agrees to computerize. With a Lucky Strike hanging from her lip, she fires at the grand saint: "Beg your pardon, Sir, but your excuse, `We've always done it this way,' is the most damaging phrase in the language."

PHILIPPE KAHN

French swagger, German determination, jazz artistry -- must be Philippe Kahn. This software swashbuckler writes great compilers, plays David against Microsoft's Goliath, and never bores us. The son of a German father and a French mother, Kahn grew up in Paris. He studied Pascal with Niklaus Wirth, took a degree in math, earned money playing jazz, and developed applications on an Apple II. But Pascal compilers were too slow, so he wrote Turbo Pascal. Then he marketed it. With only $2000 in his pocket, he landed in the U.S. with no green card and no job. He founded Borland International in an office over an automobile repair shop in 1983. Despite the humble abode, Kahn convinced a BYTE ad salesperson to accept on credit a full-page color ad for Turbo Pascal. Priced at a ridiculous $49.95, Turbo Pascal swamped Kahn with orders.

MITCH KAPOR

"Software has been very, very good to me," Mitch Kapor once said. And, we add, Mitch Kapor has been very, very good to software. In 1982, he founded Lotus Development and, with Jonathan Sachs, created Lotus 1-2-3. Dan Bricklin invented the electronic spreadsheet (VisiCalc), but Kapor turned it into a more powerful, yet friendly, business tool. Lotus 1-2-3 remains the world's most widely used application. Given IBM's takeover of Lotus, it's interesting to note that Kapor once tried and failed to interest Big Blue in an exclusive marketing deal for 1-2-3. He left Lotus in 1986. In 1990, he cofounded the Electronic Frontier Foundation, a nonprofit group dedicated to understanding the social impact of the digital revolution.

DONALD KNUTH

Nearly 20 years ago, while Donald Knuth was proofing galleys for the second edition of the second volume in his The Art of Computer Programming magnum opus, it hit him: A book of 0s and 1s doesn't have to be ugly. The result was a 10-year hiatus from his Art series to develop TeX, a typesetting language for scientific publishing, and Metafont, an alphabet design system. Then the prolific scholar/programmer knocked out six books to explain them. (Now there's a word processor.) Knuth, now a professor emeritus at Stanford, has the fourth volume of a planned seven in press. Oh, he's also a biblical scholar, having written 3:16 Bible Texts Illuminated, which examines chapter 3, verse 16 in each of the 59 books of the Bible that contain one.

THOMAS KURTZ

Overkill. That's what Thomas Kurtz thinks of today's software. "The public has been sold the most complicated word processing systems imaginable, when all they want to do is to write a letter." Aching for simplicity in a computer programming language, Kurtz and John Kemeny codeveloped BASIC in 1964. It has its detractors, but BASIC is still bundled on virtually every microcomputer sold. They never copyrighted it, so dozens of variations appeared. This horrified the Drs. K, who dubbed the dialects "Street BASIC." In the 1980s, they formed a company to develop True BASIC, a lean version that meets ANSI and ISO standards. Kurtz is currently a professor emeritus at Dartmouth. Kemeny, once president of Dartmouth, died in 1992.

DREW MAJOR

As Drew Major sees it, "In the next [computer] generation, nothing will not be connected." But what would you expect from Major, chief scientist at Novell and lead architect of NetWare, still the preeminent NOS (network OS)? Fresh out of Brigham Young University in 1980, Major and two buddies took a six-week consulting job at Novell (which was trying to make CP/M machines) and wound up staying 15 years. When NetWare 3.0 shipped in 1989, it contained server-based applications called NLMs (NetWare loadable modules), a great leap forward over the kludgy VAPs (value-added processes) of the previous version. How bad were VAPs? They're the only thing about NetWare that Major ever apologized for. Undoubtedly, his mother taught him to be polite.

ROBERT METCALFE

For five points, what came first: commercially sold PCs or the LAN? Robert Metcalfe knows. He outlined local networking technology in his doctoral dissertation at Harvard. In 1973, he went to Xerox PARC, where he invented Ethernet to connect the Alto computers (never sold commercially) in use there. Thus, the LAN was born before the first PCs were marketed. Today Ethernet connects more than 50 million computers. In 1979, Metcalfe founded 3Com, a networking company. He retired in 1990 and was publisher of InfoWorld for 2-1/2 years. So what does the Father of Ethernet think about the information highway? A fad. "Soon the fad will be over," he says. "Then we can get back to the business of building I-ways -- another 50 years of plumbing."

BJARNE STROUSTRUP

Perhaps because their native tongues are not widely spoken, Scandinavians are noted for their multilingual talents. So it's no surprise that C++ inventor Bjarne Stroustrup, a native of Denmark, rejects any notion of a universal programming language: "The idea of spanning the whole spectrum of programming with one language is absurd." In the mid-1980s, Stroustrup, head of Bell Labs' large-scale programming research department, defined the C++ object-oriented extension of the C language. He also authored two notable works on C++, including The C++ Programming Language. To those who whine about how hard C++ is to use, he says: "It wasn't meant to be learned in 2 hours."

JOHN WARNOCK

Two innovations clearly sparked the desktop publishing revolution: the Mac and John Warnock's PostScript PDL (page-description language). Warnock cut his teeth at Xerox PARC, where he developed graphics imaging standards. In 1982, he and his partner, Charles Geschke, founded Adobe Systems to create pioneering software products for desktop publishing and electronic document technology. As millions of computer users begin to wander the information highway, Warnock sees a day when cross-platform document and graphics standards will be a reality. "I think meaningful document standards will emerge over the next five years. There is a need for an abstraction layer that is independent of operating systems."

NIKLAUS WIRTH

Pascal begat Modula 2. Modula 2 begat Oberon. And Niklaus Wirth begat them all. Wirth, of the Swiss Federal Institute of Technology, likes to quote Albert Einstein: "Make it as simple as possible but not simpler." Much of today's software is overweight and inefficient. Wirth is showing a simpler way with OOP (object-oriented programming). His latest, Oberon (a language and an OS), lets developers reuse built-in data structures without recompiling the entire OS. Applications are replaced by leaner tools that the OS can access on demand. One result: fewer bugs. Need more proof? The Oberon PC version, including a GUI, uses 1.5 MB of RAM; Microsoft Windows 3.1 needs 4 MB.
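The tools-on-demand idea isn't tied to Oberon. Here's a toy illustration in Python (the command table and names are ours, not Wirth's): small commands whose implementing code is loaded only when first called for.

    import importlib

    TOOLS = {  # command -> (module, function) that provides it
        "sqrt": ("math", "sqrt"),
        "title": ("string", "capwords"),
    }

    def run(command, argument):
        module_name, function_name = TOOLS[command]
        module = importlib.import_module(module_name)  # loaded on first use
        return getattr(module, function_name)(argument)

    print(run("sqrt", 2.0))               # 1.4142135623730951
    print(run("title", "oberon system"))  # Oberon System

Nothing monolithic sits in memory; each tool arrives when a command needs it, which is the spirit of Wirth's leaner design.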

STEVE WOZNIAK

Consider Steve Wozniak, the Wizard of Woz, the Ultimate Hacker, one of the great garage inventors of all time. With the millions he earned when Apple went public, Woz no longer works like the rest of us. The Father of the Apple II (don't worry, the other Steve gets some credit, but it was Woz's baby) now throws his energy into helping young people learn about computers. "I believe more and more we should support the people who are not computer experts." He not only spends hundreds of hours teaching but also personally picks up the cost of AOL accounts for about 100 kids. "The worst problem isn't so much students, but teachers really need forced training. It costs money. The school board has to sit back and reprioritize what is going to be taught."