Time Your Attack: Oracle’s Lost Revolution

In 1995, Larry Ellison announced a computer that would run applications that lived in the cloud. Just one thing: The cloud didn’t exist yet.
Cloud sculpture with cables attached. Photograph: Mauricio Alejo


Just after midnight on August 24, 1995, a student named Jonathan Prentice walked into a bookshop in Auckland, New Zealand, pulled out 200 New Zealand dollars, and became the first owner of Windows 95. Turning to a reporter, Prentice declared: "I will be able to play solitaire and send faxes at the same time." And with that rallying cry, the Windows 95 craze exploded. Lines formed around the world as consumers jostled to buy copies; journalists in Poland prepared to take a Microsoft-sponsored submarine ride under the Baltic Sea; The Times of London handed out free papers wrapped in a Windows ad; and the Empire State Building lit up in Windows' signature colors. Within a month, Microsoft's new operating system had sales of more than $250 million and Bill Gates went from being a nerd to ... well, he was still a nerd, but an invincible rock-star nerd. The future was very clear: PCs—running Microsoft software—would be the single most important device in our lives.

In Silicon Valley, Larry Ellison followed the glowing press coverage and fumed. He had lived in Gates' shadow since March 1986, when Oracle, Ellison's database-software company, had gone public just a day before Microsoft. Gates got attention for everything he did, but barely anyone knew Oracle. Windows 95 was the last straw. "There was peace in the Middle East and war in Bosnia the same week," he later groused. "And all that the major networks seemed to cover was people in parking lots waiting up all night to get their first copy of Windows 95." His grudge wasn't just about ego; Microsoft had already begun nosing around the database-software industry, and its mounting war chest meant that it could easily fund a push into Oracle's territory.

Immediately after the Windows 95 launch, Ellison called one of his lieutenants, Farzad Dibachi, to his mansion in Atherton, California. For years, Dibachi—who was responsible for brainstorming new business strategies—had urged Ellison to think more broadly about his company's potential. Now, the two discussed a vision for Oracle that would neutralize Microsoft's main advantage: the dominance of its operating system. They imagined a simple machine that would eschew software installed on a hard drive in favor of accessing applications online. Data—videos, documents, pictures—would be stored in Oracle databases instead of on the computer itself. In place of a robust operating system, this machine would work with programs and files through browsers like Netscape Navigator. Ellison liked the idea, and he and Dibachi started working on a speech so the CEO could share it with the world. The device would be called the network computer.

It was a powerful idea, one that would enchant companies and analysts throughout the IT industry. But it would ultimately fail. In 1999, after spending four years and losing nearly $175 million, Oracle pulled the plug, changing the name of its network computer spinoff to Liberate Technologies and focusing its business on set-top box software for interactive television. (Ellison personally funded another network computer startup that didn't fare any better.) Journalists referred to the idea as "embarrassingly wrong." Gates called it "a fantasy." Even Ellison distanced himself from the debacle. (He declined to discuss it with Wired.) And yet, while his computer may have failed, Ellison's grand vision did not. Just as he predicted, a wave of cheap, underpowered computers with small hard drives and omnipresent Internet connections did challenge the traditional PC. Ellison's vision of widget-based online software has become conventional wisdom. The network computer failed as a product and as a business, but it seeded an idea—and a group of technologists—that would go on to remake the computing world.

We typically think about failure in one of two ways: It's either something to avoid or something from which we can learn simple personal lessons about perseverance and character. Silicon Valley has long patted itself on the back for its acceptance—even promotion—of failure. By encouraging risk-taking, the tech industry allows bad ideas to get out of the way so good ones can take their place. But failure is not just the negative space surrounding success. Failures, too, can change the world—just as fundamentally as successes. They can serve as the necessary first round of innovation, the sacrificial lambs that bleed out an insight or vision or breakthrough before perishing.

Everywhere you look, our world has been shaped fundamentally by failures. North American farmers in the early 1920s failed to cure their hay properly, causing an outbreak of internal hemorrhaging in cattle but eventually leading to the discovery of warfarin, an anticoagulant that has saved countless lives. Preston Tucker produced only 51 cars, but many of his ideas about safety prefigured the era of airbags and three-point seat belts. Xerox PARC invented the modern personal computer but was unable to capitalize on it, letting others take the technology—from the graphical user interface to the word processor—in thousands of different directions.

In a way, seen through a long enough lens, everything is eventually a failure. Every innovation will ultimately be usurped by something that does the same thing more cheaply or more effectively or more elegantly. Everything is failing all the time; it's just a matter of how quickly and how devastatingly. In his 1964 book, The Nature of Design, industrial designer David Pye attempted to lay to rest the very notion of success: "Nothing we design or make ever really works. We can always say what it ought to do, but that it never does," he wrote. "Everything we design and make is an improvisation, a lash-up, something inept and provisional."

The network computer was the embodiment of that: It didn't work as promised, it was too early, and it cost too much. But looking back now, it's hard to see it as anything but a success.

Larry Ellison in 1997, two years into his push for the network computer. Photograph: Corbis

Ellison unleashed his concept on September 4, 1995, just days after he and Dibachi had come up with it. The setting was ironic: Ellison was serving as a warm-up act at International Data Corp.'s European IT Forum in Paris. The headliner? Bill Gates. Some of the analysts and techies in the room didn't even seem to know who Ellison was. It was a familiar if frustrating experience for the Oracle CEO; his company wasn't even close to being a household name.

Ellison refused to be a second-stringer. "A PC is a ridiculous device," he said, launching an attack on Microsoft's core business. He ran down a list of the desktop's deficiencies: It was hard to learn to operate, expensive, overpowered, and—thanks to the arrival of the World Wide Web—increasingly irrelevant. That's why he was ushering in the post-PC era with the network computer, or NC, which Oracle would help build within a year. The simple $500 box would be a stripped-down unit that served one purpose: to connect to the Internet. For the NC, the Web wouldn't be a mere feature but a utility, as fundamental as water and electricity. "What the world really wants," Ellison told the crowd, "is to plug into a wall to get electronic power, and plug in to get data."

Soon, software and hardware manufacturers around the world had signed on to help make Ellison's vision a reality. British computer maker Acorn began ferrying executives to California to discuss the possibility of an Acorn NC. IBM executive Bob Dies persuaded CEO Louis Gerstner to start building the bare-bones machines, eventually establishing a network computer division. "I explained to him that this could mean companies would not have to update every single desktop around their company," Dies says. "He saw the light quickly." Netscape cofounder Marc Andreessen declared the NC "a pretty major new business opportunity," predicting that hundreds of millions of the machines would be in homes and offices within 20 years. Frank Gens, an analyst at IDC, foretold a day when banks would give them away to attract new customers. "Once that happens, it becomes a completely different PC industry," he told BusinessWeek. "If I were Compaq, IBM, or Dell, I'd be thinking hard about that."

Perhaps nobody was as excited as Eric Schmidt, CTO of Sun Microsystems. Within months, Sun built an NC prototype and began developing a lean operating system to run on it. Speaking to U.S. News & World Report, Schmidt couldn't stop raving about the idea's potential. Of course, the NC could greatly benefit his business: As the company responsible for the development of the Java programming language, which would power the Web applications, Sun and its work would gain new relevance. But the ramifications went way beyond Sun, he told the reporter. The NC would usher in a whole new world of computing, setting off a wave of innovation. "The implications are serious," he said. "If this takes off, it will have enormous impact."

Consumers were just as captivated. Oracle executives received voicemails from strangers expressing interest in the machine. The company's salespeople fielded more questions about the NC than about the databases that constituted the bulk of Oracle's business. Internal support teams geared up to prepare for an onslaught of new customers once the rollout went global. In his attempt to take down a competitor, Ellison had stumbled upon a groundswell of pent-up demand. "The NC story just exploded beyond anything I imagined," Ellison said later. "It took on a life of its own."

Obsessive competitiveness may have inspired Ellison's idea, but it also contributed to its downfall. His turf battle with Redmond ended up kneecapping his product. Looking to stem the momentum of Windows, Ellison promised to release low-cost machines within a year. That meant rushing out computers before they were fully developed. When it hit stores in the fall of 1996, the Acorn NC—commissioned by Oracle to be the model around which the new market would coalesce—had an underpowered ARM processor that produced blocky graphics and strained to render a Web page in less than four seconds. IBM's Network Station computers—the flagship corporate version of the NC—didn't fare much better. They were too slow, too limited, and too complicated to coordinate with company servers. Irving Wladawsky-Berger, who ran the Internet division at IBM and ultimately oversaw its NC project, was embarrassed. "We thought we had a full product," he says. "But when we took it to market, we realized it was an alpha."

Oracle's rush to market also meant that the NC hit shelves before the infrastructure existed to support it. The machine was supposed to run lightweight Web applications instead of installed software—and everyone believed Java would be up to the task. But it was never able to support powerful-enough applications. And with wide-scale broadband penetration still many years away, Internet apps didn't stand a chance against local software.

In the run-up to the release, Ellison's Ahab-like obsession with Microsoft fueled his need to be kept apprised of every development. Nonetheless, the globe-trotting CEO would often shift his focus to other projects, leaving his staff to make major decisions about the NC. ("Mr. Larry is near Australia in his sailboat and can't be reached," read one email from an Oracle employee to a member of the Acorn team. "The feeling on the staff here is that the case designs won't fly.")

By 1999, the NC was basically dead. IBM had sold only 10,000 or so of its version and eventually shuttered its NC division; RCA had developed an NC as well, but it, too, sold only about 10,000, nearly all of which the company later recalled. Acorn imploded, and Netscape was crushed by Microsoft's Internet Explorer. Gates delivered his own, gloating coda in late 1998, speaking at the same Paris IT conference where Ellison had first announced the NC. "The network computer is pretty discredited," Gates told the crowd.

The NC offensive may have failed to create a new product category, but Ellison's primary motivation was not to sell millions of computers; he wanted to damage Microsoft while improving his own reputation. And by those standards, the NC was a rousing success. One of its main draws was price—for a fraction of the cost, NC customers would get everything they needed from a home computer. But almost immediately after the NC was announced, PC prices began to plummet, partially in response to Ellison's threat. From the 1970s to the early '90s, the cost of desktop PCs—adjusted for performance—dropped an average of 15 percent a year. Between 1995 and 2000—the NC era—PC prices fell at an annual rate of 28 percent. By the late '90s, consumers could get a full desktop computer for less than $800. For just a few hundred dollars more, the PC could do everything the NC could, and much more. This was bad news for the NC, but it was also bad news for Microsoft's main allies, the PC makers, who had to slash their margins to compete with the phantom product.

Even more important, the NC initiative put Microsoft on the defensive. Instead of pushing into new areas—like Oracle's database-software business—the company was forced to safeguard its own business model. After initially downplaying the threat and importance of the Internet, Gates became obsessed. Rather than attacking Oracle, he went after Netscape in what became an all-consuming fight that nearly drove Microsoft to a government-imposed breakup. Oracle may have spent a ton of money on its NC gamble, but its now-$112 billion database business never faced a serious threat from Redmond.

All the excitement about the NC had also raised Oracle's profile. Ellison was no longer an also-ran; he was lauded as a seer and started getting the same kind of press adulation as Gates. In April 1995, Charlie Rose had Ellison on to talk about the Internet for just a few minutes—sandwiched between discussions of the O. J. Simpson trial and Pope John Paul II. By 1996, Rose had Ellison on as a featured guest. Soon after the NC failed, Ellison spoke with his biographer to revisit the initiative. All in all, he said, he had gotten what he wanted out of it. He was more famous and better respected than ever. He had brushed back Microsoft and established Oracle as a peer and formidable competitor. Sure, the NC itself never caught on, but Ellison didn't seem too concerned. He summed up the entire project in a typically blusterous quote: "As for the network computer, I don't care about it at all."

But many of Ellison's cohorts weren't so quick to dismiss the NC vision. In the following years, they would continue to pursue it. And their efforts would change the computing landscape and ultimately fulfill Ellison's vision—whether he cared about it or not.

In 1997, Eric Schmidt was lured away from Sun to take over ailing enterprise-software company Novell; four years later, he was brought on as CEO of Google. Yet he could never let go of the NC concept. In 2005, he noticed the emergence of Ajax, a technology that enabled Web-based applications to run as smoothly as their shrink-wrapped, locally installed counterparts. It enabled programmers to develop and deploy software in ways that Sun had only dreamed about when creating Java. Almost instantly, Google engineers began building software—most notably Google Docs and Spreadsheets, direct competitors to Microsoft's flagship Office suite.

Last summer, Google announced an even more ambitious project: a lightweight operating system engineered to power inexpensive portable computers that lack hard drives. Called Chrome OS, the software is designed to be barely noticeable. Its sole function is to connect the device to the Web. Sound familiar? "I've been giving the same speech for 15 years," Schmidt says. "But ultimately, the reason the NC didn't work was that the technology wasn't mature enough." Now, he says, that's no longer true. "Chrome is the consequence of the network computer vision."

In 1999, senior vice president Marc Benioff left Oracle to create Salesforce.com. Benioff had been enthralled by the idea of the computer as an "Internet terminal" since 1995, when Ellison discussed the notion with him on a trip to Beijing. Salesforce provided a Web-based alternative to bulky, installed enterprise software—programs that could run on an Internet terminal rather than a full-powered PC. As Salesforce grew more popular in the early 2000s, Benioff became more belligerent, even Ellisonian, in his description of the company's potential impact. He declared Salesforce to be the angel of death for the installed-software industry and ran an ad showing an F-22 emblazoned with "Salesforce.com" shooting down a biplane that had "software" inscribed on its wing. His cockiness would prove merited. Within the past few years, many of Salesforce's competitors—including such powerhouses as SAP and CA—have built their own cloud-based offerings, and cloud has become a Silicon Valley buzzword. (Ironically, Ellison has recently taken to bemoaning the popular but ill-defined term. "Maybe I'm an idiot," he told an Oracle conference in 2008, "but I have no idea what anyone is talking about.")

Now that the Web-software environment has been established, NC-like hardware has begun to proliferate. The first example, of course, is the netbook. Debuting just two years ago, netbooks currently account for more than 20 percent of PC sales. In the 12 months ending in September 2009, sales jumped 77 percent. Yet while the netbook may be the direct descendant of the NC, its cousin, the smartphone, is seen by most alumni of the NC movement as the more powerful force. Every day, iPhone and iPod Touch users download 4.5 million applications—grown-up versions of the widgets that Ellison predicted would run on the NC. "Ellison is often time-dyslexic—right about the fundamental trend but wrong on timing," says David Roux, a partner at private equity firm Silver Lake and a former Oracle executive vice president. "It's hard to look at a $299 netbook and not see the NC vision come to life."

We tend to think of technology as a steady march, a progression of increasingly better mousetraps that succeed based on their merits. But in the end, evolution may provide a better model for how technological battles are won. One mutation does not, by itself, define progress. Instead, it creates another potential path for development, sparking additional changes and improvements until one finally breaks through and establishes a new organism. In his 1988 book, The Evolution of Technology, historian George Basalla argues that machines develop in the same way. "Only a few variants," he writes, "have the potential to start a new branching series that will greatly enrich the stream of made things, have an impact on human life, and become known as 'great inventions' or 'turning points in the history of technology.'"

In October, Microsoft unveiled Windows 7, the latest version of its operating system. This time around, there were no lines in parking lots, no breathless press coverage, no sense that a new computing era had begun. Indeed, some 40 percent of businesses said they had no plans to upgrade. Accessing the Net is what's most important, and no one needs the latest Microsoft OS to do that. But Ellison himself provided the most glaring sign that the computing landscape had changed. Fourteen years earlier, he had reacted to the Windows 95 launch by becoming one of Microsoft's loudest critics and most ambitious would-be competitors. This time, he said nothing. There was no need. The fight was over.


Senior writer Daniel Roth wrote about Demand Media in issue 17.11.