By Patricia Sullivan
Steve Jobs, the co-founder of Apple Inc., who introduced simple, well-designed computers for people who were more interested in what technology could do than in how it was done, died Wednesday at age 56.
In a brief statement, Apple announced the death but did not say where he died. Mr. Jobs suffered from a rare form of pancreatic cancer and had a liver transplant in 2009, and he stepped down as Apple’s chief executive on Aug. 24.
An original thinker who helped create the Macintosh, one of the most influential computers in the world, Mr. Jobs also reinvented the portable music player with the iPod, launched the first successful legal method of selling music online with the creation of iTunes, and reordered the cellphone market with the iPhone. The introduction of the iPad also jump-started the electronic-tablet market, and it now dominates the field.
He also started a chain of retail stores and pushed consumers away from their dependence on floppy disks and CDs.
Calculating that people would be willing to pay a premium price for products that signal creativity, Mr. Jobs had a genius for understanding the needs of consumers before they did.
He knew best of all how to market. “Mac or PC?” became one of the defining questions of the late 20th century, and although Apple sold a mere 5 percent of all computers in that era, Mac users became rabid partisans.
Mr. Jobs was the first crossover technology star, turning Silicon Valley renown into Main Street recognition and paving the way for the rise of the nerds, such as Yahoo founders Jerry Yang and David Filo, and Google founders Larry Page and Sergey Brin.
As a 21-year-old college-dropout entrepreneur, Mr. Jobs led Apple to multimillion-dollar success in five years. Forced out of his own company by the time he was 30, he started another computer firm, Next, whose technology was used to create the World Wide Web. Mr. Jobs also took over a foundering computer animation company and turned it into the Academy Award-winning Pixar, maker of “Toy Story” and “Finding Nemo.” He returned to Apple in his 40s, restoring the company to profitability by paring down the product line and being a leader in innovation.
Known within the technology community for his complex and combative temperament, Mr. Jobs was a private man. But in a June 2005 commencement address at Stanford University, he talked about his pancreatic cancer, diagnosed in 2004, in a video that became an Internet sensation. He later became furious at speculation over his health in mid-2008, when he appeared in public looking gaunt. In late 2008, he took a medical leave from the company, and he had a liver transplant the following year.
In January, he took another medical leave. On Aug. 24 he stepped down as Apple’s chief executive but became chairman of the board. Apple’s share price immediately dropped 5 percent on the news because Jobs was so connected to the company, but it rebounded the next day. “Steve Jobs running the company from jail would be better for the stock price than Steve Jobs not being CEO,” one stock analyst told Fortune magazine in early 2011.
His innovations led Business 2.0 to call him “easily the greatest marketer since P.T. Barnum.” One of his employees, noting that Mr. Jobs was able to persuade people to believe almost anything, coined the phrase “reality distortion field” to describe his ability to warp an audience’s sense of proportion. Mr. Jobs described the Macintosh computer, for example, as “insanely great.”
Maybe it was. It was designed for the home and creative professional user, not the computer-science nerd or the bottom-line-oriented businessman. During a famous 1979 visit to Xerox PARC, the hotbed of innovation where the computer mouse, networking and the graphical user interface were invented, he and Apple co-founder Steve Wozniak learned that computer users did not have to type in a series of arcane commands to get the computer to perform; they could simply point at a picture of a file and click the mouse to open it.
That recognition sparked a flurry of innovations, unmatched in technology until designers of Microsoft’s operating software copied the look and feel of their California competitors in Windows 95.
Years later, discussing computer design in another context, Mr. Jobs said: “Most people make the mistake of thinking design is what it looks like. People think it’s this veneer, that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”
He could control how it works because Apple “makes the whole widget,” as Mr. Jobs repeatedly said — software and hardware. The company introduced monitors with color screens long before others. Locked out of many retail chains because of its small market share, Apple responded with its own distinctively branded stores, to which users flock like pilgrims. The Mac, Mr. Jobs saw, could become the hub of a digital lifestyle.
Not everything has worked out. A 1983 computer, Lisa, failed miserably. Even the “insanely great” Macintosh, sold without a letter-quality printer and incompatible with other computers, had a difficult start, although it nevertheless launched the desktop publishing revolution. But that wasn’t the first rough start in Mr. Jobs’s life.
Steven Paul Jobs was born Feb. 24, 1955, in San Francisco to unwed parents, University of Wisconsin graduate student Joanne Carole Schieble and a Syrian exchange student, Abdulfattah Jandali. He was adopted shortly after birth by Paul and Clara Jobs.
He grew up in the Northern California suburbs that would later be dubbed “Silicon Valley,” and he showed an early interest in electronics and gadgetry. As a high school student, he boldly asked William Hewlett, co-founder and president of the Hewlett-Packard computer firm, for some parts he needed to complete a class project. Hewlett was impressed enough to give Mr. Jobs the parts and offer him a summer internship at Hewlett-Packard.
Mr. Jobs attended Reed College in Portland, Ore., for six months before dropping out, although he continued to drop in on classes for 18 months. He worked part time at Atari to raise money for a trip to India in the summer of 1974, where he studied meditation and shaved his head. But within months, he became ill with dysentery and was forced to return to the United States.
For a short time, Mr. Jobs lived in a California commune, but he soon became disenchanted. In 1975, he began associating with a group of computer aficionados known as the Homebrew Computer Club. Wozniak, a technical whiz, was trying to build a small computer and Mr. Jobs became fascinated with its potential. In 1976, he and Wozniak formed their own company.
The Apple I was sold as an all-in-one device, unlike other computers that required customers to buy a screen and a keyboard separately. It carried a price tag of $666, and about 200 were sold.
Mr. Jobs saw a huge gap in the existing computer market, with no product targeted for home use. While Wozniak improved the technology, Mr. Jobs lined up investors and bank financing. The redesigned computer — christened the “Apple II” — came out in 1977, with impressive first-year sales of $2.7 million. In one of the most phenomenal cases of corporate growth in U.S. history, the company’s sales grew to $200 million within three years and almost single-handedly created a market of home users.
The Macintosh was introduced in 1984 with a third-quarter commercial during the Super Bowl. The advertisement, designed by ad agency Chiat/Day and directed by Ridley Scott, fresh off his classic science-fiction film “Blade Runner,” ran just once, but it was named by Advertising Age as the commercial of the decade.
As controversial as he was creative, Mr. Jobs enforced a culture of secrecy at Apple’s Cupertino, Calif., headquarters. Mr. Jobs, who grew up idolizing the Hewlett-Packard ideal of an egalitarian workplace where employees were highly valued, was known in his younger years for playing mean pranks on underlings, reversing direction without ever acknowledging that he had changed his mind and yelling even at company directors.
In his private life, he refused to acknowledge paternity or pay child support for his first daughter for years. He threatened to sue teens who published Apple gossip on their Web sites. He refused to put license plates on his Mercedes-Benz and parked in handicapped parking spots at the company until the public ridicule became too great.
Decades later, he attempted to stop the publication of two unauthorized biographies; he persuaded Vanity Fair to cancel the serialization of one, and after the publisher of the second book refused to stop publication, Mr. Jobs yanked all that publisher’s books from Apple’s retail stores.
In 1985, after tangling with John Sculley, the Pepsi executive he brought in to run the company, Mr. Jobs sold $20 million worth of stock and resigned from Apple. He and Wozniak had just received the National Medal of Technology from President Ronald Reagan, and he was 30 years old. “I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me,” he told Stanford University graduates in the 2005 commencement address. “The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.”
After several months of discontent, Mr. Jobs hired some of his former employees to begin a new computer company, called Next.
During that period, Mr. Jobs paid filmmaker George Lucas $10 million for a small computer animation firm, and over the next six years Mr. Jobs poured another $40 million of his own money into the company as it set out to make the first computer-animated feature film, “Toy Story.” It was a huge box-office hit, and Pixar's initial public stock offering was an enormous success.
In late 1996, Apple bought Next for more than $400 million. Within months, Mr. Jobs was back in charge of the company.
Greeted by the Mac faithful as the second coming, Mr. Jobs soon stunned them with two decisions. He revoked agreements with “clone makers,” companies that licensed and built their own Apple-compatible hardware. That cost bargain-seeking consumers many hundreds of dollars and put several successful clone makers out of business, but it returned much lost revenue to Apple.
Then, at a Macworld conference in August 1997, he announced a partnership with arch rival Microsoft, accepting a $150 million investment in exchange for preloading Microsoft’s Internet Explorer browser on Apple’s computers. An unprecedented alliance between rivals, the deal ultimately saved the company by reassuring customers that they could use Microsoft’s ubiquitous Office software on Macs. “We want to let go of this notion that for Apple to win, Microsoft has to lose,” Mr. Jobs said, to shouts of dismay from his normally adoring audience. “Madman at the wheel, eh?” he added, laughing, as he walked off the stage.
That was just the start of his revival of the company. The “Think Different” advertising campaign got the world’s attention again, followed by the 1998 introduction of the colorful iMac desktop computer, which sold a million in a year. In 1999, the iBook appeared, a brightly colored, clam-shaped laptop that enabled wireless Web surfing.
Consumer releases came out in a rush. An overlooked technology called FireWire, a tool for quickly moving large amounts of data between digital devices, allowed the creation of iMovie, software that encouraged non-experts to make their own home videos. The iPod, an MP3 player with room for thousands of songs, was introduced in October 2001 and has sold hundreds of millions of units, dominating the field. The iTunes Music Store, which allowed consumers to legally buy and download music, opened in April 2003 and revolutionized the digital music industry — more than 6 billion songs have been sold.
Mr. Jobs and Apple were suddenly cool again. The iPhone’s debut in 2007 generated a huge buzz, and it soon rolled over the competition from Palm Computing and BlackBerry, despite its higher cost. The iPad, a tablet-based computer introduced in January 2010, sold more than 10 million units its first year.
Although Mr. Jobs’s salary was $1 per year, his stock options made him rich. His fortune was estimated to be $5.4 billion by Forbes magazine in its 2008 survey of the richest people in the world.
But nothing came without controversy. Even as Apple’s stock was booming and its business thriving, Mr. Jobs faced scrutiny in a scandal about Apple’s backdating of stock options. Like many companies, Apple had given out stock options with effective dates chosen later, after it was known that the price was low on those dates, making the options more valuable when they were eventually exercised. Mr. Jobs wasn’t the only employee who benefited, but the company said his options were approved at a special board meeting that never took place. The company was forced to restate its earnings, but a special company investigation concluded that Mr. Jobs had done nothing wrong.

Survivors include his wife since 1991, Laurene Powell; a daughter from a previous relationship, Lisa Brennan-Jobs; three children from his marriage, Eve Jobs, Erin Sienna Jobs and Reed Paul Jobs; and two sisters, Mona Simpson and Patti Jobs.
Steve Jobs, Apple’s Visionary, Dies at 56
By JOHN MARKOFF
Steven P. Jobs, the visionary co-founder of Apple who helped usher in the era of personal computers and then led a cultural transformation in the way music, movies and mobile communications were experienced in the digital age, died Wednesday. He was 56.
The death was announced by Apple, the company Mr. Jobs and his high school friend Stephen Wozniak started in 1976 in a suburban California garage.
A friend of the family said that Mr. Jobs died of complications from pancreatic cancer, a disease against which he had waged a long and public struggle, remaining the face of the company even as he underwent treatment. He continued to introduce new products for a global market in his trademark blue jeans even as he grew gaunt and frail.
He underwent surgery in 2004, received a liver transplant in 2009 and took three medical leaves of absence as Apple’s chief executive before stepping down in August and turning over the helm to Timothy D. Cook, the chief operating officer. When he left, he was still engaged in the company’s affairs, negotiating with another Silicon Valley executive only weeks earlier.
“I have always said that if there ever came a day when I could no longer meet my duties and expectations as Apple’s C.E.O., I would be the first to let you know,” Mr. Jobs said in a letter released by the company. “Unfortunately, that day has come.”
By then, having mastered digital technology and capitalized on his intuitive marketing sense, Mr. Jobs had largely come to define the personal computer industry and an array of digital consumer and entertainment businesses centered on the Internet. He had also become a very rich man, worth an estimated $8.3 billion.
Tributes to Mr. Jobs flowed quickly on Wednesday evening, in formal statements and in the flow of social networks, with President Obama, technology industry leaders and legions of Apple fans weighing in.
“For those of us lucky enough to get to work with Steve, it’s been an insanely great honor,” said Bill Gates, the Microsoft co-founder. “I will miss Steve immensely.”
A Twitter user named Matt Galligan wrote: “R.I.P. Steve Jobs. You touched an ugly world of technology and made it beautiful.”
Eight years after founding Apple, Mr. Jobs led the team that designed the Macintosh computer, a breakthrough in making personal computers easier to use. After a 12-year separation from the company, prompted by a bitter falling-out with his chief executive, John Sculley, he returned in 1997 to oversee the creation of one innovative digital device after another — the iPod, the iPhone and the iPad. These transformed not only product categories like music players and cellphones but also entire industries, like music and mobile communications.
During his years outside Apple, he bought a tiny computer graphics spinoff from the director George Lucas and built a team of computer scientists, artists and animators that became Pixar Animation Studios.
Starting with “Toy Story” in 1995, Pixar produced a string of hit movies, won several Academy Awards for artistic and technological excellence, and made the full-length computer-animated film a mainstream art form enjoyed by children and adults worldwide.
Mr. Jobs was neither a hardware engineer nor a software programmer, nor did he think of himself as a manager. He considered himself a technology leader, choosing the best people possible, encouraging and prodding them, and making the final call on product design.
It was an executive style that had evolved. In his early years at Apple, his meddling in tiny details maddened colleagues, and his criticism could be caustic and even humiliating. But he grew to elicit extraordinary loyalty.
“He was the most passionate leader one could hope for, a motivating force without parallel,” wrote Steven Levy, author of the 1994 book “Insanely Great,” which chronicles the creation of the Mac. “Tom Sawyer could have picked up tricks from Steve Jobs.”
“Toy Story,” for example, took four years to make while Pixar struggled, yet Mr. Jobs never let up on his colleagues. “You need a lot more than vision — you need a stubbornness, tenacity, belief and patience to stay the course,” said Edwin Catmull, a computer scientist and a co-founder of Pixar. “In Steve’s case, he pushes right to the edge, to try to make the next big step forward.”
Mr. Jobs was the ultimate arbiter of Apple products, and his standards were exacting. Over the course of a year he tossed out two iPhone prototypes, for example, before approving the third, and began shipping it in June 2007.
To his understanding of technology he brought an immersion in popular culture. In his 20s, he dated Joan Baez; Ella Fitzgerald sang at his 30th birthday party. His worldview was shaped by the ’60s counterculture in the San Francisco Bay Area, where he had grown up, the adopted son of a Silicon Valley machinist. When he graduated from high school in Cupertino in 1972, he said, “the very strong scent of the 1960s was still there.”
After dropping out of Reed College, a stronghold of liberal thought in Portland, Ore., in 1972, Mr. Jobs led a countercultural lifestyle himself. He told a reporter that taking LSD was one of the two or three most important things he had done in his life. He said there were things about him that people who had not tried psychedelics — even people who knew him well, including his wife — could never understand.
Decades later he flew around the world in his own corporate jet, but he maintained emotional ties to the period in which he grew up. He often felt like an outsider in the corporate world, he said. When discussing Silicon Valley’s lasting contributions to humanity, he mentioned in the same breath the invention of the microchip and “The Whole Earth Catalog,” a 1960s counterculture publication.
Apple’s very name reflected his unconventionality. In an era when engineers and hobbyists tended to describe their machines with model numbers, he chose the name of a fruit, supposedly because of his dietary habits at the time.
Coming on the scene just as computing began to move beyond the walls of research laboratories and corporations in the 1970s, Mr. Jobs saw that computing was becoming personal — that it could do more than crunch numbers and solve scientific and business problems — and that it could even be a force for social and economic change. And at a time when hobbyist computers were boxy wooden affairs with metal chassis, he designed the Apple II as a sleek, low-slung plastic package intended for the den or the kitchen. He was offering not just products but a digital lifestyle.
He put much stock in the notion of “taste,” a word he used frequently. It was a sensibility that shone in products that looked like works of art and delighted users. Great products, he said, were a triumph of taste, of “trying to expose yourself to the best things humans have done and then trying to bring those things into what you are doing.”
Regis McKenna, a longtime Silicon Valley marketing executive to whom Mr. Jobs turned in the late 1970s to help shape the Apple brand, said Mr. Jobs’s genius lay in his ability to simplify complex, highly engineered products, “to strip away the excess layers of business, design and innovation until only the simple, elegant reality remained.”
Mr. Jobs’s own research and intuition, not focus groups, were his guide. When asked what market research went into the iPad, Mr. Jobs replied: “None. It’s not the consumers’ job to know what they want.”
Early Interests
Steven Paul Jobs was born in San Francisco on Feb. 24, 1955, and surrendered for adoption by his biological parents, Joanne Carole Schieble and Abdulfattah Jandali, a graduate student from Syria who became a political science professor. He was adopted by Paul and Clara Jobs.
The elder Mr. Jobs, who worked in finance and real estate before returning to his original trade as a machinist, moved his family down the San Francisco Peninsula to Mountain View and then to Los Altos in the 1960s.
Mr. Jobs developed an early interest in electronics. He was mentored by a neighbor, an electronics hobbyist, who built Heathkit do-it-yourself electronics projects. He was brash from an early age. As an eighth grader, after discovering that a crucial part was missing from a frequency counter he was assembling, he telephoned William Hewlett, the co-founder of Hewlett-Packard. Mr. Hewlett spoke with the boy for 20 minutes, prepared a bag of parts for him to pick up and offered him a job as a summer intern.
Mr. Jobs met Mr. Wozniak while attending Homestead High School in neighboring Cupertino. The two took an introductory electronics class there.
The spark that ignited their partnership was provided by Mr. Wozniak’s mother. Mr. Wozniak had graduated from high school and enrolled at the University of California, Berkeley, when she sent him an article from the October 1971 issue of Esquire magazine. The article, “Secrets of the Little Blue Box,” by Ron Rosenbaum, detailed an underground hobbyist culture of young men known as phone phreaks who were illicitly exploring the nation’s phone system.
Mr. Wozniak shared the article with Mr. Jobs, and the two set out to track down an elusive figure identified in the article as Captain Crunch. The man had taken the name from his discovery that a whistle that came in boxes of Cap’n Crunch cereal was tuned to a frequency that made it possible to make free long-distance calls simply by blowing the whistle next to a phone handset.
Captain Crunch was John Draper, a former Air Force electronic technician, and finding him took several weeks. Learning that the two young hobbyists were searching for him, Mr. Draper appeared one day in Mr. Wozniak’s Berkeley dormitory room. Mr. Jobs, who was still in high school, had traveled to Berkeley for the meeting. When Mr. Draper arrived, he entered the room saying simply, “It is I!”
Based on information they gleaned from Mr. Draper, Mr. Wozniak and Mr. Jobs later collaborated on building and selling blue boxes, devices that were widely used for making free — and illegal — phone calls. They raised a total of $6,000 from the effort.
Mr. Jobs enrolled at Reed College in 1972 but left after one semester, remaining in Portland for another 18 months to audit classes. In a commencement address given at Stanford in 2005, he said he had decided to leave college because it was consuming all of his parents’ savings.
Leaving school, however, also freed his curiosity to follow his interests. “I didn’t have a dorm room,” he said in his Stanford speech, “so I slept on the floor in friends’ rooms, I returned Coke bottles for the 5-cent deposits to buy food with, and I would walk the seven miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on.”
He returned to Silicon Valley in 1974 and took a job there as a technician at Atari, the video game manufacturer. Still searching for his calling, he left after several months and traveled to India with a college friend, Daniel Kottke, who would later become an early Apple employee. Mr. Jobs returned to Atari that fall. In 1975, he and Mr. Wozniak, then working as an engineer at H.P., began attending meetings of the Homebrew Computer Club, a hobbyist group that met at the Stanford Linear Accelerator Center in Menlo Park, Calif. Personal computing had been pioneered at research laboratories adjacent to Stanford, and it was spreading to the outside world.
“What I remember is how intense he looked,” said Lee Felsenstein, a computer designer who was a Homebrew member. “He was everywhere, and he seemed to be trying to hear everything people had to say.”
Mr. Wozniak designed the original Apple I computer simply to show it off to his friends at the Homebrew. It was Mr. Jobs who had the inspiration that it could be a commercial product.
In early 1976, he and Mr. Wozniak, using their own money, began Apple with an initial investment of $1,300; they later gained the backing of a former Intel executive, A. C. Markkula, who lent them $250,000. Mr. Wozniak would be the technical half and Mr. Jobs the marketing half of the partnership behind the original Apple I computer. Starting out in the Jobs family garage in Los Altos, they moved the company to a small office in Cupertino shortly thereafter.
In April 1977, Mr. Jobs and Mr. Wozniak introduced the Apple II at the West Coast Computer Faire in San Francisco. It created a sensation. Faced with a gaggle of small and large competitors in the emerging computer market, Apple, with its Apple II, had figured out a way to straddle the business and consumer markets by building a computer that could be customized for specific applications.
Sales skyrocketed, from $2 million in 1977 to $600 million in 1981, the year the company went public. By 1983 Apple was in the Fortune 500. No company had ever joined the list so quickly.
The Apple III, introduced in May 1980, was intended to dominate the desktop computer market. I.B.M. would not introduce its original personal computer until 1981. But the Apple III had a host of technical problems, and Mr. Jobs shifted his focus to a new and ultimately short-lived project, an office workstation computer code-named Lisa.
An Apocalyptic Moment
By then Mr. Jobs had made his much-chronicled 1979 visit to Xerox’s research center in Palo Alto, where he saw the Alto, an experimental personal computer system that foreshadowed modern desktop computing. The Alto, controlled by a mouse pointing device, was one of the first computers to employ a graphical video display, which presented the user with a view of documents and programs, adopting the metaphor of an office desktop.
“It was one of those sort of apocalyptic moments,” Mr. Jobs said of his visit in a 1995 oral history interview for the Smithsonian Institution. “I remember within 10 minutes of seeing the graphical user interface stuff, just knowing that every computer would work this way someday. It was so obvious once you saw it. It didn’t require tremendous intellect. It was so clear.”
In 1981 he joined a small group of Apple engineers pursuing a separate project, a lower-cost system code-named Macintosh. The machine was introduced in January 1984 and trumpeted during the Super Bowl telecast by a 60-second commercial, directed by Ridley Scott, that linked I.B.M., by then the dominant PC maker, with Orwell’s Big Brother.
A year earlier Mr. Jobs had lured Mr. Sculley to Apple to be its chief executive. A former Pepsi-Cola chief executive, Mr. Sculley was impressed by Mr. Jobs’s pitch: “Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?”
He went on to help Mr. Jobs introduce a number of new computer models, including an advanced version of the Apple II and later the Lisa and Macintosh desktop computers. Through them Mr. Jobs popularized the graphical user interface, which, based on a mouse pointing device, would become the standard way to control computers.
But when the Lisa failed commercially and early Macintosh sales proved disappointing, the two men became estranged and a power struggle ensued. The board ultimately stripped Mr. Jobs of his operational role, taking control of the Lisa project away from him, and 1,200 Apple employees were laid off. He left Apple in 1985.
“I don’t wear the right kind of pants to run this company,” he told a small gathering of Apple employees before he left, according to a member of the original Macintosh development team. He was barefoot as he spoke, and wearing blue jeans.
That September he announced a new venture, NeXT Inc. The aim was to build a workstation computer for the higher-education market. The next year, the Texas industrialist H. Ross Perot invested $20 million in the effort. But it did not achieve Mr. Jobs’s goals.
Mr. Jobs also established a personal philanthropic foundation after leaving Apple but soon had a change of heart, deciding instead to spend much of his fortune — $10 million — on acquiring Pixar, a struggling graphics supercomputing company owned by the filmmaker George Lucas.
The purchase was a significant gamble; there was little market at the time for computer-animated movies. But that changed in 1995, when the company, with Walt Disney Pictures, released “Toy Story.” That film’s box-office receipts ultimately reached $362 million, and when Pixar went public in a record-breaking offering, Mr. Jobs emerged a billionaire. In 2006, the Walt Disney Company agreed to purchase Pixar for $7.4 billion. The sale made Mr. Jobs Disney’s largest single shareholder, with about 7 percent of the company’s stock.
His personal life also became more public. He had a number of well-publicized romantic relationships, including one with the folk singer Joan Baez, before marrying Laurene Powell. In 1996, a sister, the novelist Mona Simpson, threw a spotlight on her relationship with Mr. Jobs in the novel “A Regular Guy.” The two did not meet until they were adults. The novel centered on a Silicon Valley entrepreneur who bore a close resemblance to Mr. Jobs. It was not an entirely flattering portrait. Mr. Jobs said about a quarter of it was accurate.
“We’re family,” he said of Ms. Simpson in an interview with The New York Times Magazine. “She’s one of my best friends in the world. I call her and talk to her every couple of days.”
His wife and Ms. Simpson survive him, as do his three children with Ms. Powell, his daughters Eve Jobs and Erin Sienna Jobs and a son, Reed; another daughter, Lisa Brennan-Jobs, from a relationship with Chrisann Brennan; and another sister, Patti Jobs.
Return to Apple
Beginning in 1986, Mr. Jobs refocused NeXT from the education to the business market and dropped the hardware part of the company, deciding to sell just an operating system. Although NeXT never became a significant computer industry player, it had a huge impact: a young programmer, Tim Berners-Lee, used a NeXT machine to develop the first version of the World Wide Web at the Swiss physics research center CERN in 1990.
In 1996, after unsuccessful efforts to develop next-generation operating systems, Apple, with Gilbert Amelio now in command, acquired NeXT for $430 million. The next year, Mr. Jobs returned to Apple as an adviser. He became chief executive again in 2000.
Shortly after returning, Mr. Jobs publicly ended Apple’s long feud with its archrival Microsoft, which agreed to continue developing its Office software for the Macintosh and invested $150 million in Apple.
Once in control of Apple again, Mr. Jobs set out to reshape the consumer electronics industry. He pushed the company into the digital music business, introducing first iTunes and then the iPod MP3 player. The music arm grew rapidly, reaching almost 50 percent of the company’s revenue by June 2008.
In 2005, Mr. Jobs announced that he would end Apple’s business relationship with I.B.M. and Motorola and build Macintosh computers based on Intel microprocessors.
By then his fight with cancer was publicly known. Apple had announced in 2004 that Mr. Jobs had a rare but curable form of pancreatic cancer and that he had undergone successful surgery. Four years later, questions about his health returned when he appeared at a company event looking gaunt. Afterward, he said he had suffered from a “common bug.” Privately, he said his cancer surgery had created digestive problems but insisted they were not life-threatening.
Apple began selling the iPhone in June 2007. Mr. Jobs’s goal was to sell 10 million of the handsets in 2008, equivalent to 1 percent of the global cellphone market. The company sold 11.6 million.
Although smartphones were already commonplace, the iPhone dispensed with a stylus and pioneered a touch-screen interface that quickly set the standard for the mobile computing market. Rolled out with much anticipation and fanfare, the iPhone rocketed to popularity; by the end of 2010 the company had sold almost 90 million units.
Although Mr. Jobs took just a nominal $1 salary when he returned to Apple, his compensation became the source of a Silicon Valley scandal in 2006 over the backdating of millions of shares of stock options. But after a company investigation and one by the Securities and Exchange Commission, he was found not to have benefited financially from the backdating and no charges were brought.
The episode did little to taint Mr. Jobs’s standing in the business and technology world. As the gravity of his illness became known, and particularly after he announced he was stepping down, he was increasingly hailed for his genius and true achievement: his ability to blend product design and business market innovation by integrating consumer-oriented software, microelectronic components, industrial design and new business strategies in a way that has not been matched.
If he had a motto, it may have come from “The Whole Earth Catalog,” which he said had deeply influenced him as a young man. The book, he said in his commencement address at Stanford in 2005, ends with the admonition “Stay Hungry. Stay Foolish.”
“I have always wished that for myself,” he said.
This article has been revised to reflect the following correction:
Correction: October 5, 2011
An earlier version of this obituary incorrectly identified the city where Mr. Jobs graduated from high school. It was Cupertino, not Los Altos.
Wednesday, Oct. 05, 2011
Steve Jobs, 1955-2011: Mourning Technology's Great Reinventor
Steve Jobs, whose death was announced on Wednesday night, wasn't a computer scientist. He had no training as a hardware engineer or an industrial designer. The businesses that Apple entered under his leadership — from personal computers to MP3 players to smartphones — all existed before the company got there.
But with astonishing regularity, Jobs did something that few people accomplish even once: he reinvented entire industries. He did it with ones that were new, like PCs, and he did it with ones that were old, such as music. And his pace only accelerated over the years.
He was the most celebrated, successful business executive of his generation, yet he flouted many basic tenets of business wisdom. (Like his hero and soulmate, Polaroid founder Edwin Land, he refused to conduct focus groups and other research that might tell him what his customers wanted.) In his many public appearances as the head of a large public corporation, he rarely sounded like one. He introduced the first Macintosh by quoting Bob Dylan, and took to saying that Apple sat "at the intersection of the liberal arts and technology."
Jobs' confidence in the wisdom of his own instincts came to be immense, as did the hype he created at Apple product launches. That might have been unbearable if it weren't for the fact that his intuition was nearly flawless and the products often lived up to his lofty claims. St. Louis Cardinals pitching great Dizzy Dean could have been talking about Jobs rather than himself when he said "It ain't bragging if you can back it up."
Jobs' eventual triumph was so absolute — in 2011, Apple's market capitalization passed that of Exxon Mobil, making it the planet's most valuable company — that it's easy to forget how checkered his reputation once was. Over the first quarter-century of his career, he was associated with as many failed products as hits. Having been forced out of Apple in 1985, he was associated with failure, period. Even some of his admirers thought of him as the dreamer who'd lost the war for PC dominance with Microsoft's indomitable Bill Gates.
Until the iPod era, it seemed entirely possible that Jobs' most lasting legacy might be the blockbuster animated features produced by Pixar, the company which he founded after acquiring George Lucas's computer-graphics lab in 1986. Instead, Pixar turned out to be, in Jobs' famous phrase, just one more thing.
Born in 1955 in San Francisco to an unmarried graduate student and adopted at birth by Paul and Clara Jobs, Steven Paul Jobs grew up in Silicon Valley just as it was becoming Silicon Valley. It proved to be a lucky break for everyone concerned.
He was only 21 when he started Apple — officially formed on April Fool's Day 1976 — with his buddy Steve "Woz" Wozniak, a self-taught engineer of rare talents. (A third founder, Ron Wayne, chickened out after less than two weeks.)
But Jobs had already done a lot of living, all of which influenced the company he built. He'd spent one unhappy semester at Portland's Reed College and 18 happy months of "dropping in" on Reed classes as he saw fit. He'd found brief employment in low-level jobs at Silicon Valley icons HP and Atari. He'd taken a spiritual journey to India, and dabbled with both psychedelic drugs and primal scream therapy.
Woz wanted to build computers to please himself. Steve wanted to sell them to make money. Their first creation, the Apple I, was mostly a warm-up act for 1977's Apple II. The insides of the II were the product of Woz's technical genius, but much about it — from its emphasis on ease of use to its stylish case design — reflected Steve Jobs' instincts in their earliest form. In an era when most computers still looked like nerdy scientific equipment, it was a consumer electronics device — and a bestseller.
In 1981, Woz crashed his V-tail Beechcraft and spent months recuperating, returning to Apple only in nominal fashion thereafter. From then on, Jobs was the Steve who shaped Apple's destiny. In 1979, he visited Xerox's PARC research lab in Palo Alto, California, and was dazzled by what he saw there, including an experimental computer with a graphical user interface and a mouse. "Within 10 minutes... it was clear to me that all computers would work this way someday," he later said.
At Apple, PARC's ideas showed up first in the Lisa, a $10,000 computer that flopped. They then reappeared in improved form in 1984's Macintosh, the creation of a dream team of gifted young software and hardware wizards led by Jobs. Launched with an unforgettable Super Bowl commercial that represented the IBM PC status quo as an Orwellian dystopia, the $2,495 Mac was by far the most advanced personal computer released up until then. Jobs said that it was "insanely great," a bit of self-praise that became forever associated with him and with Apple, even though he retired that particular phrase soon thereafter.
The Mac was insanely great — but it was also deeply flawed. The original version had a skimpy 128KB of memory and no expansion slots; computing pioneer Alan Kay, who worked at Apple at the time, ticked off Jobs by calling it "a Honda with a one-gallon gas tank." In a pattern Jobs would repeat frequently in the years to come, he had given people things they didn't know they needed while denying them — at least temporarily — ones they knew they wanted.
Just as Jobs intuitively understood, PARC's ideas would have ended up on every computer whether or not the Mac had ever existed. But there's no question that he accelerated the process through sheer force of will.
"He wanted you to be great, and he wanted you to create something that was great," said computer scientist Larry Tesler, an Apple veteran, in the PBS documentary Triumph of the Nerds. "And he was going to make you do that." Whether Jobs was coaxing breakthroughs out of his employees or selling a new product to consumers, his pitches had a mesmerizing quality. Mac software architect Bud Tribble gave it the name it would be forever known by: the Reality Distortion Field. (See photos of the unveiling of Apple's tablet, the iPad.)
Jobs may have been inspiring, but he was also a high-maintenance coworker. He dismissed people who didn't impress him — and they were legion, inside and outside of Apple — as bozos. He was not a master of deadlines. He tormented hapless job candidates, and occasionally cried at work. And he was profoundly autocratic. (Jef Raskin, the originator of the Macintosh project, said that Jobs "would have made an excellent king of France.")
Among the people whose buttons he increasingly pushed was Apple's president, John Sculley, the man whom he had famously berated into joining the company with the question "Do you want to sell sugared water for the rest of your life, or do you want to come with me and change the world?" Frustrated with Jobs' management of the Macintosh division and empowered by the Mac's sluggish sales, Sculley and Apple's board stripped him of all power to make decisions in June of 1985. In September, Jobs resigned.
Decades later, the notion of Apple deciding that it would be better off without Steve Jobs is as unfathomable as it would have been if Walt Disney Productions had sacked Walt Disney. In 1985, though, plenty of people thought it was a fabulous idea. "I think Apple is making the transition from one phase of its life to the next," an unnamed, overly optimistic Apple employee told InfoWorld magazine. "I don't know that the image of a leader clad in a bow tie, jeans, and suspenders would help us survive in the coming years."
Using his Apple millions and funding from Ross Perot and Canon, Jobs founded NeXT, a computer company that was even more Jobslike than Apple had been. Built in a state-of-the-art factory and sporting a logo by legendary designer Paul Rand, the NeXT system was a sleek black cube packed with innovations. Unfortunately, it was aimed at a market that turned out not to exist: academic types who could afford its $6,500 price tag. After selling only 50,000 systems, NeXT refocused on software.
For a while, Jobs' second post-Apple venture, Pixar, also looked like a disappointment. Its $135,000 image-processing computer was a tough sell, and Jobs kept the company alive by pumping additional funds into it. As a sideline, however, it made computer-generated cartoons that started winning Oscars. In 1995, Disney released Pixar's first feature, Toy Story; when it became the year's top-grossing movie, it gave Jobs his first unqualified success in a decade. (By the time he sold Pixar to Disney for $7.4 billion in 2006, his career had reached such dizzying heights that the deal was merely a delightful footnote.)
Jobs later called the NeXT/Pixar years "one of the most creative periods of my life," and said that his dismissal from Apple had been "awful tasting medicine, but I guess the patient needed it." It was also the time when he went from high-profile bachelorhood — he had fathered a daughter out of wedlock and dated Joan Baez — to family man. He married Laurene Powell in 1991; by 1998, they were the parents of a son and two daughters.
Meanwhile, Apple sans Jobs was failing on an epic scale. John Sculley had given way to a vision-free German Apple executive named Michael Spindler, who was replaced by Gil Amelio, a veteran of the computer-chip industry who was spectacularly unsuited to run Apple. He presided over $1.8 billion in losses in Apple's 1996 and 1997 fiscal years, and failed to sell the company to interested white knights IBM and Sun Microsystems. The possibility of Apple running out of cash and ceasing to exist was not unthinkable.
Amelio did make one smart move during his 500 days at Apple. Just before Christmas of 1996, he paid $430 million to buy NeXT, thinking that its software could serve as the foundation of a next-generation Mac operating system. It would. (Every operating system that Apple created from 2001 onwards, including the one on the iPhone and iPad, is a direct descendant.)
NeXT's software came with a bonus: Steve Jobs. In a touching sign of naivete, Amelio apparently thought that Jobs would cheerfully serve as a figurehead for the company he had co-founded. Instead, six months after the merger, Jobs orchestrated Amelio's ouster and accepted the position of interim CEO — iCEO for short — splitting time with his Pixar duties. "I'm here almost every day," he told TIME in 1997, "but just for the next few months. I'm really clear on it." He finally ditched the "i" in iCEO in 2000.
Steve Jobs' return cheered up beleaguered Apple fans, but few industry watchers expected miracles. "[T]he odds aren't good that he can do more than slow the fall, perhaps giving Apple a few more years before it is either gobbled up by a bigger company or finally runs out of customers," wrote Jim Carlton in 1998 when he updated his 1997 book Apple: The Inside Story of Intrigue, Egomania, and Business Blunders to reflect Jobs' comeback.
During his first months back at Apple, Jobs dumped board members, cut staff, slashed costs, killed dozens of products and accepted a $150 million lifeline from perennial bête noire Microsoft. (When Bill Gates made a remote guest appearance at the 1997 Macworld Expo keynote, looming on a video screen over Jobs, the audience booed.)
Jobs rolled out an advertising campaign — "Think Different" — that got people talking about the company again. And he presided over the release of the striking all-in-one iMac, which came in a translucent case crafted by Jonathan Ive, the British industrial designer who would be responsible for every major Apple product to come. In 1998, it became the best-selling computer in America.
Little by little, Jobs started acting less like a turnaround artist and more like a man who wanted, once again, to change the world. "Victory in our industry is spelled survival," he told TIME in 2001, when Apple was still on the rebound. "The way we're going to survive is to innovate our way out of this."
In May of that year, Apple had opened retail locations in McLean, Virginia and Glendale, California, the first of hundreds it would build. Big-box merchants rarely did a good job of explaining to consumers why they should choose a Mac over a cheaper Windows computer; now Apple could do the job itself, in the world's least cluttered, most tasteful computer stores.
The single most important moment in Apple and Jobs' redemption came six weeks after the 9/11 attacks. At a relatively low-key press event at Apple's Cupertino, California headquarters, Jobs explained that the company had decided to get into the MP3 player business. Then he pulled the first iPod out of his pocket. All of a sudden, Apple was a consumer-electronics company.
Soon, it was an exceptionally successful consumer-electronics company. The iPod wasn't much more than a tiny hard drive with a headphone jack and slick software, but it became a cultural touchstone, especially after Apple made it work with Windows PCs as well as Macs. Even its white earbuds became iconic. iPods gained the lion's share of the media-player market, and never lost it.
At first, iPod owners got music by ripping their own CDs or "sharing" tracks from peer-to-peer networks such as Kazaa. Apple, seeing a need for a simple, legal source of music, introduced the iTunes Music Store in 2003. Unlike earlier music services, iTunes offered a proposition of Jobsian elegant simplicity: songs were 99 cents apiece, and you could play them on up to three devices and burn them to CD. Music companies weren't thrilled — they would have preferred higher prices and more restrictions — but consumers bought a million songs in the first week, and by 2008 they had purchased four billion of them.
Five years after Apple entered the music business, it had surpassed Wal-Mart to become the U.S.'s largest music retailer. By that time, iPods had screens capable of displaying video, and Jobs' company was a major distributor of movies and TV shows as well.
As important as the iPod was, it was ultimately just a high-tech Walkman. The iPhone, unveiled at a Macworld Expo keynote in 2007, was something far more: a powerful personal computer that happened to fit in your pocket. "Every once in a while, a revolutionary product comes along that changes everything," Jobs said in introducing it, in a statement that — unlike some of the claims he'd been known to make at keynotes — turned out to be factual rather than fluffy. It instantly made every other smartphone on the market look like an antique.
For Jobs, it was a do-over: a chance to prevail in the PC wars that Microsoft had won the first time around. Typically, he responded not by aping the strategy that had worked so well for Microsoft, but by being even more like Steve Jobs. Like the first Mac, the first iPhone had obvious deficiencies. For instance, it shipped with a poky 2G wireless connection just as 3G was becoming pervasive. But its software was so radically better than anything anyone had ever seen that it didn't really matter.
In 2008, Apple introduced the App Store, which seamlessly delivered programs created by third-party developers to iPhones, giving Apple a 30% cut of all developer revenue along the way. The App Store was the only authorized way to get programs onto an iPhone; Apple regularly rejected programs that it deemed unsafe, offensive or disturbingly competitive with its own efforts. And yet the iPhone ended up with both the most apps and the best apps, making it hard to argue that Jobs' tight control had stifled the creativity of app developers.
The iPhone had serious competition, especially from handsets that used Google's Android operating system. But the iPhone ecosystem — phone plus apps, movies and music delivered through Apple services — contributed to Apple's success in a way no other company could match. By 2011, it was selling more than 220,000 iPhones a day and, according to one analyst, capturing two-thirds of the industry's profits.
In 2010, Apple followed up the iPhone with the iPad, its first effort in a category — tablet computers — that had existed for two decades without a single hit product. Apple sold 14.8 million iPads in 2010, a number that dwarfed the predictions of Wall Street analysts. (It also flummoxed competitors, who rushed into the market with iPad competitors that were far less appealing, and sometimes much more expensive, than the real thing.) By then, it wasn't surprising that Steve Jobs surpassed almost everyone's expectations; it would have been more startling if he hadn't.
By then, Apple's business model had come to bear little resemblance to that of other computer makers. The rest of the industry was deeply decentralized: a consumer went to Best Buy to purchase an Acer computer running Microsoft software, and then used it with Rhapsody's music service and a SanDisk MP3 player. Tech support was typically outsourced to some nameless firm halfway around the world.
Apple had long ago stopped building its own stuff — one of its contract manufacturers, China's Foxconn, earned its own measure of celebrity — but otherwise, it controlled what Steve Jobs called "the whole widget." It wrote its own software, designed its own hardware and delivered services such as iTunes. It sold Macs, iPods and other products at its own stores, where face-to-face support was available for free at a "Genius Bar." Once you owned an Apple device, you filled it with movies, music, and apps from Apple's online stores. The company even started designing its own processors for the iPhone and iPad. In short, it came as close as it possibly could to fulfilling the Jobs vision down to the last detail.
Jobs remained the difficult, demanding, sometimes unreasonable perfectionist that Apple thought had been dispensable a dozen years earlier. But the NeXT and Pixar experiences had instilled in him a new discipline. He still pushed boundaries, but in ways that more consistently worked in Apple's favor. And working with Chief Operating Officer Tim Cook, later to succeed him as CEO, he had turned the company into a wildly profitable exemplar of efficiency.
More than any major Silicon Valley company, Apple kept its secrets secret until it was ready to talk about them; countless articles about the company included the words "A spokesperson for Apple declined to comment." It wasn't able to stamp out all rumors, and in 2010, gadget blog Gizmodo got its hands on an unreleased iPhone 4 that an Apple engineer had left at a beer garden in Silicon Valley. Even if Apple detested such leaks, they became part of its publicity machine.
Minimalism came to typify Jobs' product-launch presentations in San Francisco and at Apple headquarters as much as the products themselves. Jobs 1.0 was known for his bow tie and other foppish affectations. Jobs 2.0 had one uniform — a black mock turtleneck, Levi 501 jeans and New Balance 992 sneakers. With kabuki-like consistency, his keynotes followed a set format: a financial update with impressive numbers, one or more demos, pricing information and "one more thing." Even the compliments he paid to Apple products ("Pretty cool, huh?") rarely changed much.
He generated hoopla with such apparent effortlessness that many people concluded he was more P.T. Barnum than Thomas Edison. "Depending on whom one talks to," Playboy said, "Jobs is a visionary who changed the world for the better or an opportunist whose marketing skill made for an incredible commercial success." It published those words in the introduction to a 1985 Jobs interview, but they could have been written last week.
Still, even Jobs' detractors tended to think of him and his company as a single entity. Apple was demonstrably full of talented employees in an array of disciplines, but Jobs' reputation for sweeping micromanagement was so legendary that nobody who admired the company and its products wanted to contemplate what it might be like without him. Shareholders were even more jittery about that prospect: A stock-option backdating scandal that might have destroyed a garden-variety CEO barely dented his reputation.
Increasingly, though, the world was forced to confront the idea of a Jobs-free Apple. In 2004, he was diagnosed with pancreatic cancer and told he had months to live; further investigation showed it was a rare form of the disease that could be controlled. Jobs turned day-to-day control of Apple over to Chief Operating Officer Tim Cook, underwent surgery, recovered, and came back. During a 2009 medical leave, he received a liver transplant. He went on another medical leave in 2011 that became permanent when he resigned as CEO on Aug. 24, assuming Apple's chairmanship and handing off CEO duties to Cook.
Jobs was so obviously fundamental to Apple's success that many feared the company's amazing run would end the moment he was no longer calling every shot. Instead, Apple prospered during the period of his illnesses and absences. By 2011, the vast majority of its revenues came from products that hadn't existed when Jobs took his first medical leave. He had accomplished one of his most astounding feats: Teaching an entire company to think like Steve Jobs.
Always happier praising a new Apple product than talking about his private life, Jobs said little about his struggles with ill health. He did, however, address them briefly in the Stanford commencement speech he gave in 2005. And as commencement speakers are supposed to do, he gave the students — most of whom were about the same age he was when he co-founded Apple — some advice. "Your time is limited, so don't waste it living someone else's life," he said, sounding like the very thought of living someone else's life infuriated him. "Don't be trapped by dogma, which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary."
Steve Jobs' heart and intuition knew what he wanted out of life — and his ambitions took him, and us, to extraordinary places. It's impossible to imagine what the last few decades of technology, business, and, yes, the liberal arts would have been like without him.
Friday, Oct. 7, 2011
Steve Jobs: Remembering the Dissatisfied Man
Many famous Steve Jobs moments involve him speaking before enraptured audiences. One of my most vivid Jobs memories is of the time I saw him standing quietly at the back of one.
I was attending the SIGGRAPH computer graphics conference in Boston in 1989, and was trapped in a throng on the show floor watching one of Pixar's short cartoons. (The company hadn't started making feature-length ones, and wasn't yet a cherished icon of American entertainment.) The person next to me happened to be Jobs, who had bought George Lucas's graphics division in 1985 and bootstrapped it into independent existence as Pixar.
I'm not sure if anyone else noticed him watching the movie and observing the crowd — they were too busy being entertained. He beamed with quiet satisfaction, wearing the same cat-like grin he did at the launch of the original Macintosh in 1984. In recent years, as I attended Apple product launches where Jobs was the center of attention, I saw it repeatedly.
And yet, "satisfaction" is nowhere on the list of words that leap to mind as I think about Jobs' life and career. He was famously dissatisfied — with products under development, with the people who reported to him, with Apple's competitors and partners, with the way the technology industry worked, with life in general.
I found that smile of contentment more compelling than the praise ("insanely great!") that Jobs routinely bestowed on Apple products. It looked utterly sincere. But it was also fleeting. A Jobs who had kept on being pleased with Pixar as it existed in 1989 would never have instigated the changes that made Toy Story possible, just as a Jobs who was happy with Apple's lot in life in 2001 would never have presided over a decade's worth of ambitious expansion.
To me, the most remarkable thing about his career wasn't the Apple II, the Mac, Pixar, the Apple Store, the iPod, iTunes, the iPad or any of the other blockbusters he guided into existence — even though any one of them would have been enough to assure him a place in the history of business and technology. (Steve Wozniak contributed to only one of these hits, the Apple II, and will be rightly revered for it forever.) Rather, what astonishes is the durability and consistency of his vision.
When Jobs cofounded Apple with Woz in 1976, he was 21 and the personal computer industry barely extended beyond the meetings of the Homebrew Computer Club, the Silicon Valley institution that counted Jobs and Woz among its regulars. By 1977, the Apple II was on the market and Jobs was arguably the most influential figure in the business. What were the chances that he would still dominate it thirty-four years later — not by riding out his successes but by restlessly moving on to the next big thing, again and again?
Jobs didn't stay relevant by changing with the times so much as by sticking to his principles until the rest of the world had caught up with them, along with the capabilities of processors, displays, communications components, and other technologies. In the 1970s, he thought that the Apple II should be a consumer-electronics device rather than a piece of computing equipment and was willing to strip out features people thought they wanted rather than do them badly. It worked then, and it worked a generation later when the iPad sprang from the same philosophy.
If he hadn't been so unimaginably stubborn, the world would be a meaningfully different place today. It wasn't all that long ago, for instance, that some "experts" believed that Apple should ditch the Mac and start building fancy Windows PCs — a strategy that might have helped the company's bottom line in the short run but would have snuffed out the iPhone and iPad before they even existed.
For all of the simplicity of Jobs' instincts and the undeniable success of the products that resulted from them, he still confuses people. A meaningful percentage of Apple observers find the magnitude of the company's success over the past decade impossible to process. They remember the Jobs who was squeezed out of Apple in 1985, the one who appeared to have lost the PC wars to Microsoft's Bill Gates, the one who returned in 1997 to an Apple so fragile that some pundits thought the main question was whether he'd be able to prop it up for a few more years.
Jobs never stopped being a control freak, but anyone who still thinks that was a liability stopped paying attention years ago. And anyone who still believes that the typical Apple customer is a Steve Jobs Mini-Me should visit an Apple Store and spy on the consumers milling about. They look like real people of all sorts to me, not style-obsessed cultists.
For years, the conventional wisdom on Jobs' relationship with Apple was that his micromanaging ways had left it in shaky condition to thrive without him. The fact that it flourished over the past three years, during Jobs' two extended medical leaves, suggests that wasn't true. Under new CEO Tim Cook, the company stands an excellent chance of continuing to be vital, profitable and influential for years to come.
Apple has every reason to be proud of its cofounder's dazzling legacy. But if it ever spends too much time celebrating it, we'll know it's lost its way. May it continue, like Steve Jobs, to find only brief moments of satisfaction in the way things are and have been — an edgy, dissatisfied company less interested in its current success and storied past than in its boundless potential.
Once in a while, a great entrepreneur comes along with a special mindset, the ability to see great potential in something, and the drive to start a paradigm shift. That drive to create is the hallmark of the entrepreneur, a class that has historically been looked upon with distrust and disdain.
On Wednesday, the world lost a great entrepreneur and capitalist, a man whose creativity and genius sparked an entire industry and changed the face of computing and communication forever. That man was Steve Jobs.
Like many of you, I was shocked to hear of his passing. I knew that he was sick, but I had no idea his time was that near when he decided to retire as Apple’s CEO. I have had a draft titled “Steve Jobs, Capitalist and Entrepreneur of the Decade” sitting in my drafts folder since sometime in early September, and I have been meaning to write up something about his long and storied career.
His name now belongs up there with many of the other great inventors and entrepreneurs of our age, but his success came at the end of a long road. It takes more than a good idea to reach success: it also requires drive, determination, and passion. I’ve always wondered what his life story was like, what challenges he faced on the road to success, and what lessons we can learn from his journey.
Getting fired from the company you founded
Imagine what it would be like to get fired from the company that you founded. In 1976, Steve Jobs founded Apple along with Steve Wozniak and Ronald Wayne. The company quickly found success and expanded rapidly, rising from a couple of million dollars in sales to a billion and beyond. However, a slump in the mid-80s, as well as a power struggle, led to Steve’s firing in 1985.
I can’t imagine what it must have been like; there are hardly words for the feeling of being fired from a company that you had founded and built up just years earlier. However, Steve quickly put his creativity to work, and later said that it was the best thing that could have happened to him. He went on to found NeXT Computer, a company that built innovative computing solutions, and he also bought the company that would later become Pixar and go on to produce many memorable films in partnership with Disney.
Apple was lagging in the meantime, hurting from internal inefficiencies and a lack of direction. Steve eventually came back home when Apple acquired NeXT Computer in 1997. Many of NeXT’s technologies lived on in Apple’s products, and Steve went on to take the helm as Apple’s CEO.
Revolutionizing the industry
The rebirth and rise of Apple since Steve Jobs returned to the helm has been nothing short of stellar. The years since then have seen many innovative products come from the company, such as the iMac, iPod, and especially the iPhone. Steve Jobs was a master of marketing and design, and he knew how to take a product concept and turn it into something that would win over consumers.
I believe that the iPhone may be the most revolutionary product the company has developed, and while the concept of a smartphone is part of the natural evolution of computing and communications technology, the iPhone was the first phone to change the whole paradigm of the industry. Here was a phone that took touch and the user interface to a whole new level, and the company completely changed the game by integrating an app store for third-party apps. The iPhone brought everything together in a way that was completely unparalleled at the time.
Of course, like any great invention, the iPhone also had its share of skeptics. Other companies laughed at the time and said, “A touch phone? It’s just a fad that people will be over soon enough.” Well, those companies aren’t laughing anymore. The smartphone is not yet ubiquitous, but most people didn’t use the Internet just a decade and a half ago. Things move fast, and today, computers and the Internet are just about indispensable. With the spread of the Internet, information has never been easier to access or publish. You can survive without computers, but they just allow you to do so much more.
True philanthropy is creating something that people love
I am a big believer in adding value to the world through voluntary trade and exchange, and what better way to do this than to create something that people like so much that they are willing to give you something in exchange, of their own free will? People give something up because, in their own perception, they are getting something better in exchange. Given the continued success of Apple and the reinvention of the industry, I would say that those customers were right.
There are people around the world who are getting onto the Internet for the first time through a mobile phone. In some places, they are skipping the wired infrastructure and going straight to wireless communications, a natural home for the mobile phone. The release of the iPhone completely shook up the industry and set off a huge burst of innovation, competition, and progress. These changes are showing up everywhere, such as in the medical industry, in education, and just in plain old everyday life. Over time, the consumer receives ever-rising quality and choice at ever-lower cost.
I think what really hits home for me is that I just got such a sense of hope, optimism, and wonder from Steve Jobs and from Apple. He turned vision into reality and added true value to the world by creating products that people loved, and I daresay that the world became a better place for it. Contrast this to the fear, envy and greed that we often feel when it comes to politicians and the political stories of our times. Instead of creating wealth, politicians take it by force and destroy it, and instead of lifting us up, they turn us against each other and pander to the worst of our emotions. Steve Jobs might not have talked the political rhetoric, but I will go out on a limb and boldly state that he did far more good for this world than most of these politicians ever will.
Rest in peace, Mr. Jobs. You may no longer be of this earth, but your ideas and passion have changed the world forever.