
Thursday 13 October 2011

Life of Dennis Ritchie

As co-inventor of Unix and the programming language C, he had a key role in shaping today's computing environment

Dennis Ritchie in May 2011, when he was awarded the Japan prize. Photograph: Victoria Will/AP Images for the Japan Prize Foundation

 The American computer scientist Dennis Ritchie, who has died aged 70 after a long illness, was one of the co-inventors of the Unix operating system and the C programming language. Unix and C provided the infrastructure software and tools that created much of today's computing environment – from the internet to smartphones – and so have played a central part in shaping the modern world.

The origins of Unix go back to the 1960s, long before the microchip and personal computers had been invented. The nearest thing to personal computing was the so-called computer utility. This consisted of a large mainframe computer that was used simultaneously, and at great expense, by a couple of dozen users sitting at typewriter terminals.

By the middle of the decade, the computer utility appeared to provide the way ahead, and a consortium of General Electric, Bell Labs and the Massachusetts Institute of Technology (MIT) embarked on a project called Multics (Multiplexed Information and Computing Service). Multics would be the world's largest computer utility, supporting several hundred simultaneous users. Bell Labs was responsible for the operating software.

Ritchie joined the programming division of Bell Labs in 1967. His father, Alistair Ritchie, had had a long career there, and had co-authored an influential technical book, The Design of Switching Circuits (1951). Dennis was born to Alistair and his wife Jean in Bronxville, a suburb north of New York City, and grew up in New Jersey, where Bell Labs had its Murray Hill site. He studied physics and applied mathematics for a bachelor's degree (1963) and computer science for a PhD (1968) at Harvard University.

Multics was in crisis when he arrived at the research organisation. Indeed, many big software projects were in crisis – people were just beginning to learn that writing large programs was horrendously difficult and costly. In 1969, after four years of development, Bell Labs pulled out of the project.

Ritchie and another lead programmer on Multics, Ken Thompson, were left somewhat bereft by the project's demise. Multics promised a wonderful computing experience, but the operating system was too complex to build. This led them to rethink their software philosophy. They would build a simpler, smaller system that they would call Unix – the name was "a kind of treacherous pun on Multics", Ritchie once explained.

The idea was not immediately appreciated by their managers, and they had to "scrounge around" for an obsolete computer to develop Unix. The computer had just 16 kilobytes of memory, and this alone was an encouragement to keep things simple. If Multics was the victim of baroque software architecture, then Unix would be pure Bauhaus.

Unix was designed over a period of a few months in 1969, and a prototype was running early the following year. Their colleagues remained unconvinced. However, by offering to write some text-processing software, Ritchie and Thompson managed to persuade the Bell Labs patent department to acquire a full-size computer and run Unix on it.

They decided to rewrite the operating system entirely for the new machine. The first version of Unix had been written in the computer's native machine code, which was difficult and slow to work with. For this next version of Unix, Ritchie invented a new language called C, which bridged the gap between machine code and higher-level programming languages such as Fortran and Cobol.

C also had an interesting ancestry. The progenitor was a language jointly designed at Cambridge and London universities in 1964 and known as CPL (Combined Programming Language). CPL itself never took off, but one of the development team, Martin Richards, became a visitor at MIT. There he designed a simpler version of the language for systems implementation, BCPL (Basic CPL).

Once Thompson and Ritchie discovered BCPL, they decided to use it for writing Unix: to do so they squeezed it into 8 kilobytes and renamed it B. Finally, a new and improved version was developed and named C, which, Ritchie mused, "left open the question whether the name represented a progression through the alphabet or through the letters BCPL".

C made writing software immeasurably easier, and it also made software portable, so that a program written in C could be recompiled to run on almost any machine. The new version of Unix was completed in 1973, and since it was written in C, it, too, was portable.

Because Bell Labs's parent, AT&T, was a regulated telephone monopoly, it was prohibited from competing in the computer industry, and so had no pecuniary interest in Unix. This allowed Ritchie and Thompson to distribute Unix free of charge to universities and research institutions, which loved its clean, economical design.

Universities began to train their students in Unix and C, and when they graduated they took the culture into industry, where it blossomed. In 1978 Ritchie and a colleague, Brian Kernighan, wrote a textbook, The C Programming Language, which became a bestselling programming primer for the next 15 years. Despite the prosaic title, it was equally a book about programming style, and it shaped programming practices worldwide.
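The book's famous opening example, the "hello, world" program, became the customary first exercise for generations of programmers learning C. A minimal sketch in modern ANSI C looks like this (the 1978 edition's original version is slightly terser, since it predates the ANSI standard):

#include <stdio.h>   /* declares the printf library function */

int main(void)
{
    printf("hello, world\n");   /* print the greeting followed by a newline */
    return 0;
}

Compiled with any standard C compiler, the same source runs unchanged on machines ranging from mainframes to smartphones, which is exactly the portability the language was designed to provide.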

Ritchie and Thompson got early recognition for their work when they received the 1983 Turing award of the Association for Computing Machinery, often dubbed the Nobel prize of computing. But the Unix story was just beginning. The Advanced Research Projects Agency of the US Department of Defense adopted Unix for the network research that eventually created the internet, and it remains the software glue that binds everything together.

Steve Jobs was a Unix devotee. When he was ousted from Apple Computer in 1985, he used Unix as the basis for his NeXT computer workstation. After his return to Apple more than a decade later, he brought Unix with him and it became the foundation for all of Apple's current products.
Unix is also at the heart of today's open-source software movement. In the 1980s, following deregulation, AT&T began to assert its intellectual property rights in Unix. A Finnish computer science student named Linus Torvalds decided that the world needed a free version of Unix, which became known as Linux. The system was written by hundreds of programmers, mostly steeped in the Unix and C culture, collaborating over the internet. Today, the free Linux operating system powers billions of electronic devices, from smartphones to set-top boxes.
Ritchie and Thompson – usually together – received many honours and awards, culminating with the 1998 National Medal of Technology, presented by President Clinton. The citation described their inventions as having "led to enormous advances in hardware, software, and networking systems and stimulated the growth of an entire industry." Earlier this year, the pair won the Japan prize. Ritchie spent all his career at Bell Labs, retiring as head of systems software research in 2007.

Dennis MacAlistair Ritchie, computer scientist, born 9 September 1941; died 8 October 2011

Sunday 9 October 2011

Life of Bill Gates

Born: October 28, 1955
Seattle, Washington

American businessman, chief executive officer, and software developer


Microsoft cofounder and chief executive officer Bill Gates has become the wealthiest man in America and one of the most influential personalities in the ever-evolving computer industry.

Love of computer technology

William H. Gates III was born on October 28, 1955, in Seattle, Washington. He was the second child and only son of William Henry Gates Jr., a successful Seattle attorney, and Mary Maxwell, a former schoolteacher. Kristi, his older sister, later became his tax accountant and Libby, his younger sister, lives in Seattle raising her two children. Gates enjoyed a normal, active childhood and participated in sports, joined the Cub Scouts, and spent summers with his family in Bremerton, Washington.
Although Gates's parents had a law career in mind for their son, he developed an early interest in computer science and began studying computers in the seventh grade at Seattle's Lakeside School. Lakeside was a private school chosen by Gates's parents in the hope that it would challenge their son's intellectual drive and curiosity. At Lakeside, Gates came to know Paul Allen, a classmate with similar interests in technology who would eventually become his business partner. Gates and Allen quickly realized the potential of the young computer industry.

Early experience

Gates's early experiences with computers included debugging (eliminating errors from) programs for the Computer Center Corporation's PDP-10, helping to computerize electric power grids for the Bonneville Power Administration, and founding with Allen a firm called Traf-O-Data while still in high school. Their small company earned them twenty thousand dollars in fees for analyzing local traffic patterns.
While working with the Computer Center's PDP-10, Gates was responsible for what was probably the first computer virus, a program that copies itself into other programs and ruins data. Discovering that the machine was connected to a national network of computers called Cybernet, Gates invaded the network and installed a program on the main computer that sent itself to the rest of the network's computers, causing them to crash. When Gates was found out, he was severely punished, and he kept away from computers for his entire junior year at Lakeside. Without the lure of computers, Gates made plans for college and law school in 1970. But by 1971 he was back helping Allen write a class scheduling program for their school's computer.

The article that started it all

Gates entered Harvard University in 1973 and pursued his studies for the next year and a half. His life changed in January of 1975, however, when Popular Electronics carried a cover story on a $350 microcomputer, the Altair, made by a firm called MITS in New Mexico. When Allen excitedly showed him the story, Gates knew where he wanted to be: at the forefront of computer software design. (Software is a program of instructions for a computer.)
Gates dropped out of Harvard in 1975, ending his academic life and beginning his career as a software designer. At this time, Gates and Allen cofounded Microsoft. They wrote programs for the early Apple and Commodore machines. One of Gates's most significant opportunities arrived in 1980, when IBM approached him to help with their personal computer project, code-named Project Chess. Gates developed the Microsoft Disk Operating System, or MS-DOS. (An operating system is a type of software that controls the way a computer runs.) Not only did he sell IBM on the new operating system, but he also convinced the computer giant to allow others to write software for the machine. The result was the rapid growth of licenses for MS-DOS, as software developers quickly moved to become compatible with (able to work with) IBM. By the early 1990s Microsoft had sold more than one hundred million copies of MS-DOS, making the operating system the all-time leader in software sales. For his achievements in science and technology, Gates received the Howard Vollum Award in 1984 from Reed College in Portland, Oregon.
Gates's competitive drive and fierce desire to win have made him a powerful force in business, but they have also consumed much of his personal life. In the six years between 1978 and 1984, he took a total of only two weeks' vacation. But on New Year's Day 1994 Gates married Melinda French, a Microsoft manager, on the Hawaiian island of Lanai. The ceremony was held on the island's Challenge golf course, and Gates kept it private by buying out the unused rooms at the local hotel and by hiring all of the helicopters in the area to keep photographers from using them. His fortune at the time of his marriage was estimated at close to seven billion dollars. By 1997 his worth was estimated at approximately $37 billion, earning him the title of "richest man in America."

The future for Microsoft

Many criticize Gates not just for his success, but because they feel he tries to unfairly—and maybe even illegally—dominate the market. As a result of Microsoft's market control, the U.S. Department of Justice brought an antitrust lawsuit (a lawsuit that is the result of a company being accused of using unfair business practices) against the company in 1998, saying the company had an illegal stronghold on the software industry.

Bill Gates. Reproduced by permission of AP/Wide World Photos.
Gates maintained that Microsoft's success over rivals such as Oracle and IBM was simply the result of smart, strategic decision making. U.S. District Judge Thomas P. Jackson did not agree, and in November 1999 he found Microsoft to be a monopoly (a company with exclusive control) that used its market power to harm competing companies. Because of the ruling, Gates faced the prospect of breaking up Microsoft. On January 13, 2000, Gates handed off day-to-day management of Microsoft to his friend and right-hand man Steve Ballmer, who added chief executive officer to his existing title of president. Gates held on to his position as chairman in the reshuffle and added the title of chief software architect.
In the spring of 2002 Gates himself was scheduled to testify on behalf of Microsoft. The final ruling on the fate of Microsoft has the potential to be a landmark decision on the future of the computer industry.

Gates as philanthropist

Aside from being the most famous businessman of the late 1990s, Gates also has distinguished himself as a philanthropist (someone working for charity). He and wife Melinda established the Bill & Melinda Gates Foundation, which focuses on helping to improve health care and education for children around the world. The foundation has donated $4 billion since its start in 1996. Recent pledges include $1 billion over twenty years to fund college scholarships for about one thousand minority students; $750 million over five years to help launch the Global Fund for Children's Vaccines; $50 million to help the World Health Organization's efforts to eradicate polio, a crippling disease that usually attacks children; and $3 million to help prevent the spread of acquired immune deficiency syndrome (AIDS; an incurable disease that destroys the body's immune system) among young people in South Africa. In November 1998 Gates and his wife also gave the largest single gift to a U.S. public library, when they donated $20 million to the Seattle Public Library. Another of Gates's charitable donations was $20 million given to the Massachusetts Institute of Technology to build a new home for its Laboratory for Computer Science.
In July 2000 the foundation gave Johns Hopkins University a five-year, $20 million grant to study whether or not inexpensive vitamin and mineral pills can help save lives in poor countries. On November 13, 2000, Harvard University's School of Public Health announced it had received $25 million from the foundation to study AIDS prevention in Nigeria. The grant was the largest single private grant in the school's history. It was announced on February 1, 2001, that the foundation would donate $20 million to speed up the global eradication (complete elimination) of the disease commonly known as elephantiasis, which causes severe disfigurement. In 2002 Gates, along with rock singer Bono, announced plans for DATA Agenda, a $24 billion fund (partially supported by the Bill and Melinda Gates Foundation) that seeks to improve health care in Africa.
Although many describe Gates as cold and distant, his friends find him friendlier since his marriage and since the birth of his daughter, Jennifer, in April 1996. Further, he is recognized both for his overall contribution to the world of technology and for his efforts in philanthropy. In Forbes magazine's 2002 list of the two hundred richest people in the world, Gates was number one for the eighth straight year, coming in with a net worth of $52.8 billion.

For More Information

Gates, Bill, with Nathan Myhrvold and Peter Rinearson. The Road Ahead. New York: Viking Press, 1995.
Ichbiah, Daniel, and Susan L. Knepper. The Making of Microsoft. Rocklin, CA: Prima, 1991.
Wallace, James. Hard Drive: Bill Gates and the Making of the Microsoft Empire. New York: Wiley, 1992.

Bill Gates' 11 Rules of Life

Netlore Archive: Circulating via email, the text of a speech allegedly given by Bill Gates in which he sets out 11 rules for life kids won't learn in school.

Description: Email flyer
Circulating since: Feb. 2000
Status: Falsely attributed to Bill Gates

2003 example:
Email contributed by Sarah C., July 12, 2003:
BILL GATES' SPEECH TO MT. WHITNEY HIGH SCHOOL in Visalia, California.
Love him or hate him, he sure hits the nail on the head with this!

To anyone with kids of any age, here's some advice. Bill Gates recently gave a speech at a High School about 11 things they did not and will not learn in school. He talks about how feel-good, politically correct teachings created a generation of kids with no concept of reality and how this concept set them up for failure in the real world.

Rule 1: Life is not fair -- get used to it!

Rule 2: The world won't care about your self-esteem. The world will expect you to accomplish something BEFORE you feel good about yourself.

Rule 3: You will NOT make $60,000 a year right out of high school. You won't be a vice-president with a car phone until you earn both.

Rule 4: If you think your teacher is tough, wait till you get a boss.

Rule 5: Flipping burgers is not beneath your dignity. Your Grandparents had a different word for burger flipping -- they called it opportunity.

Rule 6: If you mess up, it's not your parents' fault, so don't whine about your mistakes, learn from them.

Rule 7: Before you were born, your parents weren't as boring as they are now. They got that way from paying your bills, cleaning your clothes and listening to you talk about how cool you thought you are. So before you save the rain forest from the parasites of your parent's generation, try delousing the closet in your own room.

Rule 8: Your school may have done away with winners and losers, but life HAS NOT. In some schools they have abolished failing grades and they'll give you as MANY TIMES as you want to get the right answer. This doesn't bear the slightest resemblance to ANYTHING in real life.

Rule 9: Life is not divided into semesters. You don't get summers off and very few employers are interested in helping you FIND YOURSELF. Do that on your own time.

Rule 10: Television is NOT real life. In real life people actually have to leave the coffee shop and go to jobs.

Rule 11: Be nice to nerds. Chances are you'll end up working for one.

2000 example:
Email contributed by Roman S., Feb. 8, 2000:

Bill Gates' Message on Life

For recent high school and college graduates, here is a list of 11 things they did not learn in school.

In his book, Bill Gates talks about how feel-good, politically-correct teachings created a full generation of kids with no concept of reality and how this concept set them up for failure in the real world.

RULE 1......Life is not fair; get used to it.

RULE 2......The world won't care about your self-esteem. The world will expect you to accomplish something BEFORE you feel good about yourself.

RULE 3......You will NOT make 40 thousand dollars a year right out of high school. You won't be a vice president with a car phone, until you earn both.

RULE 4......If you think your teacher is tough, wait till you get a boss. He doesn't have tenure.

RULE 5......Flipping burgers is not beneath your dignity. Your grandparents had a different word for burger flipping; they called it opportunity.

RULE 6......If you mess up, it's not your parents' fault, so don't whine about your mistakes, learn from them.

RULE 7......Before you were born, your parents weren't as boring as they are now. They got that way from paying your bills, cleaning your clothes and listening to you talk about how cool you are. So before you save the rain forest from the parasites of your parents' generation, try "delousing" the closet in your own room.

RULE 8......Your school may have done away with winners and losers, but life has not. In some schools they have abolished failing grades; they'll give you as many times as you want to get the right answer. This doesn't bear the slightest resemblance to ANYTHING in real life.

RULE 9......Life is not divided into semesters. You don't get summer off and very few employers are interested in helping you find yourself. Do that on your own time.

RULE 10.....Television is NOT real life. In real life people actually have to leave the coffee shop and go to jobs.

RULE 11.....Be nice to nerds. Chances are you'll end up working for one.

Friday 7 October 2011

Life of Steve Jobs

Steven Paul Jobs, 56, died Wednesday at his home with his family. The co-founder and, until last August, CEO of Apple Inc was the most celebrated person in technology and business on the planet. No one will take issue with the official Apple statement that “The world is immeasurably better because of Steve.”

It had taken a while for the world to realize what an amazing treasure Steve Jobs was. But Jobs knew it all along. That was part of what was so unusual about him. From at least the time he was a teenager, Jobs had a freakish chutzpah. At age 13, he called up the head of HP and cajoled him into giving Jobs free computer chips. It was part of a lifelong pattern of setting and fulfilling astronomical standards. Throughout his career, he was fearless in his demands. He kicked aside the hoops that everyone else had to negotiate and straightforwardly and brazenly pursued what he wanted. When he got what he wanted — something that occurred with astonishing frequency — he accepted it as his birthright.

If Jobs were not so talented, if he were not so visionary, if he were not so canny in determining where others had failed in producing great products and what was necessary to succeed, his pushiness and imperiousness would have made him a figure of mockery.

But Steve Jobs was that talented, visionary and determined. He combined an innate understanding of technology with an almost supernatural sense of what customers would respond to. His conviction that design should be central to his products not only produced successes in the marketplace but elevated design in general, not just in consumer electronics but everything that aspires to the high end.

As a child of the sixties who was nurtured in Silicon Valley, his career merged the two strains in a way that reimagined business itself. And he did it as if he didn’t give a damn who he pissed off. He could bully underlings and corporate giants with the same contempt. But when he chose to charm, he was almost irresistible. His friend, Heidi Roizen, once gave advice to a fellow Apple employee that the only way to avoid falling prey to the dual attacks of venom and charm at all hours was not to answer the phone. That didn’t work, the employee said, because Jobs lived only a few blocks away. Jobs would bang on the door and not go away.

For most of his 56 years, Steve Jobs banged on doors, but for the past dozen or so very few were closed to him. He was the most adored and admired business executive on the planet, maybe in history. Presidents and rock stars came to see him. His fans waited up all night to gain entry into his famous "Stevenote" speeches at Macworld, almost levitating with anticipation of what Jobs might say. Even his peccadillos and dark side became heralded.

His accomplishments were unmatched. People who can claim credit for game-changing products — iconic inventions that become embedded in the culture and answers to Jeopardy questions decades later — are few and far between. But Jobs had not one, not two, but six of these breakthroughs, any one of which would have made for a magnificent career. In order: the Apple II, the Macintosh, the movie studio Pixar, the iPod, the iPhone and the iPad. (This doesn't even include the consistent, brilliant improvements to the Macintosh operating system, or the Apple retail store juggernaut.) Had he lived a natural lifespan, there would almost certainly have been more.

Behind any human being is a mystery: What happened to make him … him? When considering extraordinary people, the question becomes an obsession. What produces the sort of people who create world-changing products, inspire by example and shock by justified audacity, and tag billions of minds with memetic graffiti? What led to his dead-on product sense, his haughty confidence, his ability to simultaneously hector and inspire people to do their best work?
His gene pool was intriguing. His biological parents were Abdulfattah John Jandali, a Syrian immigrant, and a graduate student named Joanne Simpson. Unmarried when her son was born on February 24, 1955, Simpson gave him up for adoption. She later married Jandali and had another child, the award-winning novelist Mona Simpson. Jobs grew up in a middle-class suburb with two loving parents, Paul and Clara Jobs. (He had a sister, Patti, who survives him.) Though he did make a successful effort to find his birth mother, he never seemed to warm to the theory that his drive was a subconscious reaction to a conjectured rejection. He always spoke highly of the family that raised him. "I grew up at a time where we were all well-educated in public schools, a time of peace and stability until the Vietnam War got going in the late sixties," he said.
The turmoil of the sixties was also part of his make-up. "We wanted to more richly experience why we were alive," he said of his generation, "not just make a better life, and so people went in search of things. The great thing that came from that time was to realize that there was definitely more to life than the materialism of the late 50s and early sixties. We were going in search of something deeper."

He went to Reed, a well-regarded liberal arts school known as a hippie haven, but dropped out after a semester, choosing to audit courses informally (including a class on calligraphy that would come in very handy in later years). Jobs also took LSD in those years, and would claim that those experiences affected his outlook permanently and positively. After leaving Oregon, he traveled to India. All of these experiences had an effect on the way he saw the world — and the way he would make products to change that world.

Jobs usually had little interest in public self-analysis, but every so often he'd drop a clue to what made him tick. Once he recalled for me some of the long summers of his youth. "I'm a big believer in boredom," he told me. Boredom allows one to indulge in curiosity, he explained, and "out of curiosity comes everything." The man who popularized personal computers and smartphones — machines that would draw our attention like a flame attracts gnats — worried about the future of boredom. "All the [technology] stuff is wonderful, but having nothing to do can be wonderful, too."

In an interview with a Smithsonian oral history project in 1995, Jobs talked about how he learned to read before he got to school — that and chasing butterflies were his passions. School was a shock to him — "I encountered authority of a different kind than I had ever encountered before, and I did not like it," he said. By his own account he became a troublemaker. Only the ministrations of a wise fourth grade teacher — who lured him back to learning with bribes and then hooked him with fascinating projects — rekindled his love of learning.

Meanwhile, his dad, Paul — a machinist who had never completed high school — had set aside a section of his workbench for Steve, and taught him how to build things, disassemble them, and put them together. From neighbors who worked at electronics firms in the Valley, he learned about that field — and also understood that things like television sets were not magical things that just showed up in one's house, but designed objects that human beings had painstakingly created. "It gave a tremendous sense of self-confidence, that through exploration and learning one could understand seemingly very complex things in one's environment," he told the Smithsonian interviewer.

After that call to HP, Jobs worked at the company as a teenager. He later had a job at Atari, when the video-game company was just getting started. Yet he did not see the field as something that would satisfy his artistic urges. "Electronics was something I could always fall back on when I needed food on the table," he once told me.

That changed when Steve Jobs saw what a high-school friend, Steve Wozniak, was doing. Wozniak was a member of the Homebrew Computer Club, a collection of Valley engineers and hangers-on who were thrilled at the prospect of personal computers, which had just become possible with the advent of low-cost chips and electronics. "Woz" was among several members of the group who were designing their own machines, but he had no desire to commercialize his project, even though it was groundbreaking in its simplicity and was one of the first to include color graphics.
When Jobs saw his friend's project, he wanted to turn it into a business. While other home-brewers were also starting companies, Jobs was unique in understanding that personal computers could appeal to an audience far beyond geeks.

“If you view computer designers as artists, they’re really into more of an art form that can be mass-produced, like records, or like prints, than they are into fine arts,” he told me in 1983. “They want something where they can express themselves to a large number of people through their medium, and their medium is technology and manufacturing.” Later he would refine this point of view by talking about Apple as a blend of engineering and liberal arts.
The most visible manifestation of this was the elegant case that housed the Apple II. Jobs paid a fledgling industrial designer named Jerry Manock $1,500 to design a plastic case in an earthy beige. (Manock wanted to be paid in advance because, he told author Michael Moritz, "They were flaky-looking customers and I didn't know if they were going to be around when the case was finished." Jobs talked him into waiting for his payment.)

"He told me about the prices he was getting for parts, and they were favorable to the prices HP was paying," his friend Allen Baum said. Jobs would make these deals while Woz and a small team of teenage engineers worked in the Jobs family garage. Every so often Jobs would drop by and impose his views on the project. "He would pass judgment, which is his major talent, over the keyboards, the case design, the logo, what parts to buy, how to lay out the PC board so it would look nice, the arrangement of parts, the deals we chose … everything," said Chris Espinosa, one of the original group. One other thing Jobs did was convince Wozniak to quit his job at HP and work full time for Apple. When Woz originally demurred, Jobs called all of Woz's friends and relatives, putting so much pressure on him that the gentle engineer capitulated. Once again, Jobs had gotten what he wanted.

Jobs gave thought to what kind of company he wanted Apple to be — once he told me his wish was to create “a $10 billion company that didn’t lose its soul.” He would call up the premier CEOs of Silicon Valley — Andy Grove, Jerry Sanders — and ask them if they would take him out to lunch so he could pick their brains. He later realized that he and Woz were an object of curiosity to people because they were so young. “But we didn’t think of ourselves as young guys,” he said. “We didn’t have a lot of time to philosophize,” he told me. “We were working 18 hours a day, seven days a week — having fun.”

The Apple II was a hit, and so was the company. But unlike Bill Gates, who founded Microsoft in the same period, Jobs did not run Apple. Realizing that the company might go farther if run by professional management, rather than a barefoot 22-year-old with a Fidel beard and an abrasive personality, Apple hired a chief executive to provide adult supervision. Over the next few years, Apple became the most popular of the small field of personal computers, and on Dec. 12, 1980, Apple held an IPO. It was highly unusual for a company that young to do so, and the offering turned out to be one of the biggest of its era. Apple kept its mantle as the leading personal computer maker until IBM entered the field in late 1981.
As Apple became a larger business, Jobs was somewhat adrift. "The question was, 'How do I go about influencing Apple?'" he explained in 1983. "Well, I can run around telling people things all day, but that's not going to result in what I really want. So I thought a really good way to influence Apple would be by example — to be a general manager here at Apple."

In 1979, as part of the efforts to develop a more advanced machine called the Lisa, Jobs led a team of engineers on an excursion to Xerox PARC. He later described it as "an apocalypse." He immediately declared that the principles of the Xerox Star — mouse-driven navigation, windows, files and folders on the screen — be integrated into Lisa, an effort which jacked up the cost of the machine almost five-fold. But Jobs' management style consistently offended the Lisa team, and he looked elsewhere in the company for a group to lead. He found what he was looking for in a skunkworks project off the campus led by a talented computer scientist named Jef Raskin. The small team was working on a low-cost computer to be called Macintosh. "When Steve started coming over, Jef's dream was shattered on the spot," said Mac team member Joanna Hoffman.

The Macintosh was a turning point for Jobs, who worried about being branded as the guy who founded Apple, but not much more. Jobs was a relentless, even punishing leader. But his passion earned him the loyalty of the small young team. He encouraged them to think of themselves as rebels. “It’s better to be pirates than to join the Navy,” he told them. A skull and bones flag flew on their office building.

While the Lisa was inspired by Xerox's "graphical user interface," Macintosh took it a step farther. It worked with even more simplicity, was faster, and had a distinctive shape — inspired by the Cuisinart food processor, an appliance Jobs admired. When I interviewed Jobs about the Macintosh in November 1983, he explained to me that while the Lisa team wanted to make something great, "the Mac people want to do something insanely great."
During that interview I asked Jobs to explain why he sometimes gave harsh, even rude assessments of his employees' work. (Though in some respects Jobs became more mellow later in life, such blunt criticism became a trademark.) "We have an environment where excellence is really expected," he said. "What's really great is to be open when [the work] is not great. My best contribution is not settling for anything but really good stuff, in all the details. That's my job — to make sure everything is great." Even though Jobs made life hell at times for the brilliant young engineers of the Mac team, they generally regard the experience as the highlight of their professional careers, a magic moment. And indeed, the Macintosh experience provided a template for the culture of many startups, down to the lavish perks provided to the workers.

On Jan. 24, 1984, Jobs publicly unveiled the Macintosh. Two days earlier, a stunning, cinematic Super Bowl ad for the computer had galvanized the nation; many consider it the greatest commercial in history. The Mac was a sensation. It also cemented Jobs as a national figure, the subject of major features in Newsweek and Rolling Stone. (Though he was disappointed that Rolling Stone did not put him on the cover. Jobs actually called publisher Jann Wenner to plead his case. Wenner told him, "Don't hold your breath." "I said, 'All right, but you ought to think about this more,'" Jobs recounted. Later, Jobs' demands for magazine covers would be eagerly accommodated.)

The Macintosh was arguably the most important personal computer in history. It introduced a style of computing that persisted for decades (sadly for Apple, most people experienced the graphical user interface via Microsoft Windows computers, not Macintosh.) It made computers sexy.

But the Mac did not initially sell as well as expected. This failure, as well as Jobs’ managerial shortcomings, put Jobs in jeopardy at the company he founded. For several weeks, he conducted a backroom battle with John Sculley, the former CEO of Pepsi he had personally recruited to run Apple in 1983. (Jobs had famously challenged Sculley by asking, “Do you really want to sell sugar water for the rest of your life?”) But Sculley outmaneuvered Jobs by winning the backing of the board. And on May 31, 1985, he fired Steve Jobs.

The ouster was cathartic for Jobs. "You've probably had somebody punch you in the stomach and it knocks the wind out of you and you can't breathe. That's how I felt," he told Newsweek. But he regained his breath by starting Next, a company that designed and sold next-generation workstations. The Next computer, a striking jet-black cube, never caught on (though Tim Berners-Lee would write the code for the World Wide Web on it), but its innovative operating system turned out to be of lasting value, and Jobs kept the company going as a software concern.

During those years, Jobs took on a second company besides Next. A struggling computer graphics studio founded by George Lucas was looking for a white knight, and Steve Jobs took the role. It was to be called Pixar. Under Jobs’ guidance, Pixar morphed from a software company into a movie studio. It produced the first full-length computer-animated feature, “Toy Story,” the first of a series of monster hits for the studio.

Running Pixar was a step in Jobs' growing maturity. He was wise enough to focus on the deal-making and let the creative movie-makers, like director John Lasseter, do their work. He also got valuable experience in Hollywood. Eventually, he sold Pixar to Disney in 2006 for $7.4 billion.

But it was that other company, Next, that brought Jobs back to the company he co-founded. Apple needed a powerful new operating system, and Next could provide one. Apple bought Next, but Apple's troubles went far deeper: people were writing the company's corporate obituary. In 1997, the board of directors fired CEO Gil Amelio and turned to Jobs, one of the company's founders, to revitalize it. One of the first things he did was forge a deal with Apple's blood rival, Microsoft.

While Jobs emphatically stated that he was only filling an interim role at Apple — “I hope we can find a terrific CEO tomorrow,” he said that August — he took to it so enthusiastically that it was no surprise that he removed the lowercase “i” from his iCEO title in 2000. By then he had made Apple profitable again.

A turning point was his introduction of the iMac in May 1998. Almost a year after taking control of Apple, Jobs called me and invited me to spend a few days with him as he launched his first big project. I got a glimpse of the exacting preparations he made for a launch, monitoring every detail. (He nixed the sound of a clarinet on a video clip's soundtrack because it sounded "too synthetic.") When an employee showed him some work at one point he said simply, "This is a 'D,'" and turned away. But at the launch itself, he was the picture of poise.
The iMac was a huge success, an all-in-one machine that sent the message that simplicity, beauty and power would be behind Apple’s comeback. He also simplified Apple’s product line to four computers — consumer and pro versions of desktop and laptop. “Focus does not mean saying yes, it means saying no,” he explained. “I was Dad. And that was hard.”

But with each iteration of computers, Apple was gaining fans. The one exception was Jobs’ introduction of a monitorless machine called the Cube. It was perhaps the most beautiful computer ever. But in this case, Jobs let his aesthetic instincts overwhelm his sense of the marketplace. It was a rare failure.

In 2000, he explained how competitors still didn’t understand Apple’s mix of art and science. “When people look at an iMac, they think the design is really great, but most people don’t understand it’s not skin deep,” he said. “There’s a reason why, after two years, people haven’t been able to copy the iMac. It’s not just surface. The reason the iMac doesn’t have a fan is engineering. It took a ton of engineering and that’s true for the Cube and everything else.”

In October 2001, Apple introduced a music player, the iPod. It broke ground as the first successful pocket-size digital music player. Because Jobs had a tremendous ability to locate and hire brilliant talent, his team produced it in less than a year. The process was indicative of the way Apple ran. Though Jobs could be overwhelming in pushing his point, he understood that ultimately his products would not work if his team's best ideas were discarded. In the case of the iPod, hardware designer Tony Fadell knew how to get his best prototype approved by Jobs — he showed his boss three different designs, with one clearly superior, to give Jobs a chance to berate two efforts before saying, "That's more like it!" with the last.

Sometimes, Jobs would dig in and only back down when the marketplace spoke. Again, the iPod was an example. Originally, he felt that the iPod should only work with Macintosh computers. But its instant popularity led him to agree with some of his employees who had been arguing for a Windows version. When the iPod became available to Windows users, it really took off. Apple has sold over 300 million iPods.

“If there was ever a product that catalyzed what’s Apple’s reason for being, it’s this,” Jobs said to me of the iPod, “Because it combines Apple’s incredible technology base with Apple’s legendary ease of use with Apple’s awesome design… it’s like, this is what we do. So if anybody was ever wondering why is Apple on the earth, I would hold this up as a good example.”
What’s more, to support the iPod, Jobs began the iTunes music store, the first successful service to legally sell music over the internet. Though the record labels were notoriously conservative about such deals, “They basically trusted us and we negotiated a landmark deal,” Jobs told me. The iTunes store would sell billions of downloaded songs.

The iPod was a turning point for Apple and Jobs. Competitors never figured out how to top it. Every year, he would come out with a new set. One year he stopped selling the most popular model, the iPod mini, in favor of a totally new model called the Nano. The product line would be laid out on a table. He'd talk about which color he liked best. Often he'd pick one up. "Isn't that amazing?"

This satisfied him deeply because Jobs loved music. His heroes were Bob Dylan and the Beatles. I once asked him if his dream was to get Paul McCartney to perform one of those sweet two-song live sets that often close his keynotes. “My dream,” he joked, “is to bring out John Lennon.”

While Jobs reveled in his professional spotlight, he was more circumspect about his private life. He distrusted most reporters, ever since a 1982 Time article mocked his pretensions and exposed his darker side. Jobs, who thought Time was going to make him Man of the Year (it chose "the personal computer" instead), was wounded. "I don't mind if people don't like me," he said in late 1983. "Well, I might a little…but I really mind it when somebody uses their position at Time magazine to tell 10 million people they don't like me. I know what it's like to have your private life painted in the worst possible light in front of a lot of people." Twenty years later, he would still be complaining about that article. (The writer, Michael Moritz, later became a powerful venture capitalist, funding Yahoo and Google.) But Jobs would not comment on subsequent accounts of his life that detailed not only rude professional behavior but his original refusal to support his first child (later he accepted paternity).

Jobs was a proud, proud father of four children, three from his marriage to Laurene Powell. He was protective of them — whenever he shared a story about one of his children in an interview, he cautioned that the remark was to be off the record. (His widow and all four offspring survive him.) But he clearly took huge pride in parenthood.

It was July 2004 when Steve Jobs learned he had a rare form of pancreatic cancer. He originally treated the disease without sharing much about it with the public. Critics wondered whether Jobs and Apple had skirted corporate disclosure regulations by not revealing more information. After what seemed to be a successful initial surgery, Jobs would vary from his circumspect stance just once, in his address to the Stanford graduating class of 2005. That speech, by the way, might be the best commencement address in history. When designing computers, Jobs and his team built the one they wanted for themselves. And now he gave a speech that Steve Jobs would have wanted to hear if he had graduated from college.

“No one wants to die, even people who want to go to Heaven don’t want to die to get there,” he told the Stanford graduates. “And yet, death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new … Your time is limited, so don’t waste it living someone else’s life.”

Steve Jobs never did that. After his cancer treatment, he took Apple’s biggest risk yet — developing a phone. Of course, it would not be just any mobile phone, but one that combined the media savvy of the iPod, the interface wizardry of the Macintosh, and the design style that had become his trademark.

As with all his products, Jobs was fanatical in monitoring every detail — including the press reaction. I was among the few journalists who got to test it before its release. Soon after I received the unit, I was walking down Broadway and my test unit got a call from “Unknown.” It was Jobs, ostensibly wanting to know what I thought, but actually making sure I understood how amazing it was. I acknowledged that it was extraordinary, but mentioned to him that maybe nothing could match the expectations he had generated. People were calling it the “Jesus phone.” Didn’t that worry him? The answer was no. “We are going to blow away the expectations,” he told me.

The iPhone did just that — especially after Jobs put aside his initial view that only a limited number of developers would be permitted to write applications for it. Apple’s App Store eventually included hundreds of thousands of programs, giving Apple a key advantage. As Apple’s current CEO boasted only Tuesday, the iPhone is the world’s most popular phone.
In 2008, observers noted that Jobs had lost an alarming amount of weight, and looked ill. People wondered whether the cancer had recurred. In what looks in retrospect like misdirection, Apple released a statement calling it a "bug." When I ran into him in Palo Alto around that time, Jobs brought up the subject, elaborating in detail about how he was suffering a temporary malady unconnected with his cancer. But he got thinner, and seemed weaker, and took a leave of absence.

Despite his health problems, Jobs kept Apple on a steady pace of innovation. When he returned to Apple — after a liver transplant which was acknowledged only months later — his first appearance was an iPod event. “This is nothing,” he told me after the show. “Wait till you see what’s next.”

He was talking about the iPad, the tablet computer that he introduced in early 2010. Expanding on the touch-based interface of the iPhone, Jobs had pulled off a vision of computing that many (including his rival Microsoft) had been attempting for decades. The iPad instantly established tablet computing as a major category, and as with the iPod, competitors could not match it.
Earlier this year, he took a second medical leave of absence. Tim Cook, the operational wizard who had been appointed chief operating officer, would run the company day to day. Jobs would still be involved in product design and strategic direction, but freed of everyday responsibilities.

Jobs came and went to Apple as he was able, driven in a town car to One Infinite Loop in Cupertino, centerpiece of the campus of the company he built, only a few blocks from where he had gone to school. He would walk past the receptionist and take the elevator to his fourth-floor suite that included his office, a small staff, and a large boardroom where he had overpowered music executives, raked employees over the coals, and approved products that millions adored. With no daily chores to perform, no crowded appointment book, there could be a strange and tranquil sense of timelessness, even as he helped shape products in progress, and dreamed up new ones.

It seemed Jobs had come to terms with his fate. He would spend time with his family and do what he could at Apple.

In June he gave his last “Stevenote,” talking about iCloud. One could have hoped that he would give many more. But on August 24, he sent a note to Apple’s board that he could not resume the CEO role.

He took the role of executive chair and reported that he would continue to participate in product decisions and strategy. But clearly he was headed towards the end that came today, quietly surrounded by the people who loved him and knowing that many millions of people who never met him would miss him desperately. As he told the Stanford students:

Death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new.

The full legacy of Steve Jobs will not be sorted out for a very long time. When employees first talked about Jobs' "reality distortion field," it was a pejorative — they were referring to the way that he got you to sign on to a false truth by the force of his conviction and charisma. But at a certain point the view of the world from Steve Jobs' brain ceased to be distorted. It became an instrument of self-fulfilling prophecy. As product after product emerged from Apple, each one breaking ground and changing our behavior, Steve Jobs' reality field actually came into being. And we all live in it.