Steve Jobs: 1955 – 2011
October 6th, 2011
The news just broke. Gotta go to work. Needless to say, this man influenced the computer industry like none other.
On the occasion of his passing, his commencement speech at Stanford University is a particularly appropriate way to remember him.
Steve’s genius in the mid-1970s was seeing that things could be done better than they were being done.
I didn’t get into Apples until my high school got a lab of them in mid-1983, but all the collateral material — the user guides, the disk labels — plus the form factor and ergonomics of the hardware itself was first-rate, showing that people actually cared about the quality of the product they were making.
The rest of the industry in the 1970s was rather half-assed about all this stuff.
Steve could have worked his magic on any other computer company’s product line. The Apple II itself had its strengths, but also had weaknesses. Had Steve run the Atari or Commodore product offerings, they would have been much better for it.
Steve ran into the IBM PC buzzsaw in the early 1980s, which must have been a humbling experience, though by then the company was committed to the Apple II only because it had nothing to replace it with yet, the Apple III having already failed miserably.
The PC only became usable with the XT, which came out in mid-1983. The Mac was supposed to be out by then, but by that point Apple knew what it was and that it was going to be a better computer than IBM’s.
The PC had the strength of IBM behind it, and Microsoft gave the industry the opportunity to write to the MS-DOS platform running on any IBM-compatible. That tilted the industry toward PCs, thanks to the network effect and to people willing to compromise to save money (compared to Apple’s sky-high pricing in the late 1980s).
Steve’s contribution to Apple (before he came back) ended with getting the Macintosh 512K and LaserWriter out. That was the first semi-usable Mac; Macs with just dot-matrix printing in the mid-1980s were in fact semi-useless compared to what emerged in the late 1980s with more memory, hard drives, larger monitors, better font technology, and multitasking capability.
Moving to NeXT, Steve made a no-compromises machine that was good, but not really good enough compared to the Mac II series, at least for me.
The first NeXT machines were in fact compromised by eschewing hard drives. That was dumb.
They were also compromised by eschewing color output. NeXT could have done a lot better had it gone more the Amiga 3000 route and tried to be more of a generalist, rather than chasing whatever pretensions it had in the personal workstation market, a market that clearly didn’t exist. Still, the NeXT was good enough to give Tim Berners-Lee the tools to build the first web server and client, and it helped inspire the Java and C# managed-code environments.
Apple from 1997 to 2002 was only marking time until the company could create the Next Big Thing. I was at Apple when the iPod came out, and it wasn’t that world-changing, not compared to the iPhone, which had three elements that changed the world: first-class industrial design (almost buttonless, with a large high-PPI multitouch display), very high computational and graphics performance, and the App Store.
Apple was first to put all these pieces together, just as it was first to put the bitmapped display + laser printing + open OS pieces together in the 1980s.
I’m not a computer person so I don’t really know much about Steve Jobs, but I’m turning 51 soon and when someone in their 50s kicks the bucket, I sit up straight and take another handful of vitamins. You turn 50 and you’re entering the country of heart attacks, cancers, and other diseases. An apple a day doesn’t necessarily keep the doctor away.
Well, I recall the 1980s. That’s when I started working at big companies.
Companies liked buying IBM machines back then. IBM held the respect of businesspeople at the time, while Apple was seen as more of a toy or novelty. I think Apple, that is, Jobs, made a huge mistake by clinging to the arrogance of high margins. This created the opportunity for Microsoft to co-opt the form factor and put it in IBM-cloned machines just before IBM themselves rolled out OS/2, which was an excellent operating system.
There’s a lesson in there somewhere. Gates came into the market second, after Jobs, in graphical interfaces, yet Gates managed to steal the market share.
IBM, in turn, came into the market after Gates, yet Gates managed to hang onto the market share with a product clearly inferior to both the Mac architecture and OS/2. In retrospect, IBM, having arrived late, should have given OS/2 away. Gates, for his part, also arriving late, benefited from Apple insisting on high margins, which netted them low volume and almost pushed them out of existence around the time of Jobs’s departure from Apple.
The world would have been a better place if Gates had lost out to either IBM or Apple in the late ’80s or early ’90s. How many millions of productive hours were lost to Windows machines locking up and destroying people’s documents? Thank God Shakespeare didn’t compose Romeo and Juliet on a DOS/Windows machine.
Perhaps because I have a strong liberal arts education and used to design manufacturing applications, some of them very large and complex, what Jobs did doesn’t seem that novel to me. One of those applications was a widespread automated customer order entry system at Caterpillar’s engine division that integrated customer demand, through EDI interfaces that could translate any customer file format into production requirements, with production scheduling and shipping. It went in and worked perfectly the first time after 16 months of work, of which 9 were analysis, 4 were design, and only 3 were development, and it was easily maintained and understood thanks to the documentation I created.
Many, if not most, professions and occupations involve the aligning of things: figuring out what is possible and pulling it all together to pull it off. In 1999 and 2000, when all my friends and I were working at Sprint and buying Palm Pilots or clones thereof, we talked about total digital convergence: one device that did all things. If we could envision that, then Jobs doing it does not seem that novel. What seems novel is that other CEOs didn’t envision it. What made Jobs different was that he owned both a computer hardware manufacturer and a software company, and so could do something about executing the vision. He also functioned as consumer-in-chief and could force engineers and accountants to align.
The iPhone converged a lot of digital activities: computer, internet, cell phone, camera, music, video, etc., but it relied on the invention of a reliable and remarkable touchscreen technology that hadn’t existed for very long. Once that technology was available and the other pieces were alignable with it, the opportunity was there for whichever manufacturer/designer saw it first. But there’s still room for more convergence, though we are getting very close to total digital convergence. Readers are one of the outstanding items; the iPhone form factor is too small for that. Samsung’s Infuse comes much closer, as it has 4G and a bright 4.5″ screen, yet is very thin and light.
I’m still waiting for a device that I can attach to my wrist like a watch, that extends up my arm 6 inches or less, and that provides me with everything (see the Predator’s device). The actual pieces are now there. Marware produces a ‘sport convertible’ cover for iPhones and iPod touches that functions this way, but the strap is made for the upper arm and is too long for the wrist (especially my tiny wrists). Nonetheless, I will be trying this out in the next few months on either my iPod or a used unlocked 3G iPhone.
“I think Apple, that is, Jobs, made a huge mistake by clinging to the arrogance of high margins.”
Steve was already working on what became the NeXT machine in mid-1985 and not running Apple any more.
Apple’s arrogance in the late 1980s came from being first to market with a compelling desktop publishing solution.
I paid $6000 for my Mac IIcx in 1989 and loved every minute I had it. Paid for itself many times with all the work I got for it.
Hand-carried that m-fer + 13″ RGB monitor on the plane to Tokyo!
Apple’s main mistake was not going mass-market with the Mac in the 1980s, waiting to ship the LC until after Windows 3.0 came out in 1990, which stole their thunder a bit. Windows 95 ate their lunch.
“This created the opportunity for Microsoft to co-opt the form factor and put it in IBM-cloned machines just before IBM themselves rolled out OS/2, which was an excellent operating system.”
IBM had the same problem as Apple in the hardware sector: cheap Taiwanese brands (and later, Dell) taking its margins away, thanks to Microsoft licensing MS-DOS to everyone.
OS/2 came out in 1987, and the cloners had already been going gangbusters for a year or two by then. Merits of OS/2 aside, IBM shot itself in the foot in 1987 with its PS/2 fiasco.
“There’s a lesson in there somewhere. Gates came into the market second, after Jobs, in graphical interfaces, yet Gates managed to steal the market share.”
Android is history rhyming.
“Apple insisting on high margins, which netted them low volume and almost pushed them out of existence around the time of Jobs’s departure from Apple”
Jobs left Apple in late 1985, having already lost control of the company in May after failing to get the board to oust Sculley.
Apple was doing OK in the 1990s, but didn’t have the profits to really support much R&D.