
Archive for the ‘Computers and the Internet’ Category

Building Your Own PC, Part 1

August 1st, 2010 22 comments

At school this semester, we organized a Computer Making Club; the school gives any club a semester (4-month) budget of ¥50,000, enough to buy all the components for a decent mid-range computer. I had always wanted to do this kind of thing but never got around to it; in part, I figured there would be technical hurdles requiring a great deal of study, so I put it off until I had time to dive into it. As it turns out, it was not all that difficult–the process is pretty simple, though getting to understand the minutiae of the parts is much more involved. I should say that if you know nothing about computers, and especially if you have a phobia about them, you’ll probably face a fairly steep learning curve. It helps to know something about computer hardware in general beforehand.

Not that I’m suddenly an expert or anything, but fresh from having successfully slapped together my first PC, I’m beginning to get hooked. I thought I might describe the process here–though admittedly, one of the reasons is so I can look back in a few years and laugh at how naive and uninformed I was. But we did build a PC without much hassle or intensive study, and it worked, so it seems worth writing up.

First off, you won’t need to solder anything; it will involve buying perhaps 10 or so different parts that you more or less plug together. They include:

The CPU (the “brain” of the computer)
The Motherboard (the computer’s foundation which everything else connects to)
RAM (the more you have, the more apps you can run)
A Case
A Hard Disk Drive (SSDs are also available, but still expensive)
An Optical Drive (cheap, unless you go Blu-ray)
A Video Card (optional)
A Keyboard and Mouse (cheap)
A Monitor (you might want to go used on this unless you can spare the extra cash)

Once you have the parts, the process is fairly straightforward: install the CPU (with cooler) and RAM on the motherboard, and install the motherboard into the case. Then put the hard disk and optical drives into the appropriate bays. Add the video card if you have one. Connect all the power and data leads, and tie the cables off so they don’t get in the way. Close the case, attach your peripherals (mouse, keyboard, monitor), and start up. If everything works, restart with the OS disk in the optical drive and install the OS. That’s pretty much it in a nutshell.

Sounds easy, right? Well, in principle, it can be, but maybe we were lucky: nothing went wrong. If the computer doesn’t work, however, then I imagine it’s time to start studying. And even though nothing failed outright, we did hit bumps in the road, spending an hour or two figuring our way through a few things–mostly because of the classic shortcoming in the world of computers, namely insufficient documentation. We did have the advantage of people with enough knowledge and experience to figure some stuff out and make the right decisions, though none of us were anywhere near expert at this.

While you likely won’t have to learn the intricacies of the BIOS, and you won’t need to solder or really make anything, there is one thing you should do: familiarize yourself with the parts. Know what they do, and understand the different varieties and categories involved.


The first thing you’ll want to do is decide which CPU you want to use. You could start with the motherboard, and who knows, maybe that’s the smarter choice, but frankly I see the CPU as the better starting place. It is important to realize that the two must fit: different CPUs have different configurations, and will only fit into certain sockets on motherboards. You cannot just plug any CPU into any motherboard.

There are about half a dozen different common socket configurations, based on how many pins/contacts are on the bottom of the CPU, and how they’re laid out. Intel and AMD are the two big makers, and there are 3 or 4 configurations for each; Intel has the LGA775, LGA1156, and the LGA1366, for instance. The LGA775, as an example, is an older type which supports most currently available Intel CPUs including the later Pentiums and Celerons, and the Core 2 CPU line.

Both Intel and AMD make lots of different CPU models. In this post, I’ll just focus on Intel’s, as they tend to be more popular (and adding AMD’s line to the mix would complicate things). Let me give a quick list of major Intel CPUs, ranked very roughly in terms of speed and age:

Celeron
Pentium 4
Celeron D
Pentium D
Core 2 Duo
Core 2 Extreme
Core 2 Quad
Core i3
Core i5
Core i7
Xeon

As if that’s not enough, each one can have dozens of different sub-models. The i5, for example, has the 400, 500, 600, and 700 series, with at least half a dozen different CPUs in all. The Core 2 line includes well over a hundred different CPU models released over the past five years.

Each chip has different features, and it can be difficult to figure out which is faster than which–primarily because speed is determined by a variety of factors, including the number of processing cores, the amount of cache (on-chip memory storage), the clock speed (in GHz), and the processor architecture in general. You might think that a 6-core AMD Phenom II at 3.2 GHz would outperform a 4-core Intel i5 at 2.66 GHz, but it doesn’t. The page I just linked to shows comparative benchmarks, so you can get a better idea of what outperforms what.

Fortunately, your choice is simplified by a few factors. First, the Celerons and Pentiums are outdated; fine if you want to buy cheap, used parts and don’t care much about performance, but otherwise forget it. On the other end, chips like the i7 and the Xeon can be costly; unless you’re building a speed demon and price is no object, you’ll likely stay away from those as well. For someone on a budget, the Core 2 Duo, Core 2 Quad, i3, or i5 are probably your best bets. After that, look for what’s available in your price range, and you’ll find you have a much shorter list to choose from. While some chips are much faster than others, CPUs in a similar range will differ only slightly in speed–so you probably shouldn’t fret too much over which is faster unless you’re trying to squeeze every last drop of performance from your system.

If you want to make a computer for less than ¥50,000, for example, the Core 2 Duo might suit best; for less than ¥70,000, the Core i5 would be nice. (I am figuring prices based on mostly new parts and including a video card.) If you want to make a super-cheap computer, then you’re in Celeron and Pentium range, and it starts making more sense to look at used parts.

Next, the motherboard.

If you go for the Core 2 Duo, as we did (budget concerns), you will need a motherboard with an LGA775 socket. They generally begin at just under ¥5000 and go way up from there. The model we got–a Foxconn G41MX-K–has an OK set of features. It uses the G41 chipset (a chipset is, as it sounds, the set of chips that work together on a motherboard to control its main functions), not great but respectable; it has a standard set of expansion slots (these let you plug in cards, like video cards, to improve performance; a PCI Express x16 slot is something you’ll want to have); and it uses DDR2 RAM (a type of memory; DDR3 is newer and faster, but not compatible).

Though your CPU will determine the socket type, you should also decide early on which motherboard form factor you want. The main types are based on ATX, the favorite for DIY computer building. The variants include (from largest to smallest) Extended ATX, standard ATX, and MicroATX. Also available is Mini-ITX, a low-power form factor. If you want a smaller box, then MicroATX may suit you best. That and standard ATX seem to be the most prevalent.

The motherboard also has most of the computer’s ports built into its rear edge (in the image above, see the left side near the top); ours had the standard legacy PS/2 and serial ports (for old mice and keyboards), four USB 2.0 ports, a network port, 8-channel sound, and two video-out ports (one classic analog VGA and one newer digital DVI). More ports can be added in other ways–for example, most cases come with built-in USB ports in the front, which attach directly to the motherboard via interior cables.

We didn’t need HDMI because that would be available on the video card. FireWire is nice, but is being phased out due to USB’s dominance. USB 3.0 is dandy, but only needed if you have super-fast equipment.

Again, pricing narrows down your options–if you are on a budget, the dizzying array of choices becomes a bit simpler. Ours were shaved down to maybe a dozen or so, as we wanted to stay within ¥5,500.

After that, you get RAM. 2GB is generally enough for most purposes today, but I prefer maxing out as the budget allows. We went for 4GB, which should keep the machine viable for a year or two longer. An important point is how many RAM slots the motherboard has; ours had 2 (some have 4 or more), so we had to get two 2GB modules. Fortunately, the price per GB was no higher that way, though 4GB modules start getting pricey. We got our two 2GB modules for about ¥8000.


These three elements–CPU, Motherboard, and RAM–make up the heart of the computer, and were the most expensive parts (roughly half of our ¥46,000 hardware budget).
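
As a quick sanity check on that “roughly half” figure, here is the arithmetic using the approximate prices mentioned in this post (the cooler price comes up a bit further down); a rough sketch in Python, not an exact accounting:

```python
# Rough check of "roughly half the budget," using the approximate
# prices mentioned in this post (all figures in yen).

cpu    = 8000   # used Core 2 Duo E7400
cooler = 2000   # bought separately, since the used CPU came without one
board  = 5500   # Foxconn G41MX-K (our price ceiling for the motherboard)
ram    = 8000   # two 2GB DDR2 modules

core_parts = cpu + cooler + board + ram
budget = 46000

print(core_parts)                              # 23500
print(f"{core_parts / budget:.0%} of budget")  # about 51%
```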

Before working with them, you should prepare a non-conductive work surface. We used a cheap plastic dish rack lined with bubble wrap, though you might want to prepare something a tad fancier. Be careful handling surfaces and contacts; it’s best to wear an antistatic wrist strap, as a static discharge could damage the components.

Installing the CPU on the motherboard is relatively easy: open the clamp on the socket, remove the protective shield, drop the CPU into the socket, then clamp down again. Simple.

However, most CPUs require cooling units, big ol’ heat radiators with fans. We bought our CPU used (a Core 2 Duo E7400) for ¥8000, thinking we were saving money–before finding out that new retail CPUs come with a cooler and ours didn’t, so we had to spend another ¥2000 and ended up saving nothing.

Installing the CPU cooler was a real bear. First, you must not forget to apply thermal grease (also called thermal compound) to the base of the cooler after making sure the contact surfaces are clean. Without the grease, your CPU will overheat and die. It usually comes supplied with the CPU cooler. It’s probably a good idea not to let it come into contact with your skin.

The cooler attaches with four legs, or push-pins, which you press down and twist until they lock. There’s probably a trick to doing that, but we sure couldn’t figure it out. The first two were easy; the third took a while, and the last one almost killed us. Once the cooler is in place, you have to plug in its power connector; these things are diagrammed in the motherboard manual for you.

RAM, on the other hand, is a cinch. Make sure it is oriented the right way, then just push it down into the slot until it locks. Presto.

There’s the heart of your computer, right there.

In a following post, I’ll talk about choosing a case and power supply, installing the motherboard into the case, and then adding the hard disk and optical drives. Fortunately, these are somewhat easier to learn about.


Yep. Dell.

July 1st, 2010 3 comments

I found this story in the NYTimes via Google News yesterday:

After the math department at the University of Texas noticed some of its Dell computers failing, Dell examined the machines. The company came up with an unusual reason for the computers’ demise: the school had overtaxed the machines by making them perform difficult math calculations.

Dell, however, had actually sent the university, in Austin, desktop PCs riddled with faulty electrical components that were leaking chemicals and causing the malfunctions. Dell sold millions of these computers from 2003 to 2005 to major companies like Wal-Mart and Wells Fargo, institutions like the Mayo Clinic and small businesses.

“The funny thing was that every one of them went bad at the same time,” said Greg Barry, the president of PointSolve, a technology services company near Philadelphia that had bought dozens. “It’s unheard-of, but Dell didn’t seem to recognize this as a problem at the time.” …

A study by Dell found that OptiPlex computers affected by the bad capacitors were expected to cause problems up to 97 percent of the time over a three-year period, according to the lawsuit.

Gee whiz, that sounds familiar.

At that time, we had a computer lab full of Dells at my school. And just after their 3-year warranty expired, the majority of the computers in the lab failed within a 2-month period. It was plainly evident that something was going wrong; we had 18 computers in the lab, and every week two or three of them would fail in exactly the same way: you turn on the PC, the fan goes into hurricane-force mode, the machine is frozen, and you cannot shut it down short of unplugging the thing.

I asked our IT guy about it yesterday, but he just shook his head and said, “sho ga nai”–Japanese for “it can’t be helped.” He didn’t think we could make anything of it. If it were me, I’d be calling Dell and saying, “Hey, I just read this article and recalled all those computers you sold us, which all failed at the same time in the same way.”

But then, in the U.S., they’re having to sue Dell over this, so Dell would not be likely to comp us anything willingly. Plus, this is Japan, so good luck suing anybody.

It would surprise no one reading this blog that I would long ago have switched our school to Macs had that been in any way my choice. But it’s not my choice, and to my dismay, the school is still, to this day, buying Dells. Not the smartest purchasing strategy, dude.


Good to See Microsoft Is Still Innovating

June 29th, 2010 1 comment

One of the key new features in Microsoft’s next OS: an App Store!

Sure is nice to see that Microsoft has stopped copying Apple.

Well, they are doing some innovating: Project Natal, now known as “Kinect,” may be integrated. The feature: you get logged in automatically when you sit down, and the computer goes to sleep when you walk away. Hopefully, all that tech will be used for more than just that. Sure, such a feature would be useful in some circumstances, but most people wouldn’t care much.

It’s definitely more for an office environment. But in the end, all it really does is save you a few keystrokes a day, maybe a minute or two over the whole workday, if (a) you are in a multi-user environment, (b) you want to maintain privacy, and (c) you leave and come back to your computer several times a day. Even at that, it’s a small benefit at best. Seems like a waste of tech to me.

The “near instantaneous” startup feature sounds best–but one has to wonder what the details are. Unless they’re talking about non-volatile RAM, it’s probably simply a matter of making “sleep” the default option instead of “shut down,” and then improving wake-from-sleep times. Me, I don’t have any problems with waking from sleep on my Macs. It’s already nearly instantaneous on my iMac (less than a second), and takes only about two seconds on my MacBook Pro. I hardly even notice it.

So, all these new features sound like they wouldn’t be much use at all to me. I already have the first and the third, and don’t care about the second one. Some businesses may make use of the facial recognition, but to me it comes across as one of those cool-but-specialized features most people like to show off but then never really use, like the fingerprint scanners on laptops and such.

Presumably, Microsoft has more than this up their sleeve.


Touch Computers and Changing Form Paradigms

June 24th, 2010 3 comments

As of late, there has been some debate over how multi-touch computers will evolve. Microsoft has built multi-touch into their Windows 7 OS, but it is as if they don’t really know what to do with it from there. As has often been pointed out, a touch screen on a traditional computer is not really feasible. You simply won’t want to hold your arms out to touch a computer screen for any length of time, and will quickly revert to the keyboard and either a mouse or a trackpad. Your hands want to touch the table, not a wall.

But it seems like everyone is missing a key element here: we do not need to keep the current form factor of PCs. Right now, there are a variety, but all consist of a screen which is more or less vertical, perpendicular to the desktop. Even the laptop form is not perfect for multitouch, yet we expect that will persist as well.

Why?

The advantages of a touch-screen interface are fairly clear, or should be. So the question becomes, why stay married to form factors which are not consistent with the new technology? Why not simply change the way a computer looks and operates, if that form will function much better? Up until now, the proposals have not been ideal. Jobs and Apple have come up with the best short-term (or small-format) form: the iPad, which you can place in whatever position suits you best.

For a larger computer, however–say a touchscreen more than 20 inches across–the tablet form will not work; it needs to be planted on a desk somehow. Microsoft has their “Big Ass Table” concept (which they call “Surface,” but I like “Big Ass Table” much more), but that’s a nonstarter. Aside from being horrendously big and expensive, you won’t want your workspace to be flat like that. Instead, you’re going to want a computer which sits at an angle, easy to both view and touch, or even to rest your hands on.


This form factor presents itself naturally. Imagine buying a computer in, say, 2013–a 30” screen which is more or less presented to you just as that, a large visual surface which constitutes your work area. The guts of the computer (if they will not fully fit in the screen) will be in the base, which acts as a pedestal. No physical keyboard, no mouse, nothing visible but this slab of a screen, tilted gently up in front of you. You turn it on (it awakens instantly, no boot-up), and it’s just there. The interface is minimal, giving you all you need to work and nothing more, getting out of the way as much as possible. Perhaps the screen could, if you so wanted, detach from the base unit in which the computer itself is housed (wireless video connection) and you could lean back with the screen on your lap, big as it might be.

Look at the video of Jeff Han at TED in 2006, giving the world a sneak preview of multi-touch only a year before Steve Jobs used it in the iPhone. Note how he has the computer set up: that’s probably the way computers will be.


MS Word 2007, 2010 and MLA References

June 21st, 2010 5 comments

When Office 2007 was announced, there was a feature I was really excited about: References. A whole tab on the ribbon is dedicated to them, allowing you to choose your style and insert and manage citations. I was very happy at this news, because one of the things we try to get students to do in our program is make MLA references–but we have all kinds of problems. Chiefly, the students are not used to making citations (they don’t learn it here like we do in the U.S.), and MLA citations can be very difficult and complex, depending on the source. I imagined that MS Word 2007 would have an editor that would allow you to choose your citation type (e.g., book, periodical, etc.) from an exhaustive list taken directly from the MLA listing, and then prompt you for all the relevant data, and then automatically insert your in-text citations and the Works Cited list, all formatted to MLA standards.

Boy, was I ever disappointed.

First, the list of MLA citation types in MS Word’s dialog box comes up in one of those incurably idiotic miniature scroll windows that show only six lines at a time, and should have been done away with 10 years ago. Second, the list of citation types is incomplete; for example, our students rely heavily on electronic sources, particularly from library database subscriptions, and Word’s MLA list does not allow for these. Third, citations are not inserted intuitively; since one is citing a stretch of text, you would expect to select the cited text and insert–but that just deletes the text you selected and replaces it with the citation. Fourth, the Works Cited list is not automatically added at the end–you have to insert it, and even though it is required to start on a new page at the end of the document, Word will not create that area, instead putting it wherever your cursor happens to be. Fifth, the Works Cited list is not formatted correctly–the title should be centered, the list double-spaced, everything in 12-point text–and it’s not. Worse, Word applies styling, like bold and blue text, where it should not be. Dates are not expressed correctly, web page article titles are not included, etc., etc. Sixth, the whole thing is done in a field, which makes it extremely difficult to edit and keep straight when you go in to fix what Word got wrong. Seventh… well, the list goes on, and on. You get the idea.

In short, it’s a failure, a mess, a complete disaster which only makes adding correct citations harder to do, not easier. Disappointed, I had to steer my students away from it.

But there was hope: maybe Microsoft would improve it with the next version of Office. Certainly I could not be the only one to notice how awful it was, and Microsoft would get off their butts and make the next iteration much better.

Nope.

I found that my college’s Citrix account had upgraded to Office 2010, so I went in and checked it out. Not only has it not improved even the tiniest bit, it didn’t even upgrade to the 7th edition of MLA, which made major changes in how citations are written. True, the 7th edition came out only about a year and a half ago, but certainly they could have done something. But nope–not only does Word’s Reference feature still suck, it now sucks and is out of date.


Stop Trying to Help Me

June 12th, 2010 1 comment

One of the things that is annoying about living overseas is that when you surf the web, many sites “helpfully” detect your location, and switch you over to a version of the web site native to the country you’re in. For me, that’s Japan. For example, if I go to “http://www.youtube.com,” it’s all in Japanese. I can set the language to English, but for some reason, it won’t allow me to use America as a location–I have to say I am in the UK to get close to the focus on English videos that I want. And that setting will time out, so every week or two I have to re-set the language. Skype is the same way–whenever I visit their site, it’s in Japanese, and I have to reset the language there as well.

While this is all an annoyance (as are most attempts by programmers to be aggressively “helpful”), at least it is correctable–you can always find a way to steer back to a version of the site in your language.

Unless you are at Gizmodo, that is. In the past, they implemented the “we’re going to help you by redirecting you to our Japanese site” protocol, but put little flags at the top of the page that would allow you to navigate back. That worked for a while. But for the past 3-4 days, the flags don’t work. I even directly type in “http://us.gizmodo.com,” and it still steers me to the Japanese site. Apparently, no one outside the U.S. (or in Japan, at least) is allowed to see what’s on their U.S. site.

The solution is simple: I’m removing them from my bookmarks.

Thanks for being so helpful, Gizmodo!

HP Buys Palm

April 29th, 2010 2 comments

They actually seem to get it: if you want a successful touch-based tablet device, you can’t succeed with a mouse-based PC operating system. That’s one of the reasons tablets failed before the iPad: they were PCs trying to act like tablets. Apple was the first to realize that tablets were waiting for multitouch, and multitouch was waiting for tablets, and that neither could succeed without the other, at least not at first. So HP woke up and said, “Crap, we’ve gotta get an OS that doesn’t suck on a tablet!” And so they bought Palm, probably because it has the next most-used touch-based OS after the iPhone and Android, neither of which they could buy and control. Whether it’s a good enough OS and will work for HP, or whether HP can make it work, is another question.


Looking for Suggestions on Buying a Windows PC

April 22nd, 2010 6 comments

Yep, that’s right. In this case, a Mac doesn’t fit, and while I have some fair leads myself, I want to investigate every possible avenue to give the best advice to a friend.

That friend is looking for a new Windows laptop. Size should be 14 to 16 inches, price under a thousand dollars; it doesn’t need to be a powerhouse, but something reasonable. Form factor is not a heavy consideration–this will likely be used on a desk at home most of the time (it needs to be a laptop because the user will be moving to a new country with it)–but a relatively slim and light machine would be nice, so long as that doesn’t jack up the price.

As for specs, the user will not be doing heavy lifting, so something like a basic Core 2 Duo with 4 gigs of RAM and the usual accoutrements (WiFi, webcam, etc.) should do nicely. Extra bells and whistles like Blu-ray or HDMI ports are not bad, but neither are they really necessary. More important is a good, solid build that will last over time, and a generally comfortable user experience. Win 7 is acceptable, but an XP setup would be best, as the person is most familiar with that and would probably not want to deal with post-XP disorientation.

Thanks in advance!


Oh Crap, I Hope Not

April 17th, 2010 Comments off

Someone has suggested that Apple will like the money made from its “iAds” service so much that they (or others, perhaps) will begin to include the ads in paid apps, or even the OS.

Good sweet dear lord no.

The thing is, that has just the right ring to it. Think of DVDs, or going to see movies at the theater, or a multitude of other things that you pay good money for, and yet they come loaded with ads. Much of the time it is not about making enough money to stay afloat (newspapers might be an exception there), but instead is just about making more profit.

I can only hope that Apple’s noted sense of good taste would prevent them from following their equally noted drive to make more money wherever they can, in this case at least. There’s a lot of crap I’ll put up with, but ads in paid apps or the OS?

I had not imagined this before, but now I can think of something that would make me very possibly switch away from Apple, and to Linux.

In the spirit of killing ads in stuff you paid enough for to get ad-free, here’s a post with instructions on how to skip those annoying DVD previews and ads–you know, the ones that DVD players are designed to make you powerless to stop. (Gawd, I hate designs that work against the user.) I always did it by pressing “skip forward” when an ad starts, but I’d have to do that multiple times and the whole process would take a good minute or so. The stop-play or stop-stop-play methods (among others) look a lot faster. Give it a look.

Windows 7 Suffering from Vista Perceptions?

April 3rd, 2010 5 comments

Interesting new data out: Windows 7 is not exactly selling like hotcakes. Sure, it’s selling much better than Vista, which only gained 1% market share per month, but it’s still not selling much faster than that. As the chart below indicates, it seems that 7 moved from 2% penetration at retail launch (when many were using the free beta) to 10% after 5 months–a rate of 1.6% per month. Not exactly flying off the shelves, especially considering that most of the adoption is likely from new PC sales rather than from upgrades.

[Chart: Windows Vista vs. Windows 7 market share over their first months on sale]

This recently came to my attention when I asked students in my computer classes which OS they used; of the Windows users, only a few had Windows 7, only a few more had Vista–the majority were using XP. Ironically, one of the Windows 7 users was someone who had just bought a Mac (and was using 7 on Parallels–he just switched from XP on a PC).

To put this into perspective, in order to reach Windows XP’s current 65% market share, Windows 7 will require roughly 40.6 months–or almost three and a half years. Mac OS adoption occurs much faster; it took Leopard and Snow Leopard just 27 months to reach 81% share of Mac OS users.
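
For what it’s worth, here is the back-of-the-envelope arithmetic behind those figures, written out as a small Python sketch; it assumes the growth rate stays roughly linear, which is the (big) assumption the projection rests on:

```python
# Windows 7 adoption math from the paragraphs above,
# assuming a roughly linear growth rate.

launch_share  = 2.0    # % share at retail launch (mostly free-beta users)
current_share = 10.0   # % share five months later
months        = 5

rate = (current_share - launch_share) / months
print(f"{rate:.1f}% per month")         # 1.6% per month

xp_share = 65.0                         # XP's current share of the market
print(f"{xp_share / rate:.1f} months")  # 40.6 months, almost 3.5 years
```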

I’ve been using Windows 7 recently, and to me, it’s a pretty darn good OS. I agree with the general consensus, that 7 is what Vista should have been. Granted, 7 rips off the Mac OS far more than I had previously suspected–perhaps why I like it more–but nevertheless, I see no independent reason to use XP anymore, and if my school’s computers weren’t still XP-bound, requiring me to teach that OS, I would switch over completely.

Which makes me wonder: why hasn’t 7 taken off? It’s certainly good enough, stable enough. OK, so it’s not nearly as cheap as upgrading to Snow Leopard, but upgrade prices aren’t that bad. So why the lukewarm adoption rate? The only thing I can figure is that people are now so locked into the Vista mindset–that any post-XP version of Windows is crap–that they don’t even consider moving to 7. If true, that’s pretty damaging to Microsoft, especially with the halo effect for Apple becoming stronger and stronger. The iPad looks like it may carry iPhone levels of success into the computer arena. If tons of people get iPads, they will probably be even more likely to make their next computer purchase a Mac than was true with the iPhone or iPod. The fact that Microsoft is now more or less refusing to develop a version of Office for the iPad is not going to help them much, what with the $30 iWork office suite being available on what is bound to be a big hit in the mobile computing space.

Microsoft is far from crumbling into dust, but these recent numbers do seem to signify some calcification; with Apple springing around with innovation after innovation, and Microsoft’s new stuff being either slow to catch on (Windows 7), a year away (Windows Phone 7), or still mired in the concept stage (the Courier tablet), it certainly does not look too good.


Standards

March 18th, 2010 1 comment

IE9 is now being tested, the third big update to Microsoft’s browser in the past few years. Unfortunately, it’s as much a joke as the others. The test drive page itself shows an Acid3 test–and IE9 scores a dismal 55%. While that’s an improvement over the laughable 20% IE8 coughs up, it’s still a joke. Four years of furious work on this app, and Microsoft can’t do better than 55% on a web standards test? Chrome and Safari score 100%, Opera 99%, and Firefox 93% on my computer. Is Microsoft simply incompetent, or do they truly want to break standards?

As most web designers would agree, IE makes designing web sites harder than it should be. It’d be great if (a) more people knew what a piece of junk it is, and (b) the whole world required Microsoft to give users the browser-choice screen that Europe requires. It’d be great if Apple did the same thing, BTW.


Ideas Apple Stole from Windows

March 6th, 2010 3 comments

Computerworld, known for their occasional slanted reporting, does it again in style when reporting on the “Top 10 features that Apple stole from Windows.” In fact, they just reprinted the list from an InfoWorld article from last October–but what makes it pretty pathetic is that they didn’t bother to fact-check what was roundly criticized as a badly-written article. I swear, there seems to be hardly any editorial filtering anymore.

The list provides a few solid cases of Apple swiping ideas from Microsoft, but some charges are backwards and others so bizarre as to be staggering. A quick overview:

1. Apple’s Finder Sidebar is really the Windows Navigation pane. This is mostly true. Tree Directories are a pretty old concept, going back to UNIX days. What Apple stole was the idea of putting a jump-to navigation area in a sidebar on the left side of file management (“Finder”) windows.

2. The Mac Path bar is a copy of the Windows Address bar. This is at best a stretch. Paths predate Windows, and Apple’s path display is not that much like Windows’. You can only say that Apple “copied” it because it put the information in a file management window. But such a window is the only logical place for such a feature, and Apple varied from Windows about as much as one can imagine in what is essentially a classic OS element. It would be like saying that this year’s Toyotas stole from last year’s Hondas by putting handles on the car door.

3. Apple copied Windows’ Back and Forward navigation buttons in its folder windows. Um, no. Windows took that from its own Internet Explorer, which brazenly stole them from Netscape Navigator, which got the idea from the original hyperlink software. It’s an idea that goes way back. Microsoft put that feature into its OS as part of integrating the browser so deeply that it could not be separated, and in so doing killed off the competition in a rather illegal manner. Not to mention, back and forward buttons are a pretty dead-basic concept.

4. Apple minimizes a window to app icons. Actually, NeXT did this first, and NeXT is the precursor to OS X.

5. Apple has Screen Sharing, copying Windows’ Remote Desktop Connection. Um, no: Timbuktu had screen sharing on the Mac way before Windows got the same thing, and it was around on older OS software (e.g., Remote Login) before that.

6. Time Machine is really Backup and Restore. Backing up data? Really? Again, it’s like saying that Mazda stole brakes from Ford.

7. Apple’s System Preferences are a rip-off of Windows’ Control Panel. This is a real “WTF?” moment. Apple’s original Mac OS had something actually called a “Control Panel,” which Microsoft blatantly copied from Apple–in almost its exact form. Then again, older OSes grouped preferences together, so the idea is not new–but Apple copied nothing from Microsoft here, while Microsoft clearly ripped off Apple’s presentation.

8. Apple has support for Microsoft’s ActiveSync and Exchange 2007. Again, WTF? These are licensed technologies. Apple no more “stole” them than Microsoft “stole” TrueType fonts or support for FireWire.

9. Apple’s Command-Tab rips off Windows’ Alt-Tab. FINALLY, here’s something that Apple blatantly stole from Windows. Probably the only clear-cut theft in the entire list.

10. Apple’s Terminal is Windows’ Command Prompt. Once again, WTF. Seriously. UNIX, anyone? Heck, I think Apple’s first computer, before Microsoft even had an OS, had a command prompt.


In the world of computers, there is a lot of borrowing and stealing, but creating “top ten” lists equating Apple’s theft of OS ideas from Microsoft with Microsoft’s theft from Apple smacks of false equivalence–trying to be “fair and balanced” by saying “both sides are equally bad” when that is clearly not the case. Everyone has ripped off ideas from everyone else, but there is no question that Microsoft is the champion of ripping things off.

Some claim that Apple ripped off Microsoft’s Task Bar with its Dock–but that’s kind of like saying that the Segway ripped off its idea from roller skates. Microsoft, however, did rip off Apple’s Dock in Windows 7’s Task Bar remake. Aero Peek and especially Flip 3D are blatant rip-offs of Apple’s Exposé, and much of Windows’ basic design is stolen from Apple’s original implementation of the GUI.

Some say Apple stole from third parties–most notably that they stole Dashboard and its widgets from Konfabulator. However, Apple didn’t steal it as much as it reclaimed it–Konfabulator “stole” the idea from Apple’s original Desktop Accessories feature. And I would not be at all surprised if that idea had been present in some form somewhere else.

Even some rip-offs are not as much a rip-off as one would imagine. Take the GUI, for example–many would say that Microsoft stole it from Apple, seeing it in the original 1983 Lisa and then rushing to put Windows 1.0 on the market. But then others will point out that Apple ripped off the GUI from Xerox. That’s not exactly true, however–Apple hired Jef Raskin, who pointed Apple to Xerox PARC, but Raskin had brought some of those ideas to Xerox in the first place–and those ideas stem from work done by Douglas Engelbart at SRI as far back as the late 60’s. Engelbart invented the mouse–not Xerox–and Apple paid SRI, Engelbart’s employer, for use of the patented device.

The idea of stealing in the OS world is a bit of a spectrum: on one side of the spectrum, you have features which are natural ideas which would be difficult to do any other way–like expressions of the directory path, for example. These are things that can’t be stolen any more than you can “steal” the idea of some kind of steering device on a vehicle. On the other end of the spectrum, you have either unique features or very specific implementations of basic features which can very much be ripped off. Microsoft happens to regularly inhabit that end of the spectrum, more than just about anyone else. Internet Explorer was nothing but a rip-off of Netscape Navigator. Apple steals, but it does so less. When it does, it is usually either a feature widely recognized as useful, or it is recreated with new functionality. The theft of Microsoft’s alt-tab window switcher is an excellent example of both: it was a feature that was a no-brainer to include, and Apple did a much better job of implementation, both graphically (admit it, Apple’s version looks ten times better) and functionally (e.g., Apple allows you to quit programs while going through the list). Not that they didn’t rip it off, of course–they very much did.

Competency 101: Doing the Obvious

February 24th, 2010 2 comments

In 2002, CEOs from the leading technology companies in America called on the federal government to adopt a goal of bringing 100 Mbps Internet connections to 100 million homes and small businesses by 2010. This was hardly a pipe dream: Japan’s “e-Japan” policy called for 30 Mbps nationwide by 2005–and they achieved it a year early. 100 Mbps connections are now ubiquitous here, and 1 Gbps connections have been available for more than a year. Of course, this is easier to do in Japan, but it is hardly impossible in America.

Bush gabbed about such goals in 2004 and probably at other times, but never actually did anything. This was typical of Bush where high-minded tech and science goals were involved: taking credit for calling for stuff but then never funding it or moving forward in any meaningful way. For six years after the industry leaders called for a federal plan, Bush did jack about it–and so the U.S. now lags behind lots of other countries, when it should be in the vanguard.

In comes Obama, and a year after taking office, he is doing what Bush should have done eight years ago: actually moving forward with something. The FCC, which under Bush actually hindered progress, is moving to demand nationwide 100 Mbps Internet access by 2020. Yeah, pretty late–but that’s what happens when the previous administration trashes the place: you have to start from scratch. The important thing is, the Obama administration realized that something had to be done, and it is doing it.


Brave New World

February 14th, 2010 2 comments

As people talk more and more about the ups and downs of the Apple ecosystem–the closed nature of the App Store on the iPhone and soon the iPad–one theme always comes up: Apple is being oppressive and controlling. This viewpoint, however, comes from the perspective of what we have had up until now, which is not entirely objective–nor is it without its own ups and downs. It helps to step back and take a look at the bigger picture, trying to understand the forest instead of noting vague shapes beyond the individual trees we’ve come to feel comfortable around.

Think of the current system and then the App Store ecosystem as societies. Our current setup is, to be frank, kind of like a Joss Whedon-style dystopian anarchy with overtones of corporate oligarchy. Competing major corporations (Microsoft, Apple, Google, etc.) offer the only real structure to what’s happening, and the denizens of this society often align themselves with these organizations. However, most of society is independent, trying to live freely on their own in the anarchy that exists outside the immediate corporate structures–but they can’t escape some level of corporate control, as they depend on what the corporations produce. They grumble about the prices they have to pay to the oligarchy and the way things are run.

For that reason, many join the pirate culture, stealing from the corporations because they can, and because they feel they have paid enough already and are entitled to. But anarchy means that it’s not just the pirates stealing from the corporations–lawlessness abounds everywhere. Most people are beset by malware and scammer crime, and live amongst mountains of spam littering the streets lined with gaudy neon Flash billboards. They must hire anti-virus bodyguards and yet still watch their wallets and not fall prey to lures. Once in a while you may even be targeted by a professional hacker, god help you. Just as the anarchy allows you to be a pirate without much fear of punishment, the anarchy lets the element aimed at you work just as freely. Some avoid this by living closer to the oligarchy and paying full price for everything, others attempt to inhabit the Apple and Linux islands of relative stability. The Apple island has high rent, but it’s even easier to be a pirate and you’re safer from the anarchy pointed at you–but you get branded as an elitist snob who is a willing slave to Apple. The Linux island is sparsely populated and not well-supplied, but has more independence and is less stigmatized.

At some point, Apple declares that they’re forming a new state, the App Store Federation. It’s a territory pioneered by the iPhone contingent, soon to be joined by the iPad population, and who knows where it will expand to next. This new state has a rather structured form of government, introducing regular but not too excessive taxes–you’d be paying about the same most of the time in the anarchy anyway, unless you were really good at working the system just right. Apple is the government, and the OS is the constitution. They exert a certain amount of control, and they make the laws. It’s not a Democracy, it’s more like a benevolent dictatorship. But it’s clean, safe, and simple to live in. They’re not oppressive–they don’t arrest you or impose fines for misbehavior–but they do try to make you live the way they feel is best. You may not agree with what the government dictates, but most of the time it’s pretty good. There’s a certain amount of censorship to go along with it.

The society is nice, modern, bright, and relatively clean. As with the Apple island in the anarchic oligarchy, the rent is high. However, food, clothing, and entertainment are pretty cheap–mostly cheaper than you paid before. It’s harder to be a pirate, but there’s also a police force to keep you safe. While there’s still quite a lot of spam litter and some scam artists lurking around, government regulation keeps Flash ads from making things seedy and the police force keeps most of the crime under control. You feel safer walking the streets. It’s a more comfortable life, but those who enjoyed the freedom of the anarchy feel chafed by the level of control exercised here. That’s the trade-off. If you don’t like that level of control by the government, you can always go back to the anarchy–but you lose the benefits of living here. There are some in the anarchy who try to replicate the Ecosystem without the control, but they tend to be expensive themselves, and as copycats trying to make a quick buck, they are not as stable, with shaky foundations and only superficial wealth. Google is making the best go of it, but is a bit disorganized and split between its Chrome and Android personalities.

But people often want the best of both worlds–they want the nice, clean, safe, and modern lifestyle the Apple ecosystem provides, but they also want the free-wheeling, independent, live-as-you-like and do-what-you-want lifestyle the anarchy afforded. So a splinter group formed the Jailbreak community, setting up in the foothills just outside the Apple ecosystem, living off the controlled lifestyle but at the same time sticking it to the man–who discourages the practice and tries to cut off their supplies from time to time, but otherwise just kind of lets them be. Most people commute, living partly in the Apple Ecosystem and partly out, so the control isn’t so bad even for those whom it chafes. But people can foresee a time when they may have to choose permanent residency, and are wary about what that would be like.

Apple is experimenting with a new computing culture, and computing society is reacting to it, forming new communities around it. The other major corporations are looking on warily, knowing that most of their business is still safe at the moment, but also aware that this could grow into something bigger later on. If enough people are drawn to the Apple ecosystem, it could become the new paradigm, replacing the old anarchic oligarchy with something new. Google is trying to set up its own ecosystem, but they’re less organized. Microsoft, meanwhile, just wants to maintain their current dominance in the oligarchy, but is willing to change systems if they see that things are moving that way–they’re used to watching Apple’s lead and moving in if there’s profit to be had.

Expect Apple to eventually bring the Ecosystem culture from the mobile community to computing at large–either by bringing it to laptop and desktop computers, or by having mobile devices become primary computing machines. I doubt very much that they’ll want to stop with the iPad–this system is too good for them, if they can make it work.

Where would you like to live in this world?


OS Adoption

February 10th, 2010 2 comments

A recent survey taken by a gaming site claims that Windows 7, after just three months in retail, has already been adopted by 29% of Windows users. That sounds impressive, except for a few small points: first, the survey was of gamers, and although the site tags gamers as “deeply suspicious,” they are nevertheless not a representative sample of the market as a whole and are more, not less, likely to adopt a new OS version than the general public. And second, the same report shows 43% of PC users still using XP, an OS nearly a decade old. Not an impressive statistic.

More objective numbers tell a worse story for Windows: according to Net Applications (which changed its Mac-to-PC methodology recently, but still is a good indicator of use within each OS sphere), a full 72% of all Windows users are still using XP, an OS that was released in 2001. And while gamers may have already voted for Windows 7 over Vista, most people haven’t; 19% still run Vista, as opposed to 8% running Windows 7.

What’s really odd is that Windows 7 has grown almost exclusively at the expense of XP–which means that while Vista isn’t growing (not surprisingly), neither are Vista users switching to 7. Virtually all of the people switching to Windows 7 are those upgrading from a 9-year-old OS–and much of that is likely due to people simply buying a new computer with Windows 7 installed by default. And while Windows 7 is seeing a growth rate double that of Vista’s, it’s still only 2% per month–meaning that at this rate (which seems to be holding steady so far), Windows 7 will reach 50% adoption in 21 months. So by the time Windows 7 has been out for two years, only about half of Windows users will likely have switched to it–even though it launched with some 70% of Windows users stuck on a decade-old OS.

On the Mac side, adoption of new OS versions is much stronger. Despite Snow Leopard offering very few visible new features, already 35% of Mac users have upgraded (an average of 6% per month). OS 10.5 users dominate with 46%, meaning that 81% of Mac users are running an OS released since 2007 (as opposed to 27% of Windows users doing the same), and adding in Tiger (10.4), 96% of Mac users have an OS released since 2005. Snow Leopard is currently growing at a steady 4% per month, meaning that it will have reached 50% adoption in just 9 months since release.
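
The same straight-line extrapolation as in the earlier Windows 7 post, assuming these monthly rates simply hold steady; the “about 5.5 months since release” figure for Snow Leopard is my own rough estimate, so treat this as a sketch rather than a forecast:

```python
# Straight-line projections from the figures above,
# assuming the current monthly growth rates hold.

# Windows 7: ~8% of Windows users, growing ~2% per month.
win7_share, win7_rate = 8.0, 2.0
print((50.0 - win7_share) / win7_rate)   # 21.0 more months to reach 50%

# Snow Leopard: ~35% of Mac users about 5.5 months after release,
# growing ~4% per month.
sl_share, sl_rate, months_out = 35.0, 4.0, 5.5
print(months_out + (50.0 - sl_share) / sl_rate)  # about 9 months total to 50%
```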

Just FYI.


Why the iPad Is Deceptively Good

January 31st, 2010 5 comments

A lot of people are panning the iPad, voicing a variety of complaints. It’s not revolutionary, they say; there’s nothing new here, it’s just a giant iPod Touch. It’ll be too heavy, too awkward; I don’t see how I will hold it or use it for such-and-such an application. It doesn’t replace other devices the way the iPhone did, putting the features of the cell phone, iPod, and PDA all in one place. There’s no multitasking, no front-facing camera for video conferencing, no USB or video out without an adaptor, no HDMI at all, and Flash doesn’t work on it. The battery can’t be replaced. The screen is a bad aspect ratio for watching widescreen video, I hate touchscreen keyboards, and an LCD screen is bad for my eyes when I read. And the name is terrible–just look at all the feminine hygiene jokes.

So, the iPad is the biggest disappointment in history relative to its hype, right? From how these people are complaining about it, you would think so. It seems like articles based on the “iPad sucks” thesis are in vogue now. The question is, are they right? Is the iPad being trashed for good reason? Well, you can easily see from the title of this blog entry that I disagree. So let me explain why. It helps to break down the complaints into categories: lack of features, lack of novelty, and the user experience.

Lack of Features

Many people are upset that the iPad lacks many things they expected. This is often because they heard about such features in pre-release rumors, and came to think of them as part of what the iPad should be. It has a powerful enough CPU, so there should be multitasking; why won’t Apple support Flash animations; the device is a natural for video conferencing so where’s the camera; and why doesn’t it have the ports I want?

There are three answers to cover all of these questions. First, some features are software-specific, like multi-tasking. As with the iPhone, multitasking can and will be added with a software upgrade. If you get an iPad today, expect improvements to come without having to purchase a new device. Just like early iPhone adopters eventually got features like the App Store and cut-and-paste despite them not existing on the original device, your iPad will similarly receive updates, and multi-tasking is an obvious one–not to mention that it is implied in OS upgrades even now being tested.

Second, some physical features were not included in the original model, but they will be eventually. Yes, there’s no camera–but you can fully expect the feature to come with a future model. Again, just like the iPhone originally had no GPS, no video camera, and no compass, the iPad comes with a relative paucity of features. This was an obvious thing to expect; I predicted it myself in a blog post published ten days before the iPad was announced. This is simply the way many products are released. If you feel that a front-facing camera is a must-have, then simply wait for the next model to come out.

Third, some features were not included for design and esthetic reasons. We all know that Steve Jobs is a stickler for seamless designs; it’s the reason he never added a separate, physical right-click button to any Apple mouse. Few people agreed with him, and maybe this aspect of his design preferences is unnecessarily off-base. But this is part of the overall package, both the good and the bad, and what it means in the end is just that there’s no seam for a removable battery, and fewer ports along the edges. Fewer ports may also be a pricing or manufacturing concern, but whatever the case, most of these issues can be worked around, or don’t matter as much as many may think. You can add USB, SD card, and video out with adaptors. HDMI adaptors may come in the future (just as third-party HDMI adaptors came out for the MacBook Pro), but VGA should suffice in most situations if you want to use it as an output device. As for the battery, ten hours is more than almost anyone would use the device in a single day, and plugging in the device to recharge at night is not a hardship.

Some people complain about the lack of sufficient storage. I myself am peeved by Apple’s pricing tiers: $100 is way too steep for an extra 16 or 32 GB of memory. They clearly want to lure people in with the base price, but get them to end up spending the extra cash on more memory after having decided to buy one. However, there is a possible reason why the amount of internal storage won’t matter as much: networking. The iPad is not designed to be a storage device any more than the iPhone is. You don’t store your entire film and music libraries on the iPhone, you leave them on your main device and then sync the media with iTunes; same with the iPad. With the iPhone, wireless syncing was not included due to certain issues, battery life being the most significant. With the iPad, that may not be an issue. If you need a file, then from what I hear, you will be able to get it from your main computer using the WiFi network. Most stuff will be stored over the network, and so more storage on the iPad won’t be a big issue.

That leaves the lack of Flash support, and that was not an oversight: Apple intentionally left it out. They did so because they see Flash as more of a vulnerability than a benefit. Flash is slow, buggy, and opens up security holes. Personally, I detest Flash; although it can be used beneficially in controlled moderation, most Flash designers go way overboard, creating a web-surfing blight unmatched by any other, including the animated GIF and the “blink” tag. Apple is right to abandon it–and not just because it would open up the iPhone and iPad to hacking attacks, which is a good enough reason by itself. Flash is so Internet Explorer 6, it’s the Floppy Disk of software. Apple abandoned floppies years ahead of Windows PC makers, and they are similarly ahead of the curve where Flash is concerned. HTML5 is where it’s at.

Lack of Novelty

The next category of complaint is that the iPad isn’t revolutionary. We again see the problem–once more, as I predicted before the iPad was debuted–where expectations raised by the rumor mill led to disappointment. Everyone was looking forward to something completely new, a revolutionary OS or a stunning new design. Instead, Apple came out with what was essentially just a big iPod Touch. Why did it take years, with the Apple design team starting from scratch several times over, to come up with something so basic?

It helps to remember that Apple’s challenge here was not to make something completely new and unexpected; Apple’s challenge was to make a tablet computer that would be practical and fun to use. People just assumed that this would naturally involve something new and revolutionary. I was personally nervous about the rumored “steep learning curve” of the tablet: if Apple made it too revolutionary and different, then people might not be able to use it. Just look at the iPhone’s touchscreen keyboard–hardly a huge new concept, but people freaked out at the idea.

The lack of novelty in the iPad might be explained by the old saying, “That’s a feature, not a bug.” As Steve Jobs pointed out in the unveiling, there are about 75 million people who will know exactly how to use this device from the word go. Apple chose the exact opposite of a steep learning curve, and once you think about that in light of the challenge of making a tablet computer easy to use, it makes perfect sense. The iPad is not intended to wow you with its novelty, it’s intended to be comfortable and convenient. People who complain that it’s just a big iPod Touch are completely missing the whole point of this new device.

One other consideration along these lines is the iPad's place in the spectrum of usability. Many have noted that it doesn't replace anything, save possibly for ebook readers. The iPhone, for example, replaced the need for lugging around a cell phone, PDA, iPod, digital camera, and video recorder. That's wonderful, but it doesn't mean that every device has to accomplish the same goal. The iPad was not designed to replace existing products; it was designed to fulfill an existing need: a mobile device more capable than a smartphone but easier to carry than a laptop. It may not be the widest category of need you can imagine, but a lot of people will greatly appreciate and desire exactly such a device. Students will go nuts over what this will do for textbooks, for example. People who want a color, backlit ebook reader will love it. How many people have complained about laptops being too heavy, or burning their legs with excess heat, but can't do what they want on a tiny smartphone screen? And then there are the uses that nobody thinks they need right now, but that the iPad will open up for them–a holy grail in product design.

The User Experience

That brings us to the last category of complaint: it looks like I won't like it. It looks too heavy and awkward to hold, the size is wrong, the screen won't be good for me, the touchscreen keyboard is no good. The problem is, people who have only seen the device and have never held one in their hands are already making judgments about what it feels like to use one. That may be why almost all of the criticisms are coming from those who have never had any hands-on time with the device. Look at the reviews by those who have actually played with it, however, and you'll encounter the same advice that Jobs gave: you have to use it before you understand how right it is. Once you use it, you may find that your concerns were unwarranted or have easy solutions. It may be heavy, but so are some books; we compensate by resting such objects on our laps or whatever surface is available. The touch keyboard may seem awkward, but so did the iPhone's, and most people seemed to have little trouble adapting to that. I myself took just a few hours to get used to it, and now type on my phone almost as fast as I do on a full-sized keyboard (a miracle relative to the numeric-keypad hell that I avoided for so long). The screen may be brightly backlit, but that's what the brightness control is for.

This is not to say that the iPad will be for everybody. Some will never get used to a virtual keyboard; others will never be comfortable holding it; many may be bothered by any level of light from a backlit LCD screen; some may hate the design and esthetics, or may never get over their high expectations from the pre-launch days. Apple has always had its haters, and always will. That doesn’t mean that the product is bad or doomed to failure.

Dispelling Criticisms Is Not Proof of Excellence

You may have noticed that I have spent the entire blog post so far explaining why the negative reviews are off base, and have not really explained why the iPad is “Deceptively Good,” as I claim in the title. So let me take a whack at it. The answer lies in two aspects: the user interface, and the product’s future potential. Both are inextricably linked, and both are right now vastly under-appreciated.

The UI

When the first "personal computer" came out, it was fully a geek's plaything. The Altair had no monitor and no keyboard–just rows of switches and blinking lights for communicating in binary code. Very few people could actually use one for anything. A few years later, the "trinity" of PCs–the Apple II, the Commodore PET, and the Tandy TRS-80–introduced the CLI, or command-line interface: a text-based way of talking to the machine. You either remember or have somewhere seen the old "green-screen" text displays. This allowed people who were not comfortable in binary to use the machines, although you usually had to learn the commands the computer understood, which still kept most people too distant from the PC experience.

It only took seven years after that for the first commercially popular PC to use the GUI–the graphical user interface, with visual metaphors like the Desktop, folders, icons, and menus–that we have become so accustomed to. The GUI was a godsend because it made the computer interface more recognizable, something we could relate to more easily. We understood that a desktop is a place where you begin your work, that you choose from menus, and that folders contain documents. Suddenly, almost everybody could use a computer, and PC sales took off. But we've had the GUI for a quarter of a century now, and it's beginning to show its age. What's next?

The answer is multitouch. Using a mouse may be a step up from a text-only interface, but it is still uncomfortable and clunky. Surely you have seen people trying to move something on the screen farther than their mousepad gives them room for, clumsily picking up the mouse and repositioning it–in fact, you may well have been that person, several times. The flaw with the mouse, and with the trackpad as well, is that you are not directly controlling the content on the screen. You are one step removed from a "hands on" experience.

To get a good sense of how significant that is, try drawing a picture. Do it on paper first–I draw a pretty good Snoopy, for example. Then open a drawing app on your computer and, with the mouse, try drawing the same picture. You'll most likely find the results appalling. A trackpad may not fare much better, unless you're experienced with it. Whenever your hands and fingers are removed from the immediate action, you lose dexterity and control. Cursor devices like the mouse and trackpad are remote controls; multitouch gives you direct access, which is far more natural, comfortable, and accurate. You won't really appreciate this, though, until you've used a device like the iPad, where multitouch has far more room to work than it does on a smartphone.
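To put the "remote versus direct" distinction in concrete terms, here is a minimal sketch, again in TypeScript against the standard browser touch-events API, of what direct manipulation looks like in code: the object you touch simply follows your finger, with no cursor acting as middleman. The element ID is hypothetical, and this is only an illustration of the idea, not anything Apple ships.

```typescript
// Minimal sketch of direct manipulation using the standard Touch Events API:
// the box you put your finger on tracks the finger itself. The "sketchpad"
// element ID is hypothetical.
function makeDraggable(el: HTMLElement): void {
  el.style.position = "absolute";

  el.addEventListener("touchmove", (e: TouchEvent) => {
    e.preventDefault();           // keep the page from scrolling while dragging
    const finger = e.touches[0];  // the touch point currently on the element

    // Center the element under the fingertip. There is no cursor to run off
    // the edge of a mousepad, and nothing to pick up and reposition.
    el.style.left = `${finger.clientX - el.offsetWidth / 2}px`;
    el.style.top = `${finger.clientY - el.offsetHeight / 2}px`;
  });
}

// Hypothetical usage: make <div id="sketchpad"></div> draggable by touch.
const pad = document.getElementById("sketchpad");
if (pad) {
  makeDraggable(pad);
}
```

A mouse version of the same behavior would need mousedown and mouseup tracking plus cursor movement, exactly the indirection described above; with touch, the mapping between finger and object is one-to-one.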

The catch with multitouch is where the screen sits while your hands are doing the controlling. A desktop screen is much too distant, and even a laptop screen would require holding your arms out in an unnatural fashion. A smartphone screen is better placed, but it's too small to do much with. The tablet PC is, if you'll forgive the cliché, just right. Anything you control with your hands has to be in your hands. Yes, there are disadvantages, but the payoff in control will far outstrip them.

A good example is Apple’s multitouch trackpad on the MacBook Pro. When it came out, I thought it was cool, but not really revolutionary. I figured that I’d be able to do a few new things on it, but did not expect it to change they way I use computers. However, I only recently realized that I had completely stopped using a mouse–something I had depended upon for years with previous laptop models. The multitouch screen is the next step up from that; after getting used to it, you’ll laugh at how clunky a mouse is. But the catch is, you won’t realize it until after you’ve used it for a while. The true utility of the touchscreen sneaks up on you.

One Word: Potential

That brings us to the real promise of the product. A lot of people look at the iPad’s current state, and what we already know about using iPhone apps, and see that as the end result. That’s a big mistake. What you have seen is only the beginning. Most of what the iPad will wow you with hasn’t come out yet.

To get a better sense, watch the keynote, and pay special attention to the software demos. Notice how Jobs used the photo viewing app. Watch what Phil Schiller does with programs like Numbers and Keynote, and how multitouch comes into play. Watch the Nova game demo, and note the grenade-throwing and door-opening gestures. Be sure to watch the users' hands, not just the screen. These are just a few examples of what can be done, but there is far, far more. It is limited only by what software developers can come up with, and you've seen the amazing stuff developers have already produced for the iPhone App Store. The closed ecosystem provides a sheltered environment which not only helps prevent malware incursions, but slows piracy so that apps can be sold more cheaply. But most significantly, it allows the individual, the small-time software tinkerer, to immediately offer their wares for sale in one of the biggest marketplaces in the world. And now the iPad blows that wide open by combining the novel and powerful multitouch interface with enough real estate to make almost anything possible.

I can appreciate the benefit to apps whose layouts have traditionally been hard to control–FileMaker Pro, for instance, where creating, resizing, and placing fields and buttons has always been a bit of a pain. I can easily imagine multitouch making that not only easier, but a lot of fun to boot.

Conclusion

The features most people have focused on so far–the music playing, movie viewing, browsing and email, and even the ebook reading–are all just background. They are little more than examples of what can be done with the machine. Once you take in the full potential of the device, you will come to understand that the concerns people are airing today miss the point entirely. Panning the iPad because the screen size doesn’t fit the aspect ratio of certain movies is like saying that your Porsche is abysmal because the gas cap is the wrong shade of grey. The iPad is way, way more than just one application. Watching movies on it is a perk, not a raison d’etre. Same goes for many of the other concerns.

Apple's mission was very simple: make a platform, and they will come. The idea was not to introduce something with whiz-bang flashing lights that would knock people's socks off; it was to do what computer makers have been trying and failing at for nearly a decade: create a tablet computer with enough going for it to succeed as a product category. Apple has, by all appearances, succeeded in doing that. By building on the achievements of the iPhone platform and introducing a full-scale multitouch UI in a low-cost product where that feature can flourish, Apple has created something truly groundbreaking.

Remember, groundbreaking innovations are not always appreciated or understood when they come out. A lot of people sneered at the original Mac, many thought the iPhone would fizzle out once the buzz dissipated–heck, even the PC itself was dismissed as an expensive toy back in the late '70s. So don't count the iPad as DOA before it even arrives. It's far more than it seems.

Hey

So, by now, you have probably thought, “If you’re criticizing others for coming to conclusions about the iPad sucking before they get their hands on it, how can you claim that the reverse is true if you’ve never held one yourself?” Well, you got me. Part of it is an educated assessment–I’ve been looking at this kind of technology for a while. But that’s not enough.

Call it an article of faith.

Ditch Explorer

January 22nd, 2010 5 comments

You've probably heard about how China has been spreading malware and using it to hack into email accounts to spy on activists, journalists, and god knows who else. Of course, this should not be surprising: the Chinese government has quite a track record of acting like complete pricks and never facing consequences, because everyone is afraid of losing access to a market with more than a billion customers and a very cheap labor force. So, old news; we all knew China was doing crap like this.

What's more immediately interesting are some of the reactions. France and Germany, for example, have begun urging their citizens to dump Internet Explorer–all versions–in order to avoid security breaches like the Chinese attacks, which exploited security holes in the browser to invade people's privacy. Of course, I fully support this; Internet Explorer is the Worst Browser Ever, with problems not just of security but of standards compliance–IE fails web-standards tests, and not by a little: whereas Safari scores 100% and Firefox comes close, IE lands in the 20-30% range, even counting IE8 and the as-yet-unfinished IE9.

In short, IE is a bad joke and should be abandoned by everybody for Safari, Firefox, Chrome, or Opera. That said, to be completely fair, if IE hadn’t been around, China would have probably just hacked a different browser. None are inviolable, but IE is known to be particularly open to attack.

An amusing postscript to the story: Microsoft is now advising users to drop not only IE, but Windows as well! Of course, they’re not telling them to drop Microsoft products entirely, just old versions–they advise upgrading from IE 6 to IE 8, and from Windows XP to Windows 7. In other words, they are using this attack to sell their new OS.

However, an upgrade is indeed called for:

Upgrade


They’re Back!

January 21st, 2010 Comments off


The Shiba Inu mama Kika has had a new litter of five puppies (three red, two cream), born a bit less than a week ago. The live stream, popularly known as the "SF Shiba Inu Puppy Cam," became a sensation a while back, when some 3 million people watched the puppies on a more or less regular basis.

If you have missed your regular fix of Shiba Inu puppies, or if you wished you could have started watching them when they were younger, then here they are.


Ballmer Again

January 8th, 2010 1 comment

Steve Ballmer on the tablet computer:

This morning, I interviewed Ballmer and asked him about the market for tablet/slate computers. He made the excitement sound like empty chatter. He claimed to believe that there isn’t a sizeable market for the tablet.

“They’re interesting,” he said. “But it’s not like they’re big numbers compared to the total number of smart devices in the world.”

Well, Ballmer’s an expert in the field, isn’t he? Here’s Ballmer three years ago, on the iPhone:

There’s no chance that the iPhone is going to get any significant market share. No chance. It’s a $500 subsidized item. They may make a lot of money. But if you actually take a look at the 1.3 billion phones that get sold, I’d prefer to have our software in 60% or 70% or 80% of them, than I would to have 2% or 3%, which is what Apple might get.

Well, there you go.

It might have something to do with the fact that Ballmer had just attempted to steal Steve Jobs' thunder by showing off three tablet computers at CES in Las Vegas, to an underwhelming reception. Since his own presentation was a flop, it must be that tablets just won't work at all, right?

Cue Steve Jobs, January 27th.

Psystar

December 21st, 2009 2 comments

So, Apple has killed Psystar. Isn't that the second or third time now? I forget. Whatever the case, Apple has again successfully prevented third-party Mac clone makers from getting a toehold.

In some ways, you could see this as bad: it means that Apple has a monopoly over its domain, that there is no competition to drive down prices, no alternate choices which could lead to great Apple software running on much cheaper machines.

But the more you look at it, the more you have to admit that Apple is right to do what it does. The mistake comes from treating the hardware and the OS as separate products made by separate companies, which is the Microsoft model, also followed by other OS vendors. And maybe if Apple had the 90-95% worldwide market share that Microsoft has, it would be more of a monopolistic concern.

However, that's not the case. Apple never intended to sell software and hardware separately; the Mac is designed to be an integrated system. Think about other makers who do similar things: what if I made a new DVR, but took the OS software from Sony's DVRs to make it run? Sony would shut me down and nobody would think Sony was out of line. Hardware makers do that kind of thing all the time: they create closed, integrated hardware-and-software systems. In fact, just about every computing device that isn't sold as a PC is designed exactly that way, from cell phones to cars: the manufacturer creates the operating system to run with the hardware and sees both as something it owns. If a user wants to tweak the system after they buy it, then fine–but if a for-profit company wants to tweak it and then sell it for a profit, potentially robbing sales from the original designer by using their designs and concepts–that's different. As far as I know, Apple has never tried to go after private users, even for things like software piracy; Apple has far fewer safeguards and hurdles against such things than Microsoft does.

So while the freedom-to-tinker part of me wants to see clone makers succeed, the I-made-it-I-control-it part of me sees how it’s the right of the creator to prevent someone else from making money selling hardware based on Apple’s work. (I don’t think that the “I own Apple stock” part of me is really influencing what I think here, but that’s harder to say.)