
Analog to Digital

August 4th, 2007

Things have changed. I remember when I was a little kid, my father used to work at SRI, a research institute, and took me in to work sometimes. I remember the big computers lining the walls of some rooms, the desktop calculators almost as big as some desktop computers today. But that technology, in the late 60’s/early 70’s, was still almost a decade from even starting to infiltrate the home.

[Image: dot-matrix print sample]

I first worked on a computer at home around 1980, maybe 1981. It was an Apple III, a honking big thing intended to outclass the Apple II. Word processing then was like coding a web page today using a text editor–you had to type in style changes manually. The printout was from a dot matrix printer with a resolution of maybe 8 dots tall for 12-point text. Hopelessly primitive by today’s standards, it was top-of-the-line for home use then. Before that, my brother and I enjoyed using our father’s home terminal that connected to the mainframe at SRI via a modem which you activated by placing a telephone handset onto it (we played a lot of text-based Star Trek games on it). Before that, aside from trips to my father’s work, we got glimpses of the upcoming technology from our father, an engineer who did educational stuff for us at home like building a binary calculator on a plank of wood.

I bring this up to point out that I was privileged, in a way, to witness a transition. As a child, I grew up in an analog world, and have watched it change to a digital one. Very much like people who lived before the Space Age and after it, or before the Nuclear Age and after it. But the transition that I and everyone else of my generation have lived through is much more significant. Nuclear power has ebbed in its importance and frightfulness; space travel has not changed us as much as many thought it would, especially after it was mostly abandoned after the Apollo program. The Information Age, on the other hand, has transformed how we live.

One important aspect of this is to remember that virtually all information can be made digital–text, numeric data, sounds, images, video… and perhaps more information when sensory technology improves in the future (touch, textures? tastes and smells?). Computers can then juggle and sort the digits, doing just about anything with the data, limited only by our imagination and what software authors can bang out. It is so early on in the Information Age that we have not yet even scratched the surface of what is possible.
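To make that point concrete: once digitized, every one of those forms of information becomes plain numbers that the same machinery can store, copy, and rearrange. Here is a minimal sketch, purely illustrative and in Python (nothing from the original post, and the sample values are invented):

```python
# Illustrative only: text, sound, and images all become numbers once digitized.

text = "Analog to Digital"
text_as_numbers = [ord(ch) for ch in text]   # each character -> a code number
print(text_as_numbers[:5])                   # [65, 110, 97, 108, 111]

# A tiny slice of a sound wave, sampled at regular intervals:
sound_samples = [0.00, 0.38, 0.71, 0.92, 1.00, 0.92, 0.71, 0.38]

# A tiny 2x2 grayscale "image": each pixel is just a brightness value 0-255.
image_pixels = [[0, 128],
                [200, 255]]

# Once everything is digits, one program can handle all of it the same way:
for name, data in {"text": text_as_numbers,
                   "sound": sound_samples,
                   "image": image_pixels}.items():
    print(name, "->", data)
```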

I refer to it as the Information Age and not the Computer Age because computers (not counting hand-driven machines like the abacus) have been around for quite some time (the first mechanical ones were designed in the mid-19th century and were meant to run on steam; the first electronic ones appeared in the late 1930’s and 1940’s); however, the greatest impact from computers has come with the Personal Computer, and the ability of computers to affect all our lives, not just researchers and corporations. In fact, the real impact came not in the 80’s, when more and more households got PCs, but in the mid-to-late 90’s, when computers started becoming ubiquitous, and the Internet started to get popular.

I remember researching an essay in high school. I remember going to the library and searching for books, using the card catalog. I remember looking through tables of contents and indexes, skimming through chapters while standing between the book stacks, checking to see if the book had information that could be useful to me.

Today, I see my own students sitting down at a computer and calling up Google or going into Wikipedia, and pulling out far more data, far more focused and relevant, in just a few minutes–work that would have taken me hours when I was their age, and which would not have been as fruitful. It is in this observation that you begin to see the impact of the Information Age, in just one of its aspects.

And yet, I still get badly-researched papers from some of my students–not because the technology failed them, but rather because they did only the minimal work necessary to bang out an essay. I shake my head and refrain from giving them the “when I was your age” speech; I don’t need more than the gray hairs I’ve already got to make me look like a geezer.

But look at what else there is. Ordering books and whatever else you can think of from Amazon or other online sellers. Finding a good restaurant (did that just yesterday–tomorrow is Sachi’s and my first anniversary since meeting), or checking movie times. Buying music, or videos online. Getting news from countless sources. To mention just a few of the more popular activities now possible using the technology.

Sure, most of that is stuff you could have done before; you could order through catalogs, look through issues of newspapers, visit local shops, or subscribe to any number of magazines or perhaps see them at the local library. But many of these older options included travel, cost, or both, and netted far less depth of information, far less wealth of choice.

Then there is communication: sending email instead of posting letters, as I used to do when I first came to Japan, or using Skype or other messaging software instead of making expensive long-distance telephone calls.

So much of this may be trite, stuff you know or have considered before. But it is part of an ongoing process so long and vast in the making that, I believe, most people overlook it and do not see the significance of the change. Instead, you get a lot of people complaining about the downsides, about viruses and spam and Internet-based crime. That’s falling off now–I remember back in the late 90’s and early 00’s when it was chic for the media to report on this crime and that crime committed via the Internet. Such coverage was essentially meaningless–after all, they never ran stories about how telephones or surface mail helped propagate crime, any more than they focused on how criminals use cars. But the Internet, being new and big and scary, got the blame when people used it instead of older media.

We seem to have transitioned into the realm of the digital without fully appreciating how it has changed us. And that is not significant just so that we can say “wow!” or wonder at gadgetry; it is significant because after seeing where we were and where we are now, we can get at least a vague sense of where we will be in another twenty or thirty years–time enough not only for digital technology to permeate virtually everywhere, but for transmission speeds and data storage capacities to make the unthinkable today mundane tomorrow.

But even more important than the technologies is how they will be put to use: what we’ll be doing digitally, what we’ll have access to, how technology will help us find, sort, evaluate, and execute. As much as computers and other electronics seem to have advanced, we are still in the infancy of the Information Age. The Internet has only been widely used for a decade or so; the GUI-based computer just over two decades. Barely enough time for us to get introduced to the field, and not nearly enough for us to discover what we can really do with it. What applications will be serving us in 2030?

In a few minutes, Sachi and I will be going out to see the latest Pirates of the Caribbean movie. I’m not too thrilled, and neither is Sachi. We bought advance tickets a month or more ago, and the timing was never right to go see it. This is our last chance before it leaves theaters, considering our schedules. Sachi is not thrilled because it has to be the late show, starting at 8:40 and ending not long before midnight. I’m not thrilled due to stomach cramps, which will ebb perhaps just enough to be tolerable throughout the show. But it’s a choice of going now or tossing the tickets.

Why mention this? Because in ten or twenty years’ time, our choices will be significantly different. We’ll probably have the option of seeing first-run movies at home on Ultra-HD video over the Internet, with the flexibility of choosing times and moods to fit our schedules far better than now.

That might not sound like much to you, but it appeals to me greatly at this moment. And that is just one, small, tiny corner of what will change between now and then.

Editor’s note: I wrote this yesterday, when I had zero time to edit and polish, so I did the editing and polishing this morning. A paragraph or image added here, words changed or tacked on there, the odd spelling error corrected. Sorry for the delay.

  1. August 5th, 2007 at 11:43 | #1

    Unfortunately, this is the misinformation age as much as the information one. The Internet has allowed a lot of “facts” that are really opinions to be distributed and embraced as having the same validity as what you get in a carefully researched and well-written book. Wikipedia has the potential to be the most malignant of such sources since so many people turn to it for shallow details and never bother to hit the library (which is still the only place you’re going to find some depth when doing research). We’re in a McDonald’s information age, fast and crappy, whereas before we were in more of a fine dining information age, slow but excellent.

    If you are the type of person who reads books on topics rather than skims the Internet or reads news snippets, you find that people have limited perspectives and miss a lot of points when pontificating authoritatively on their pet topics. Instead of people offering lay opinions on issues based on knowledge they’ve acquired from a variety of deep and complex sources, you have people offering pat arguments and engaging in weak “discussion” tactics to deflect well-researched counter-arguments because they were so sure they were right in the first place and are now at a loss when someone provides them with new information. In general, all discourse, whether it be academic or otherwise, is becoming more akin to two children arguing something they don’t understand very well and trying to “win” the argument at all costs rather than to strive for a full and balanced understanding of issues and topics.

    Given that attention spans are contracting rather than expanding and people aren’t likely to give away information they’ve spent a lot of time acquiring (by placing it on the Internet for free consumption), I think this is likely to get worse as time goes by, particularly as more people grow up regarding this sort of shallow understanding and emphasis on “winning” rather than “learning” as the norm. Eventually, not even teachers will know how to really do research or be able to distinguish a well-researched paper from a poorly-researched one as they will have grown up only knowing how to do the latter.

  2. Luis
    August 5th, 2007 at 12:37 | #2

    Absolutely there is a quality problem, but I think that you are overlooking Sturgeon’s Law: “90% of everything is crap.” Maybe the Internet pushes this to new boundaries, but I would argue against the idea that information was more pure and accurate, or less mis-informing, before the Internet came along. There is certainly a much greater ability for a user to choose biased sources of information, but a careful researcher can likely get a great deal more accurate information from the Internet today than one could get from a library in the past. There is a great deal of in-depth information as well as commentary available online, if one takes the time and effort to search for it. And if it’s not on the Internet, then one is much more likely to find out what to look for at the library much faster and more effectively by using Internet searches… and if it’s not available at the library, then to get it from Amazon.com.

    And the bias that exists and is growing is not an artifact of the Internet, but rather of the intentions of the people using it–just as it has been with other media in the past. Hearst’s newspapers are just one example of this from the past, but history is filled with examples of media distortions. It’s not the technology, it’s the people using the technology.

    Which is one of my points: the Internet is a great tool. Of course it can be abused; it can be used to disseminate incorrect information, and the lazy user can easily get fooled by what they see. But again, as I said before, that can happen in any medium. Look at Fox News. Look at talk radio. Look at the Washington Times. Look at popular magazines and tabloids sold in supermarkets. Look at so many of the books sold today on political and social issues. Are television, radio, newspapers, magazines, and books in general to blame?

    I think you’re making just the error I pointed out in the post: blaming one medium over others for the abuses people commit using every medium.

    I truly think that the unique benefits of Information Age technology far outstrip the dangers presented by the unique abuses possible with the technology. If anything, problems such as those you mentioned are the fault of the user for not being careful about verifying their sources and doing proper information-gathering.

  3. Anonymous
    August 5th, 2007 at 12:52 | #3

    Interesting point of view, Luis. God forbid the common people obtain the means to mass media. What could the old propaganda machine do but try to adapt?

  4. August 5th, 2007 at 15:16 | #4

    I’m not overlooking some adage made up by a science fiction author who wants to advance a cynical and unsubstantiated notion. Anyone can make up numbers and create sayings and get a Wikipedia entry if enough people want to believe it. I’d agree there’s some “crap” out there but 90%? I’d like to see that one “proven”. I guess you can say anything you want when you’re relying on subjective observation and opinion as your yardstick. Granted, the signal to noise ratio on information sure got a whole lot worse with the Internet on the job but I wasn’t talking about the Internet but about professional level information gathering and dispersal. There’s no way 90% of that is “crap”.

    As for bias, I never mentioned bias as an issue in what I said. I’d agree there is bias in all content. What I said was that the information is shallow and opinion is offered as if it were fact. That has nothing to do with bias.

    I also never mentioned pop media outlets such as Fox News, rags, pop magazines or talk radio because the topic at hand (at least in your post) was research. AFAIK, none of those types of things are legitimate research sources. They all fall under the heading of “entertainment” unless you’re dumb enough to believe otherwise.

    In both of these cases, you’re moving the topic around rather than addressing the original points. These have nothing to do with what I said nor does the “anonymous” comment about “common people obtaining means to mass media”. The topic is using the internet as a source of information for research purposes, not whether or not the plebs should have a voice.

    I don’t see how you can even argue with the conclusion that entire books on topics derived from scholarly research are superior to slapdash, short pieces culled from internet searches and anecdotal observations. This is not “blaming the media”, it’s being realistic about the processes behind each media and the sort of results those processes tend to get.

    Internet writers are all about frequency of content because there’s pressure to update often to keep readership up. They don’t take the time to edit their pieces 99% of the time, let alone outline their thoughts, think them over, research all angles, and re-write.

    Authors of scholarly journals and books write to different ends because they have to stand up to the scrutiny of their peers as well as create a product of sufficient polish and length to qualify for submission (journals, unlike the internet, do not offer up every bit of tripe that is offered to them) or to get published.

    The interesting thing about the “anonymous” comment, btw, is that it is at the heart of why so much of what is on the internet is crap. Everyone feels they know everything about everything and are ‘right’ in their conclusions and they get all worked up at the notion that their voice doesn’t carry the same weight or value as that of people who work for a living offering and obtaining information. They balk at the very notion that excluding some people from a form of information delivery may actually increase the value of the information being offered since the process is more selective and only the wheat is allowed to participate while the chaff is rejected. It’s a bitter pill for bloggers to swallow that everything they have to offer isn’t a valuable fact that people should take seriously and incorporate into their world view.

    While I believe wholeheartedly that everyone should have their say and blog to their heart’s delight, I don’t believe there is 1/10th the value of content on the internet as there is in the books and professional journals at a good college library. Most of what is out there is just more entertainment. There’s nothing wrong with that so long as people don’t delude themselves into believing it’s fact. That makes them no different than those who believe Fox News is actually “news”.

  5. Luis
    August 5th, 2007 at 17:11 | #5

    “…I wasn’t talking about the Internet but about professional level information gathering and dispersal. There’s no way 90% of that is ‘crap’.”

    This is one example of where we seem to be talking past each other. I don’t think that Sturgeon was trying to say that 90% was a carefully measured figure; I think he was trying to express that in pretty much all endeavors–with academic studies being no exception–a large amount of what is dealt out is of dubious quality. I’m pretty sure he was being glib about “90%,” and such was my intent. You mentioned “a carefully researched and well-written book,” and I flashed on so many books out there today which claim to be well-researched and which a lot of people subscribe to, but which really pander to bias. And that which does not pander to bias can still be completely wrong; my father, an expert witness at trials, speaks about how often “experts” are full of it (and no, the irony of his also being an “expert” is not lost on me). If you speak of peer-reviewed work–a horse of a different color in many cases–then that’s different.

    “The topic is using the internet as a source of information for research purposes, not whether or not the plebs should have a voice.”

    As one example, I would point you to scholar.google.com, but there are a lot more like it. The Internet is not all blogs or news sites; there is a substantial amount of professional research and information available to one who knows where to look.

    “I don’t see how you can even argue with the conclusion that entire books on topics derived from scholarly research are superior to slapdash, short pieces culled from internet searches and anecdotal observations.”

    I think that you are also overlooking the point I made–that the Internet is no more responsible for poor research methods than is the book industry. I mentioned those other media types to give examples of other media which have poor-quality information. A student in a library can depend on sources just as crappy as the ones you mention on the Internet, and do their research in the library in just as slapdash a manner. Or they can turn to in-depth information from reliable sources and read it carefully to reach well-founded conclusions–which is exactly what they can do on or through the Internet as well.

    “Internet writers are all about frequency of content because there’s pressure to update often to keep readership up. They don’t take the time to edit their pieces 99% of the time, let alone outline their thoughts, think them over, research all angles, and re-write.”

    You mention “Internet writers,” and I think in so doing you are ignoring the rather large number of scholarly works available on the Internet, and instead focusing more on bloggers and other informal writers. But that would be like dismissing the print-publishing industry because of the books you see on sale at Target or the magazines and tabloids you see as you pass the supermarket check-stand. Both print and Internet media have their popular and professional areas.

    Now, the main distinction may be that on the Internet, most popular media is free to the public, and the professional stuff–whether offered as an e-book, part of a subscription database, or even in the hybrid area of print materials ordered via the Internet–requires payment and so is usually accessed only through special arrangements. My own students–like most students in most colleges and probably more and more high schools as well–are thoroughly introduced to a wide variety of library subscription services, including ProQuest, Ebsco, and others. They can even read a wide variety of books over such Internet subscription services, and so get the same level of information you speak of.

    I think that this is where we talked past each other–I did not specify such sources, but figured that they were a given for any student in school these days. That’s what I was talking about when I mentioned being disappointed with students who used only Wikipedia and such sources, instead of using the wide variety of scholarly material available to them.

    And, of course, the key advantage to digital as opposed to analog information is that digital information is searchable; you can find books which have your target information far faster and in much greater quantity, and then find all references to a specific topic or line of information in the same manner.
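    As one small, hypothetical illustration of that point (a Python sketch with invented titles and text, not anything from the original discussion): searching an entire shelf of digitized books for a topic takes a few lines of code and a fraction of a second, where a card catalog and chapter indexes took an afternoon.

```python
# Hypothetical example: full-text search across a small "library" of digitized books.
library = {
    "History of Computing": "The Analytical Engine was designed in the mid-19th century...",
    "The Information Age": "Personal computers and the Internet transformed research...",
    "Cooking Basics": "Simmer the stock for at least an hour before adding vegetables...",
}

def titles_mentioning(library, term):
    """Return the title of every book whose full text mentions the term."""
    term = term.lower()
    return [title for title, text in library.items() if term in text.lower()]

print(titles_mentioning(library, "internet"))   # ['The Information Age']
```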
