
Wednesday, August 1, 2012

Humans Made Better With Technology


It is often interesting to watch the future unfold in real time.  In many areas of human development, the small changes accumulate casually, soundlessly, but add up to one grand spectacle when one takes the time to observe them and realizes he is looking at something that just a little time ago would have appeared to be the exclusive domain of science fiction.  The recognition of that progress can be a subtle personal breaking point.  It can be either a negative breaking point, jarring one with the realization that the world is morphing by a series of huge steps into something virtually unrecognizable, or a positive one, welcoming a sense of exhilaration as one comes to gain a clear perspective on the lovely places his world seems destined to go.

I’ll be the first to admit that I am prone to feeling threatened by change, including and especially technological change.  I am terrified of the myriad ways in which we seem not only willing but eager to throw our humanity away in an endless quest for convenience, and easy security, and ephemeral connections to an increasingly impersonal, electronic world.  But several recent events have given me a sense of the other side of that coin, the exciting promises that come of our eager, whole-body embrace of new technology.

As much as technology swaddles us with petty conveniences and frivolous distractions, the multiplication of those things goes hand in hand with the growth of technologies with the potential to really transform not just human experience, but human beings themselves.  While the trends have certainly been building for some time, with the rapid development of prosthetics and personal electronic devices, and the social acceptance of a constant technological presence in individual lives, it rather seems to me that in the blink of an eye we have arrived on the verge of the widespread technological enhancement of human beings.

That trend is realized in ways that may lie anywhere on the spectrum from subtle to unmistakable.  On the more familiar end of that spectrum are the 2008 Olympics.  While the nation and the world were busy watching Michael Phelps make history by winning eight gold medals at the Beijing games, they may have missed the fact that it wasn’t just Phelps, but also several of his competitors, who were systematically shattering former world records in each event.

This wasn’t just a result of that year’s competitors having been a particularly exceptional crop of swimmers.  Advancements in swimwear technology, led by Speedo, effectively made times before and after 2008 incomparable by quite literally reshaping the actual competitors into something more hydrodynamic, squeezed tight in all the right places to make them glide through the water with reduced drag.

It might be hard to conceptualize mere garments as high technology, but however you look at it, the gear that modern industries have produced for their athletes has served to dramatically increase performance and raise the bar for “personal best.”  Technology doesn’t just aid natural abilities; it enhances them.  This is true in other events, as well.  Running shoes have steadily improved the ratio of strength to weight, with Adidas having developed a shoe that redirects power into the turn for long distance track and field competitors.  In that same category, the design of javelins has both improved results and decreased the risk of injury, introducing innovative materials that limit the wobble of the javelin upon release without transferring that force into the thrower’s shoulder.  In every one of these instances, if the competitor is capable of performing at a higher level in a high tech outfit than he could naked, or if different individuals can perform differently based on the design of the object they’re holding, then we’re effectively enhancing the natural capabilities of a human being, even without drugs.  In a way, we’ve been doing this for decades, but it has gotten far more dramatic very quickly, especially in light of the outcomes of the 2008 Olympic Games.

The current Olympic Games showcase something rather more interesting, albeit something that requires a little more speculation to see how it supports my thesis.  South Africa’s Oscar Pistorius has qualified and been permitted to run in the men’s 4 x 400 meter relay.  Pistorius is a double-amputee who will be running on two prosthetic legs against able-bodied opponents, and he purportedly has a real chance of taking a medal.

This probably comes across to many observers as a nice human interest story, but to my mind it is a terrific portent of things to come.  The significance of the story is arguably more social and technological than it is personal.  The Olympics are the ultimate testing ground for the quality of prosthetics.  If false legs are now capable of holding their own in direct competition with the real limbs of athletes in their prime, there can be little doubt that we’ve designed technology capable of replicating the fullest capabilities of the human body.  If we’ve managed that so early in the twenty-first century, how long can it really be before we have prosthetics that actually exceed human abilities?

Unless Pistorius’ prosthetics fail catastrophically, there is simply no way that he will stand alone for long as an example of a formerly-disabled world-class athlete.  Again, with this development, technology has plainly shown itself to be capable of dramatically enhancing the abilities of unequipped human beings.  Granted, in the present case, it may only be serving to bring a disabled man’s abilities back up to the baseline, but that in itself is truly remarkable, and it makes it easy to imagine that the technology of the near future could amplify the performance of already able-bodied individuals in similar measure.

All right, let’s not mince words; there’s no more sane way to say this.  I’m talking about cyborgs.  We’re close to having cyborgs among us.  Depending on how you define the term, we may already have them.  I feel downright silly typing that, as the concept still seems far-fetched to me, but I have to reconcile that with the fact that I read a week or two ago about the human cyborg Steve Mann, who has apparently been experimenting with wearable technology since the early 1980s.

Mann made headlines in mid-July after he was assaulted by employees of a Paris McDonald’s for wearing his EyeTap Digital Glass camera, which is permanently attached to his head and cannot be removed without special tools, leading the website io9 to brand the attack as the “world’s first cybernetic hate crime.”  Overstated or not, that puts the immediacy of such seemingly futuristic technology into sharp focus.  The technological enhancement of human beings is a definite reality, and not just in one-off experiments in distant government labs.  There is at least one individual who is living with such enhancements on a daily basis, blended in with mainstream society.

Taking all of these indicators together, I can’t help but wonder what the future holds and when it will show it to us denizens of the present.  Perhaps it will still be a generation or more.  Then again, Steve Mann’s developments went on in society’s background for thirty years, and when I became aware of them I felt they had snuck up on me.  And while I knew how well prosthetic technology was developing, I never anticipated seeing an amputee run in the Olympics.  That, too, snuck up on me.  The pace of change is stunning.  If you’re not paying close attention to the patterns, it’s easy to underestimate what the future holds and how soon it comes.

Monday, February 20, 2012

Existential Questions and the Hiring Process

I’m doing some consulting work that has required me to look over some academic materials regarding hiring procedures. This has gone a long way towards reminding me of my personal distaste for formulaic assessment of human worth. Is this symptomatic of the computer age? Are we subjugating even character judgments to algorithms and statistical analysis, in lieu of personal judgment?

In the past, when I’ve taken personality tests and questionnaires as part of the process of applying for run-of-the-mill jobs, I’ve bristled at the notion that my answers to a series of seemingly disconnected questions were trusted as a means of gauging my work ethic, attitude, or personal character. Such experiences also constituted some of the first instances of my feeling cheated by my own ethics, as I worried that interviews and positions tended to go to people who were willing to lie favorably about themselves. I even asked an employer once if her assessments took this into account. Her somewhat sympathetic response was to tell me that the entire thing was handled by an outside company – a fact which I think makes my point even more clearly. Not only are hiring decisions often not made face-to-face, they’re often not made in the same building or within the same professional framework as the prospective job.

The research I’ve lately done on the topic vindicates my concerns at least slightly. Written tests that seek to gauge professional virtue do include a scatter of questions that are designed to judge the honesty of the applicants by encouraging brownnosers to select unreasonably optimistic answers. Still, I think these sorts of tricks are sufficiently obvious that if you’re both dishonest and a careful reader you’ll have no problem exploiting the system despite being a seriously flawed applicant.

My problem with these kinds of practices is that they evidently try to generate a rather nuanced understanding of another person, of the sort that would usually be derived from days or months of interaction with him. And they try to do it at a significant remove, in perhaps as little as half an hour. Perhaps the best example of this hubris from the materials I’ve been reading is the biographical information blank. As a hiring technique it is apparently almost a century old, though I am not personally familiar with it. It strives to correlate information about the potential employee’s background with indicators of his potential success with the company.

If I were to face the questions associated with this hiring practice, I would feel even more immediately and egregiously misrepresented than I have already felt in the presence of “honesty and integrity tests” or “personality and interest inventories.” I may be unique in this, but I find myself uncomfortable with practically any answer I can give to such quizzes, because there is at least some degree of vagueness behind most questions. Anything that asks me to rank my response to a statement on a scale of one to five prompts a lot of hand-wringing as I try to determine whether to round up or down or how to interpret what would really characterize neutrality on an issue.

One would think this wouldn’t be an issue with a biographical questionnaire, which asks for straightforward short-answer responses to direct questions. But some of the examples that I’ve encountered suggest that my overly analytical nature would make even this distressingly complicated. When it’s printed on paper and I have no opportunity to discuss interpretation with the asker, a question such as “at what age did you leave home?” prompts me to silently wonder what is meant by leaving home. Does going to college count if you remained a dependent of your parents? If a person stayed for several months with a nearby friend and then returned to his family, would that count as having left home? And additionally I wonder, what correlation is such information supposed to have with job performance? But at least that curiosity doesn’t affect how an individual would answer the question.

However, in the case of the question, “How large was the town/city in which you lived as a child?” I feel as though there should be an established standard for how to answer the question if the responses of different people are being judged against one another. It’s easy to answer that question, but it’s pretty likely that different people are going to have different concepts of comparative size. What confuses me about these methods of analysis is the question of how much exposition is needed. I feel like reviewers would want these things to be brief and easily digestible, but I also feel like if they’re supposed to genuinely represent a person’s background they can’t be.

But maybe I’m just insane. I can’t imagine that a lot of other people look at questions like “did you ever build a model airplane that flew?” and think to themselves, what constitutes flying? How much distance does it have to cover relative to its size for it to be considered a successful flight? Also, if it was assembled from a kit, does that count as building it? Is there any way to weight the two scenarios against each other?

I imagine being asked, “were sports a big part of your childhood?” and I say, define “big.” Also, define “sports.” And “childhood.” The question doesn’t use the word “playing,” so if a person watched a lot of sports on television, would he get to answer in the affirmative? Is miniature golf as much a sport as football? For the purposes of the question, is late adolescence childhood? If I was heavily involved in martial arts training between the ages of eight and nine, and then again between thirteen and seventeen, does that count?

“Do you play any musical instruments?” Well, how much practice does an applicant need before he can say yes to this one? What if it’s just the kazoo? Is playing a musical instrument even indicative of suitability for the job? It seems to me that even in the case of biographical information, an applicant can manipulate the evaluation in his favor by bending the truth to make himself look more impressive than he is. That, however, would never be my impulse. When I face things like this, I need to make myself look as much like myself as possible.

Certainly, I need to reach a personal breaking point after which I’ll be able to let go of some measure of my obsessive need for precision. (I’m not sure precisely what measure of that need I need to get rid of.) But at the same time, I think my neurosis has something worthwhile to say about these types of evaluations, and the powerful elements of society need to reach a breaking point after which they no longer arrogantly think that a person’s background or overall character can be determined from a series of multiple choice questions and short answers. No matter how sophisticated our business literature or computer algorithms, they can’t reproduce acquaintanceship, interpretation, or understanding.

Sunday, February 19, 2012

The Tragedy of the Modern Library


I try to listen to A Prairie Home Companion each Saturday evening, in large part because, despite being politically and socially liberal, I am personally quite conservative and prone to nostalgia and wistfulness for a purer experience of things that it seems I was denied by the unrelenting progress of history. This week’s broadcast featured an episode in the adventures of Ruth Harrison, reference librarian, a character who is rather similar in that regard. She is educated, non-combative, socially permissive, but often silently critical of people’s tastes and a widespread loss of noble ideals.

In this latest episode she editorialized for a moment in conversation with her twenty-eight year-old intern, Trent (not the other one, Brent, who is thirty-seven) after he had helped a patron find a thriller that showcased truly heinous crimes. Miss Harrison, voiced by the highly talented Sue Scott, commented: “In library school we were taught that the role of the library is to educate, to uplift, not to cater to every whim.” I didn’t even go to library school, but I have always had the same image of libraries.

On hearing that line of dialogue, I thought of the last couple of trips I have taken to the Central Library in the City of Buffalo. It has come a long way from the libraries that were so domestically familiar to me throughout elementary and high school. These days, when you walk around a library, you find that the stacks are deserted but that a sea of people stretches across the computer banks. On an occasion when I lost my internet connection, I had to carry my laptop to the library in order to borrow its wireless connection for a day. Doing so made me feel sort of cheap and disloyal, and it also gave me an opportunity to occasionally observe the behavior of the other patrons, which in turn made me feel worse.

I noticed a middle-aged couple sharing a long game of solitaire on one computer. Elsewhere, a man about my age was watching YouTube. My eyes have passed over various computer screens each time I’ve been back there, and I find that these are extremely commonplace activities. Many different kinds of games are played in the Buffalo library – first-person shooters, adventure games, Bejeweled and similar puzzles. A significant portion of the library patronage these days, perhaps the majority, is evidently poor people who have no access to such entertainment at home and utilize the library for the idle passage of time instead.

Oh, to be poor but also have such free time or the means of transportation to frequent the region’s most expansive library! I understand not reading because you simply don’t have the time amidst your exhausting and low-paying work, and I understand having little access to either books or technology, particularly in a town where everything is so spread-out. But here the people I’ve seen at the library have the opportunity to beautifully enrich their lives with the information and artistry that surrounds them in a variety of media, and they choose to play dull games. It is a tragedy that libraries are used this way, that they are little more than the low-rent internet cafes and LAN parties of the twenty-first century.

Even if people ventured away from the computers, I find that the most prominently featured books aren’t all that much better. I want to believe that there are a few librarians who work in that building and react to the public much as Ruth Harrison does, diligently pointing them towards the popular fiction with easily digestible plots and few themes, then lamenting that they could have recommended Hemingway or Faulkner. I’ve found that those sorts of lamentations often meet with comments along the lines of, “Hey, anything that gets kids reading.” That’s not the least bit persuasive to me. The mere act of allowing one’s brain to process typewritten words doesn’t in and of itself make for a richer intellectual experience than other alternatives. Is a child really better off reading Stephenie Meyer or Dan Brown than watching Carl Sagan’s Cosmos on DVD or listening closely to a Brahms symphony?

The sentiment of “as long as they’re reading” speaks to what I think is the underlying misconception that drives the degradation of libraries and of collective appreciation of art and literature. It also speaks to the difficulty that we face in reversing the trend. I resent what libraries have become, but I see no way of changing them back into grand temples of information and culture. In order to draw in the public and avoid closure, they have to provide the type of access that people want. And as a matter of principle, anything that qualifies as information or culture should have a place there, regardless of its intrinsic quality. So it’s not as if there is any cause for libraries to restrict people from being able to use them in such frivolous ways. But so long as easy escapism can be found there, the public will surely continue to gravitate toward it.

We need a collective breaking point to overturn the misconception, which drives both trends, that a greater quantity of information is effectively the same as a greater quality. I’m inclined to think that libraries think they are providing an adequate public service and that the public thinks it is adequately utilizing that service simply because, between the books and the high-speed internet, there’s a lot of information that’s directly accessible to the entire public. It doesn’t seem to matter how it’s utilized. But the danger to libraries is the danger to all of society – that as everything comes to be more and more at our fingertips, we will grow increasingly complacent about it and let the petty distractions dominate our attention. Since everything else is still there, such allowances seem to come at the expense of nothing, but in fact they come at the expense of our very minds.

Wednesday, December 28, 2011

Life, Googled

I saw a television advertisement for Google yesterday. Not for any particular service offered by Google, just for the Google brand as a whole. I find it kind of strange that a powerful company with virtually no competition for its major services would run advertisements in the popular media simply promoting its own name. But I suppose it’s aimed not at encouraging people to use Google, but at encouraging them to use Google for everything. I’m taking the fact that they’ve seen fit to run the ad as a good sign that Google does still have competition and people are not yet flocking to it for all their worldly desires.

Yet the style and content of the ad do give the impression that that’s precisely what they are promoting. It consists of a lengthy montage of web searches, e-mail messages, videos, status updates, and so forth, and clearly the main idea is that every facet of life can be served by a Google application. It’s a familiar style of advertising – one that tries to saturate the viewer with beautiful or inspiring imagery to make them desire a more intimate connection with the world being depicted on the screen. And the consumer is meant to come away from it thinking that the given brand will help them to obtain that closeness.

I have two pieces of commentary to bring to bear on Google’s application of this advertising style. One observation is general to the commercial, and one is specific to a brief part of it that I find objectionable.

My general criticism is that the advertisement as a whole falls flat in its effort to inspire me with a barrage of imagery, drawn from disparate corners of human experience. It’s a type of content that I’ve considered effective elsewhere, for instance in the 2008-9 Discovery Channel “I Love the World” campaign. There’s a straightforward reason why I consider the Google ad to fail where that one succeeded. Google’s montage presents every scene as being two steps removed, rather than just one.

The images included in its montage are fairly familiar, on the whole. They are simply of people talking, or of significant but commonplace daily events like a child’s first bike ride. These things are perfectly accessible without a technological medium, and yet when I see them on the television screen, it is perfectly clear that they are being channeled through something external to both me and the person being depicted. Where the visual is of a Google+ chat session or the like, I find myself looking at a screen upon a screen, and that leaves me quite far from the reality of another person’s life. And where the scene is not affixed to a separate little box, it is a poor quality image, shaking as someone films the event on a handheld video camera.

The Discovery Channel ad was similar in basic intent to the Google ad, in that it was offering a mode of access to other events, experiences, and parts of the world through an intermediary, whether television media or computers. But two things differentiated the Discovery Channel visuals: They were professionally produced and they depicted experiences that were clearly remote and uncommon. Thus, I enjoyed crisp, almost lifelike views of African tribal ceremonies, and skydiving, and undersea exploration, and I got the impression that the Discovery Channel was capable of bringing me closer to things that I could not easily or quickly access on my own.

By contrast, the Google ad reminds me that the use of some of their services might actually put additional barriers between me and the people or circumstances I wish to access. And if what I’m trying to access is just people roughly like me and experiences similar to those that I’ve had, I can step out my front door and gain access to something of the same kind without Google’s help. And personally, I think I would be better off doing so in many cases. As so often happens, I worry that I’m practically alone in that thinking. I worry that most Americans have eschewed any breaking point on this subject, and that they think it’s actually preferable to use a technological middle man for everything they used to do for themselves.

That brings me to my particular gripe with the scenes depicted in the Google ad. At one point it shows someone Googling the phrase, “How to be a better dad.” Have we really come to a point where we think that even that is the sort of question that Google can resolve for us? I know some people think that widespread access to the internet means it’s no longer necessary to memorize any factual information whatsoever. Are we now at the point that retaining ethereal information, standards of personal behavior, and methods of character development is also considered obsolete?

There are some things that you don’t Google. I don’t care how sophisticated their algorithm becomes; no information that can be posted to the web takes the place of experience, practice, and acquired wisdom. Anyone who would Google the phrase “How to be a better dad” has no business being a dad. After all, he seems to be under the shockingly erroneous impression that effective parenthood is easy, and that the problem of child-rearing can be resolved with the click of a mouse, as opposed to, say, rigorous study and earnest commitment.

It troubles me to think that Google is encouraging people to lean on their brand to resolve fundamental human questions for them. Just so that I can beat them to the punch, I would like to recommend against Googling the following phrases, in case their next ad suggests that a web search will provide the answer to any of them:

“How to live my life.”

“How to believe in the one true faith.”

“Why do bad things happen to good people?”

“Should I commit suicide?”

“Do I have a soul?”

“What is justice, Polemarchus?”

If at any time you have Googled one or more of these phrases, go outside and talk to somebody.

Tuesday, December 20, 2011

Entertainment Without Experience

I still rent movies in the form of physical DVDs, because I like to feel personally engaged with the media that I consume. When I decide to watch a film, I settle myself in front of the television, usually with dinner on my coffee table. As it is now winter, a movie usually means swaddling myself in a blanket and seeing that a pot of hot tea is near at hand. Food and drink are my only distractions, and far from being genuinely distracting, they usually enhance my enjoyment of two hours or so of closely watching a film. I am perhaps too obsessed with small rituals, but many of my activities do require suitable circumstances, and I am rather proud of that fact. It makes me feel as if I am getting the fullest sense of fulfillment from whatever I am doing, even if it is something as banal as watching a television screen alone in a dim room.

Some of the DVDs that I rent begin playback with a commercial for “Blu-Ray with Digital Copy,” and thus give me what I think is a glimpse of the exact opposite of valuing direct engagement with activities and their settings. Digital Copy is a service that allows you to download a copy of a Blu-Ray disc you’ve purchased to your laptop, smart phone, or other electronic device, because apparently there is significant demand for high-definition entertainment on the go. The demand does not actually surprise me, but I thought such demand was already fulfilled by a product called everything that exists in the real world.

The commercial for Digital Copy includes a housewife addressing the audience and explaining that her family loves movies, but they just aren’t always home to enjoy them. Since she speaks directly to me through the fourth wall, I think it’s pretty unfair that I can’t talk back to her, because I have questions. If your family isn’t home to watch movies, it’s probably because they’re out doing other things, right? Why, then, would they perceive any need for electronic entertainment? Do you want to be able to keep up with the Kardashians when there’s a lull in your child’s recital and she’s not actually on stage? Is a basketball game not exciting enough if you can’t squeeze in a couple scenes from Die Hard between quarters? If you’re not always home to watch movies, just wait. Movies are specifically for when you are at home.

If you think those aren’t the sort of circumstances to which the woman was referring, you haven’t seen the commercial, because one of the examples that it actually depicts of Digital Copy in use is a boy sitting on a bench outside at a basketball court, dressed in athletic wear, watching a movie while two other boys play basketball behind him. This scene is offered essentially without comment, and it frightens me to think that that might mean that other people are not baffled by it, as I am. I look at it and I see a product being advertised by showing something fun happening off in the background, where the product is specifically not being used.

The best possible explanation I can give for such a scene is that the advertisers are trying to convey that the solitary boy has something to do while he waits for one of his friends to rotate out of the game. But that’s hardly better than suggesting that the kid just watch a movie instead of participating in the other activity in the first place. Our participation in the world around us requires more than just phasing in when action is required of us. In the case of a basketball game, what about cheering on your teammates? It’s not irrelevant that there are other people on the court, and it’s easy to imagine that they may be offended to see that you need to delve into fantasy while they’re in the game. What about watching your opponents to gain some insight into their technique, strengths, and weaknesses? What about just enjoying the game itself as a form of entertainment? If you can’t be bothered to do any of that, and would rather load up a movie while you’re just waiting your turn, I can’t draw any conclusion except that you’re no more than half-invested in the activity in the first place, and probably shouldn’t be bothering with it at all.

Still, at least in the basketball scenario the interaction between people is secondary. The same cannot be said about raising one’s child, which is a major part of the commercial. The ad returns to the mother’s narration, and she explains how Digital Copy allows her to get more accomplished while she entertains her child. As an illustration of this, we see her grocery shopping while her small child sits in the back of the shopping cart staring at a handheld gaming device or some such. I can’t help but bristle at the woman indicating that she believes her job as a mother is to entertain her child, rather than to invest herself in raising it.

It seems to me that it’s a terrible parental attitude if you think of your child as an obstacle that you have to overcome while you go about your daily routine. I still distinctly recall working on the floor in a retail store and hearing a child screaming at the other side of the aisle. It wasn’t crying, or screaming about anything in particular; it was just making a rhythmic, piercing noise that carried throughout the building. It went on for minutes, and as the child was in my line of sight, I could see that its mother was standing beside the cart in which the child was sitting, going about her shopping while plainly ignoring the noise. At one time, society might have faulted that mother for failing to intervene in her child’s bad behavior and teach it why what it was doing was wrong. Now it is apparently coming to be accepted that the solution to such a problem is not parenting, but technology. I wish it were better recognized that that alternative serves the parent, but never the child.

Ever since the advent of television, parents have apparently treated home entertainment as a way of ignoring their children. It’s flawed thinking that guides a parent to suppress her child’s impulse to act out with technological distractions, rather than correcting that behavior. But even if the child has no such impulse, it’s flawed thinking that guides a parent to offer distractions lest the child be bored. Your everyday interactions with your own children are perhaps more valuable than the activities into which you specifically intend to include them. There are a lot of things that kids need to learn about the adult world – the real world – as they’re growing. By instructing him to watch Finding Nemo for forty minutes while she shops for groceries, the hypothetical mother in the Digital Copy commercial is missing numerous important opportunities to teach her child about nutrition, about money and budgeting, about etiquette and social interaction. I would be surprised if the ascendant tendency to keep children’s attention distant from parental activities did not retard their social development over time.

But what’s retarded social development if the entire social structure is changing so as to no longer expect direct interaction? I find that with every passing year there is a larger proportion of people who are shocked, frightened, or personally offended by being spoken to by someone they don’t know personally. I see more people going out of their way to avoid eye contact with strangers on the street. I still don’t have an iPod, and remember being upset by seeing them gain prominence to such an extent that I came to naturally expect people to be walking around with their ears plugged at all times. And that doesn’t just bother me because it prevents people from hearing the voices of those who might otherwise have spoken to them. What really makes me pity the perpetually distracted is that it prevents them from hearing the entirety of the world’s day-to-day sound. To me, that remains an important part of human experience. It puts your life in context with where you are, and assures some measure of diversity of perception, beyond that which you personally seek out for entertainment.

I witnessed the ascent of the iPod and saw it as the end of natural hearing, and now, with the growing access to television and film at all times and in all places, I feel that I’m witnessing human beings sacrifice the sense of sight, as well. Amidst this constant change, it’s very easy for me to envision current trends as leading eventually to some dystopian future, wherein human beings are constantly plugged into electronic distractions that assure productive complacence and see that nobody ever looks at the sky or listens to a bird song. Honestly, it’s gone so far in that direction that someone thinks the TV Hat is a good idea. Sure, the thing looks utterly laughable, but it also looks like something we would have laughed at as ridiculously over-the-top and implausible if we had seen it as part of a depiction of the twenty-first century in a science fiction film from the eighties.

I live a painfully dull life. Few things could be more tragic to me than the thought that in the future, my insular, impoverished existence may be more experience-rich than that of most everyone else, as they’ll all be so accustomed to constantly having something to watch or listen to that they’ll never be fully present to anything they do in this enormously diverse world. The demands for constant entertainment passed the threshold of ridiculousness for me a long time ago. Will there ever come a breaking point when the rest of society agrees that the demand for distraction has outstripped the number of things there are to be distracted from? Or will we keep following the same trends until distraction itself becomes the entirety of our experience?

Thursday, October 6, 2011

I'm Done with Google

I’ve mostly stopped using Google. I use Google mail as my primary account, and it would be difficult to immediately change my essential contact information, so I will likely continue to use that service for the foreseeable future. But I try not to use it for web searches anymore, relying instead on Bing. And I certainly no longer read Google news.

This has been a long time coming. It was bothersome when I found that Google was saving my search information. I don’t typically search for the same things over and over again, so I don’t need to see old results every time I begin a search with the same letters. It just makes things look cluttered to me, and it’s very much against my tastes. I don’t want an all-purpose search engine to reflect my personal use of it. I want it to be a blank slate each time I access it – demonstrating equal accessibility for everyone who uses it, regardless of IP address.

I found it creepy when, on top of storing personal information, search results ended up being specific to me. I don’t like the fact that Google plainly knows exactly where I am every time I am searching for something. When I type a word like, say, “cemetery” or “restaurant,” I don’t want the search results to be a list of cemeteries or restaurants that are in my area unless I’ve specified that. It feels like an invasion of privacy, and it’s not only that the system is acknowledging the source of my IP every time I access it. I know that that information has always been available, but it was more acceptable when it was in the background and I didn’t get the impression that I was actively being identified every time I sought information.

But more than that, sometimes when I type in a noun that describes a place or establishment, I really am just looking for general information. It’s presumptuous of Google to tailor the results to my location, and perhaps to my search history, when that information may actually be completely irrelevant to what I want to know. There was a time when the internet was a place I could go to find the information I was looking for, not to be told by a third party what information I was supposed to be looking for.

That same trend was what irritated me about Google News badges. Rather than continuing to allow what is objectively important to take center stage, that new feature sought to begin customizing each individual’s news according to a series of indicators that the user may have demonstrated but never acknowledged or consented to. Much like my prior search results, I don’t need to retain a roster of news stories that I’ve read in the past. My interests change, and they change in dynamic ways. What I was reading last week or last month should have no bearing on what news is made most accessible to me now.

One of the major influences on how my interests change is what is actually happening in the world. I want to know what’s on the cover of the New York Times and the Washington Post not because it fits with my preferences, but because there are people whose jobs are to identify current events that are of importance to the society in which we all collectively live. I trust them to tell me what matters to a greater extent than I trust myself, especially if I have no idea what has happened in the past twelve hours and all I have in order to filter my news is the history of my own base desires. That’s essentially the direction in which Google News badges were moving us. If that’s considered a good way to disseminate news to the population, the vast majority of Americans are going to end up knowing in detail the results of voting on American Idol but having no idea who the Republican frontrunner is. You may think you have better priorities, but I’m sure that important things do sometimes happen that fall under categories that you don’t typically read about.

But that was just a trend that Google was experimenting with. I could deal with that. I figured that opting out of the news badges service would keep my news objective. Then one day I signed in to my Gmail account, clicked over to news, and momentarily wondered why the hell the Buffalo Bills and Buffalo Sabres were national top stories. That was the breaking point that drove me away from Google altogether. Despite my best efforts to ignore their push for invasiveness, they continued to try to corrupt the information that was presented to me. And the source of that corruption, apparently, was me. Or rather, it was me as understood by a series of algorithms striving to represent me as a self-replicating digital entity. That is distinctly different from me as a human being, which is incidentally the me that wants to decide for my goddamn self what news to read and what general information applies to my professional and personal lives on any given day.

So I won’t be using Google as a search engine or a news aggregator anymore. I’ll wait until they start ranking my e-mail messages against my will and sending targeted advertisements directly to my inbox before I drop them as a mail client, as well. I know that Bing will probably trend in that direction, too. For now, I’m pleased to know that they prominently display the option to turn off all search history, although I do have to click it again every day. Even that is heartening, though, as it suggests that they aren’t saving my preferences based on IP address.

I earnestly hope that that behavior keeps up and they prove me wrong in my assumption that ultimately every large company in the information technology business trends towards hideous invasions of privacy and assertions of content control. If Bing or any other reliable search engine or news aggregator were to actually build their brand on the basis of their being the guys who let the customer make his own decisions, I would stick with them for the long haul.

For now, all I can say is that I’m done with Google, and that I’ll move on from each next option either until someone rediscovers the concept of boundaries or until I become the weird guy who spends all of his time in the library, has a subscription to the last remaining newspaper, and uses an old laptop as a writing table.

If you’ve got nine minutes, watch this TED Talk on the same topic: