Showing posts with label film. Show all posts

Friday, March 9, 2012

Horses, Lambs, Children, and Conflicting Ethics

In the March issue of The Atlantic, Darcy Courteau writes about the consequences faced by the horse market in the years since the last slaughterhouse producing horse meat in the United States was forced to close. I remember that story well, as I felt at the time that I was at odds with what I perceived as a widely shared instance of irrationality in American culture. It seemed absurd to me that the Department of Agriculture should place a value judgment upon the production of horse meat that differed from the one applied to all other livestock.

I disagreed in no uncertain terms with that bit of interference with free enterprise. I disagreed with it on rational grounds despite being a vegetarian and a person highly concerned with animal rights. I simply don’t see how the overwhelming public support for the removal of horse meat and only horse meat from the American market could have stood up to any measure of introspection. It relies on a false distinction between one type of animal and all others.

Being a vegetarian and an animal rights advocate, I want to see that society avoids the mistreatment and slaughter of all animals, not just the ones that I like. How I feel about the creatures is irrelevant; right and wrong are never contingent upon personal attitudes. They may be contingent upon the objective nature of different things, but this doesn’t seem to apply to the situation of horses and other livestock. I don’t see how anyone could realistically argue that horses possess personalities that, for instance, cows or lambs lack, or that horses are better able to experience pain, discomfort, or fear.

Unless one earnestly believes that horses are intrinsically different from other animals, a belief one would have to hold in the absence of real evidence, I can only assume that the impulse to suppress the slaughter of horses while allowing it for other animals is based on the personal relationships people sometimes have with horses.

But that doesn’t really make a difference when we’re talking about just the concept of slaughtering them for meat. It’s not as though opponents of horse meat had personal relationships with this or that particular horse. Some people have personal relationships with particular rabbits, or snakes. It’s not unheard of for someone to keep a pig as a house pet, or to feel affection for a cow that is kept solely for dairy production. Rarely is any of this used as grounds to argue that the entirety of society ought to disallow the killing of or production of meat from any animals of a certain species.

If one recoils with horror at the very thought of horse meat, but never bats an eye when filling his shopping cart with pork and beef, he is wedging an artificial dividing line into the application of his principles. Such selective defense can only be irrational. And if one is concerned with consistency of his own beliefs or ethics, instances like that ought to lead to one of three outcomes: a change in attitude leading to universal application of the principle, even if potentially inconvenient; abandonment of that principle; or production of a satisfactory account of why the dividing line is not artificial.

If a person utterly opposes the production of horse meat but neither opposes the slaughter of all other creatures nor truly believes that the mental lives of horses are significantly and objectively different from those of all other creatures, then that person is trying to hold two contrary views at once: that killing a sentient, autonomous being that’s called a horse is wrong, and that killing a sentient, autonomous being that’s not called a horse is okay.

Cognitive dissonance is the enemy of breaking points. When you give yourself license to hold views that are in opposition to one another, you strip yourself of the crucial motivation for intellectual or moral growth. Breaking points arise out of conflict, and sometimes it is a conflict between two opposing ideas that you yourself maintain. A person who is concerned with rational consistency will keep an eye out for such conflicting views, and his breaking point will entail a sudden realization that either one of his ideas is wrong, or he doesn’t actually know why each of them is right.

In the case of the ranching of horses and the slaughter of them for meat, the tension between views goes well beyond the simple difference in perception of horses and other animals. Cognitive dissonance is easy when you’re operating on pure intuition. When those intuitions are directly challenged by pragmatic concerns, it’s much more difficult to make glib pronouncements that a certain action is simply wrong. Courteau writes of the fallout from the closure of the last US horse meat producers:

“In states across the country, reported cases of equine abuse, neglect, and abandonment skyrocketed. And the kill buyers of yesteryear aggregated into rarer but still more haunting boogeymen, purchasing for the abattoirs of Canada, or, worse, Mexico, where horses at some slaughterhouses are reportedly subject to torturous conditions.”

Consequentialism makes for complex ethical calculations, and if one wishes not only to release the United States from the stigma its citizenry attaches to the slaughter of horses, but to actually reduce the suffering experienced by American horses, then such a person’s intuition that it was good to force closure of the slaughterhouses is probably in error. But that error and the larger error of deliberate cognitive dissonance are both based on the same mistake of thinking that your knee-jerk intuition is sufficient grounds for all moral judgments.

When we really start to analyze the consequences of people’s intuitive moral pronouncements, we see that cognitive dissonance is quite easy to come by once all the nuance of principle and pragmatism is taken into account. In other words, what a person thinks is wrong often fails to align perfectly with why he thinks it is wrong. We cannot permanently avoid the moral burden of having to occasionally choose the lesser of two evils. Unfortunately, this doesn’t seem to occur to many people who have non-inquisitive, black-and-white views of morality.

The other night, I was watching the documentary Sweetgrass, and the depictions of some of the operations at the sheep farm brought to mind these same questions of ethical complexities. The opening scenes of the film largely focus on the beginning of lives for sheep on that farm, and I was somewhat shocked by the dismissive treatment by the ranchers of both newborn lambs and nursing mothers. But if one watches with a measure of objectivity, one quickly comes to realize that, given such a high volume of sheep, the farmers are doing what they can to promote the survival of the highest number possible.

Some years ago, I had a good friend who was a devout, even zealous Buddhist. He was exceptionally sensitive to implications of animal mistreatment, and aggressively, immediately judgmental of perceived wrongdoing. It occurred to me while watching Sweetgrass that he certainly would have found the farmers’ behavior to be unforgivable, but that any alternative behavior that would have resulted in the survival of fewer sheep would have elicited just as much disdain from him. While their rationalizations were grounded in Buddhism instead of Christianity, this friend’s social and political views were decidedly conservative, and probably didn’t differ very much from those of his Christian parents.

His moral judgments, like those of many conservatives, and indeed like those of many people of any political leaning, were severely averse to nuance. I recall discussing abortion with him on one occasion and using the word “complex” to describe the breadth and seriousness of the associated ethical questions. That evoked fiery indignation from him, and he said, “No. You can kill or you can not kill. It’s actually really simple.”

And it would be simple if that’s all it came down to, if there weren’t any genuine questions about what qualifies as killing, if there weren’t any other ways of being responsible for another creature’s suffering. What my friend believed seemed simple on the surface, but at a deeper level of analysis it becomes clear that he was keeping it simple by ignoring the hard questions.

No doubt he would have agreed that his moral concern was with decreasing the suffering of sentient beings, a utilitarian concern. That view means it is reprehensible to do anything that promotes or permits the death of, say, sheep or horses. But it must also make it reprehensible to do anything that promotes or permits the hunger or severe discomfort of the same creatures.

In the case of the sheep in Sweetgrass, keeping all of the lambs alive meant separating them from their mothers immediately upon birth, forcibly compelling ewes to nurse lambs to which they had no connection, and hastily handling the creatures as if they were inanimate objects. The alternative would have been to handle them more delicately, more compassionately, but chances are that in light of the enormous numbers of sheep that needed to be handled by just a few farmers, that would have resulted in some of the lambs being neglected, and thus starving or being killed by competing sheep.

Both alternatives may well be similarly unethical, but it’s unhelpful to reject whichever alternative is current simply on the basis of its perceived wrongness. The choice of one wrong action is, in cases like this, the direct consequence of the rejection of another.

It may strike some people as hideously dehumanizing to draw such a parallel, but the pragmatic circumstances surrounding the abortion debate can be elucidated by thinking of the entire human race as a correlate to a herd of livestock. As a population increases, the rate of survival within that population, or at least the average utility available to each individual, naturally decreases. Mandating the birth of more young is tantamount to mandating the provision of more suffering. A person who opposes either abortion or the neglect of newborn lambs or the slaughter of horses doesn’t have to accept that fact as a justification of the contrary position, but he does have to acknowledge the consequences of what he’s advocating.

In fact, I find that most people refuse to do this. They are, instead, happy to embrace cognitive dissonance, presumably because it is easier to live in a fantasy world in which right actions never have unintended consequences than it is to willfully struggle with moral dilemmas. That perception, however irrational, may help an individual to remain admirably committed to his own ethical obligations, but it also results in unfair judgments passed upon others.

It’s not rational to demand that a creature with little access to resources must both birth its child and feed it. The acceptance of cognitive dissonance results in dissonant demands and no-win situations. That is the cognitive dissonance of, for instance, anyone who repudiates abortion without compromise, but also rejects the provision of welfare. Essentially, the two views in concert pronounce that it’s wrong both to terminate a pregnancy and to have a child while poor.

Again, a rational person whose views are at odds with one another must apply the relevant principle, abandon it, or explain how they can be reconciled. In the given case, if a person claims the principle of defending the lives of innocents, he must apply that principle by providing material support to unsupported children. If that is too inconvenient, he must rethink his stance on abortion, or else explain why it’s worth defending an unborn child but not one who has truly entered the world.

It’s not easy to decide upon coherent ethical theories as to what constitutes right and wrong, but even once you have, it’s not easy to determine how to apply those theories. If you want children to have both a chance at life and at least basic comfort once they’ve begun that life, you’ll eventually have to confront a situation in which those desires stand in opposition. If you want the lambs to avoid both starvation and mistreatment, you’ll be horrified, when you look closely enough, to realize that it sometimes takes one to avoid the other. You can save the horses from the abattoir, but you may thus doom them to a stable that does them even greater harm.

There is a certain sense in which my Buddhist friend’s pronouncement is still correct. It’s very simple: you can either kill or not kill. But the operative word there is “you.” The individual often has privileges that are absent to society at large. You can choose to carry your own unintended pregnancy to term, but if you can then feed that child without fail, you’d better thank God that you never really had to face the choice between depriving a child of life and subjecting it to exquisite hardship. And you can’t conflate either situation with the broader hypothetical in which the nation is inundated with a million additional young lives that must be supported and defended.

If you raise horses and you’re uncomfortable with them being either slaughtered or abused and underfed, you can do as Ms. Courteau’s father had always done and refuse to sell them to kill buyers. But when such sales are no longer an option and the reduced demand causes the prices of horses to fall, lowering your revenue to the point where it is no longer possible to take adequate care of the horses you have, the dual ideals of defending all life and defending against all suffering are no longer sustainable.

This has been the situation of horse farming in the United States for the past four or five years. I remember it being mentioned by some as a possible consequence at the time that the last slaughterhouse dealing in horse meat was closing. But mostly I remember objecting to the irrationality of it all. I remember this very well, but somehow I missed the fact that Congress resumed funding for these slaughterhouses in November, which may result in some reopening this year.

I won’t be happy to see domestic horses go back to slaughter. Indeed, I hope that someday in the far-distant future they all close again, but that they do so then right along with those that deal in every species of animal, and that it be on the basis of the universal application of moral principles, not on the basis of an absurd double-standard.

But despite the fancifulness of that hope, I’m not naïve about the implications. I know that many animals will suffer and die from lack of care during any possible transition away from their slaughter and consumption. But if I could be alive when that time comes, I would say that that is the unhappy consequence of doing right in a way that is more crucial to our future moral standing. It is a great tragedy of the social aspect of moral existence that we sometimes have to prioritize our values against one another. But our collective morality gains not a bit from pretending that there is no such problem.

The nuanced demands and consequences of collective ethics are discomforting, in that they may require us to accept things that don’t feel right to us. Intuition is a powerful tool in making moral judgments, but it can only lead us so far. If it guides a situation towards less obvious but more serious harms, we’ve probably made the awful mistake of eschewing rationality in order to appease the short-sighted demands of immediate perception. Only reason, and not intuition, is capable of handling nuance and recognizing indefensible cognitive dissonances.

Rationality is a skill that must be learned for the sake of coherent, far-reaching moral behavior. It draws the dividing line between those who merely think they are doing the right thing and those who truly are, even if they appear not to be.

Courteau writes of the reversal of the double-standard regarding horse meat, “Many pet lovers are furious, but PETA actually supports the reversal, arguing that the suffering of unwanted horses increased after the demise of the kill plants.” If PETA, which is often so prone to over-the-top displays of self-righteous, black-and-white morality, can learn the value of nuance and circumstance, anyone can.

Friday, February 10, 2012

What I've Been Watching: Twin Peaks

It’s been a while since I’ve posted any commentary about film or television here. Toward the end of changing that, it seems worthwhile to point out that I recently finished watching the entire run of Twin Peaks. It was wonderfully compelling, in large part because of the skillful blend of soap-operatic, don’t-miss-an-episode plotlines, potentially revelatory themes, and wonderfully artistic imagery. Despite an almost perfect pilot movie, a thoroughly satisfying season and a half, and a finale that was the most eye-poppingly surreal bit of television I’ve seen since the last episode of The Prisoner, Twin Peaks has also allowed me to experience, twenty years after the fact, one of the most rapid and disappointing declines in television history.

I am assuredly not going to be saying anything that hasn’t been said a million times since 1991, but it is stunning how clearly season two of Twin Peaks demonstrates the destructive potential of television networks. If any controlling interest in a work of fiction has ever made a more glaring error in judgment than ABC did in insisting upon a solution to the Laura Palmer murder, let me know about it. I cannot fathom how anyone could have thought that that would be a good idea, when even the most casual observer should have recognized that that was what held the show together, that without unresolved questions about the slain prom queen the show didn’t really exist.

If they’d wanted to resolve the storyline out of some sense of duty to their audience, I suppose I can understand that, especially if they were confident enough in the rest of the series to expect it to maintain its appeal on its various other merits. From what I understand, though, the network never had much faith in the show, even when the public was obsessed with it. Of course, the ratings at the end of the first season and the beginning of the second should have made it clear that the accountants and executives didn’t know what they were talking about, and they should have thus been motivated to stay the hell out of the way.

I can begin to understand insisting on the resolution to the initial A storyline, but how in God’s name could anyone see value in ending it mid-season? I started viewing the show in almost complete ignorance as to what to expect, and I certainly didn’t realize that the murder would be solved when it was. Even so, the revelation was extremely powerful for me, and I was quite disappointed in myself for not determining who the killer was ahead of time. But upon viewing that extremely early climax and knowing how many episodes were still left, I assumed that what I had watched was just the beginning of the end, and that the rest of the season would entail the characters pursuing a killer whose identity the audience now knew. I also assumed that they would continue to piece together layers of the mystery, tying the minor storylines into narrative of Laura Palmer’s death.

It’s not that the latter half of season two is bad, but when neither of those things happened, and when the primary driver of the plot was wrapped up very quickly and neatly, I was utterly disoriented. I imagine that the majority of fans of the show, like me, had to force themselves past that point just to see what happened next. Once I came to terms with the fact that the bottom fell out of the show, I rediscovered its appeal from a completely different standpoint. But even though it remained good television, what could hurt a series more than several episodes of the audience collectively wondering why they were suddenly watching an altogether different show?

Because of the catastrophic effect of ABC’s insistence on something that was about as obviously bad as bad ideas come, it’s a terrible disappointment that Twin Peaks didn’t constitute a breaking point in the tradition of interference by moneyed interests in art and media. But unfortunately that was just a particular high water mark in a still-ongoing history of networks and studios derailing promising projects and preventing good stories from becoming great.

Twin Peaks was still great, but it could have been great for so much longer. Oh, there were other problems with the post-resolution portion of the show, too. The recovery period might have been quicker and smoother if it weren’t for the fact that even minor storylines were dropped and new ones with new characters had to be unexpectedly introduced, in large part, based on what I’ve read, because Kyle MacLachlan was unabashedly shitting where he ate. Nevertheless, all of this built through the development of a beautifully complex mythology to a final episode that kept my mind constantly racing to keep up with a parade of nightmarish riddles.

And that final episode was intellectually gratifying but emotionally devastating. David Lynch and Mark Frost had apparently arranged to put multiple characters in peril in hopes of generating demand for continuation to a third season. This also was something I didn’t know going into it, so I, and presumably many of those who watched it when it was on television, was expecting a closed ending. But when I read the final scenes that way I came away from the show grappling with the implication that there are dire personal consequences for trying to do good in a world that possesses precious little hope.

I heard whispers that the film Fire Walk With Me might provide some resolution to the oppressively bleak ending of the show, but it didn’t. It tauntingly hinted at the possibility of a resolution, but it didn’t follow through with it, so I get the impression that David Lynch just enjoys screwing with his audience for kicks. That said, I’d be happy to be screwed with again. Hopefully I am part of an ongoing growth of retrospective interest in Twin Peaks. If so, I honestly think Lynch should revisit the town, since Twin Peaks has the unique distinction of having specifically placed one early scene twenty-five years in the series’ future. That’s right around the corner, underneath the sycamores, behind the red curtain.

Wednesday, January 11, 2012

Movies for Escaping a Desert Island

I’m a little bit late with this topic, but I was offline for several days, so why don’t you give me a break?

It seems that Friday was Matt Zoller Seitz’s last column for Salon.com before he became television critic for New York Magazine. As a finale slideshow, Seitz chose the topic “movies for a desert island,” and detailed his list of ten films, one short, and one series of television that he would keep as entertainment if he was stranded on a desert island with “an indestructible DVD player with a solar-recharging power source.”

Seitz prompted his readers to come up with their own lists, and I clicked into the comments section to see some of them. “Part of the fun of this exercise,” Seitz wrote, “is figuring out what you think you can watch over and over, and what you can live without.”

For me, though, the main part of the fun, and perhaps the frustration, of watching other people undertake the exercise was rediscovering how differently my mind approaches entertainment, as compared with most of the people around me. I can’t say that I took the time to dream up my list of twelve pieces of visual media, but I’m sure that mine wouldn’t have looked a thing like the others.

It may well be that I’m missing the entire point of the exercise, and applying a kind of logic to it that has no place in such purely academic challenges. But I can’t get past the fact that for me, the phrases “desert island films” and “current favorite films” do not mean the same thing. And that’s all that I seemed to be seeing in the author’s and the commenters’ lists. They were lists of a dozen items that each person thought he or she would find endlessly entertaining; a dozen things that would distract the person from the monotony and desperation of his surroundings. I simply can’t help extrapolating from the hypothetical and concluding that each person who participated in the exercise took it for granted that he was resigned to the fate of being trapped on a desert island for an indeterminate length of time, and possibly forever.

“Best to make the most of it, and see that I have some of my favorite entertainment on hand, so I can be as happy as possible while I’m here.”

Even in an utterly unrealistic hypothetical, I can’t take that attitude. It doesn’t reflect the way that I engage with media. I love film, but I hate escapism. I can’t think of a thing that I’ve watched of my own accord that I didn’t watch with an eye towards relating it to my own life and circumstances, or learning more about the world through it, and generally using it as a surface for reflection.

It would be no different on a desert island. So if I had access to visual media there, I would damn well want it to be media that reminded me of my surroundings and circumstances, rather than distracting me from them, and that motivated me towards the goal of either getting the hell off of a desert island or building an idealized society on one. That’s not to say that the films I chose would have to have identical settings, but they would have to all possess themes that seemed personally significant, whether about freedom, or emotional fortitude, or encroaching insanity.

The closest thing that I saw to that line of thinking was that several commenters included The Matrix in their lists. I could see watching that in any circumstances wherein my freedom was constrained (i.e. the only circumstances I have known), because it’s explicitly about getting free by being in touch with reality when forces around you are compelling you to flee from it. But as far as Seitz’s challenge was concerned, I think that based on the content of the rest of their lists, those commenters chose The Matrix because it was an entertaining sci-fi action/adventure film that they had thoroughly enjoyed when they were younger.

Am I making unfair assumptions about the motivations of the respondents to the exercise? They could each find the content of their chosen films so personally poignant that they give them hope when things seem most hopeless. They might choose comedy and pure entertainment because they know they will function better towards some greater end if they can laugh and feel good amidst everything else. I don’t think that’s it, at least not on the whole. In response to one person placing Groundhog Day on his list, another commenter questioned the selection. “Don’t you think that would hit a little too close to home on a desert island?” he asked.

It’s still hard for me to accept, but apparently hitting close to home is not something that other people want in film and television. I, however, want little else. It sounds narcissistic, but if something isn’t in some sense about me, it isn’t worth watching.

I’m not sure what would qualify if I was on a desert island, though. I’ve never been there, so I don’t know what would speak to me. I’ll stick with The Matrix unless I come up with a better ten. I’d probably include some sort of nature documentary, likely Winged Migration, to put me more in touch with what beauty I would still have access to on my island. Perhaps I’d include Powaqqatsi as a way to remind myself both of the beauty that’s possible in the habitation of natural settings and of the beauty that I’d left behind in the rest of the world. Cast Away might make the list because even though it’s far from being one of my favorite films, it very well might become that when I’m in practically the exact same situation.

The only thing I can say with fair certainty, though, is what my television series would be. With its theme of resisting the circumstances in which one feels trapped, no matter how overwhelming they are, I think The Prisoner would be as poignant for me on an actual island as it is on the metaphorical island on which I now live.

Now, would you like to rethink the question for yourself? What films would you want with you if you were stranded on a desert island, could watch movies, but still cared about the fact that you were stranded on a desert island?

Tuesday, December 20, 2011

Entertainment Without Experience

I still rent movies in the form of physical DVDs, because I like to feel personally engaged with the media that I consume. When I decide to watch a film, I settle myself in front of the television, usually with dinner on my coffee table. As it is now winter, a movie usually means swaddling myself in a blanket and seeing that a pot of hot tea is near at hand. Food and drink are my only distractions, and far from being genuinely distracting, they usually enhance my enjoyment of two hours or so of closely watching a film. I am perhaps too obsessed with small rituals, but many of my activities do require suitable circumstances, and I am rather proud of that fact. It makes me feel as if I am getting the fullest sense of fulfillment from whatever I am doing, even if it is something as banal as watching a television screen alone in a dim room.

Some of the DVDs that I rent begin playback with a commercial for “Blu-ray with Digital Copy,” and thus give me what I think is a glimpse of the exact opposite of valuing direct engagement with activities and their settings. Digital Copy is a service that allows you to download a copy of a Blu-ray disc you’ve purchased to your laptop, smart phone, or other electronic device, because apparently there is significant demand for high-definition entertainment on the go. The demand does not actually surprise me, but I thought such demand was already fulfilled by a product called everything that exists in the real world.

The commercial for Digital Copy includes a housewife addressing the audience and explaining that her family loves movies, but they just aren’t always home to enjoy them. Since she speaks directly to me through the fourth wall, I think it’s pretty unfair that I can’t talk back to her, because I have questions. If your family isn’t home to watch movies, it’s probably because they’re out doing other things, right? Why, then, would they perceive any need for electronic entertainment? Do you want to be able to keep up with the Kardashians when there’s a lull in your child’s recital and she’s not actually on stage? Is a basketball game not exciting enough if you can’t squeeze in a couple scenes from Die Hard between periods? If you’re not always home to watch movies, just wait. Movies are specifically for when you are at home.

If you think those aren’t the sort of circumstances to which the woman was referring, you haven’t seen the commercial, because one of the examples that it actually depicts of Digital Copy in use is a boy sitting on a bench outside at a basketball court, dressed in athletic wear, watching a movie while two other boys play basketball behind him. This scene is offered essentially without comment, and it frightens me to think that that might mean that other people are not baffled by it, as I am. I look at it and I see a product being advertised by showing something fun happening off in the background, where the product is specifically not being used.

The best possible explanation I can give for such a scene is that the advertisers are trying to convey that the solitary boy has something to do while he waits for one of his friends to rotate out of the game. But that’s hardly better than suggesting that the kid just watch a movie instead of participating in the other activity in the first place. Our participation in the world around us requires more than just phasing in when action is required of us. In the case of a basketball game, what about cheering on your teammates? It’s not irrelevant that there are other people on the court, and it’s easy to imagine that they may be offended to see that you need to delve into fantasy while they’re in the game. What about watching your opponents to gain some insight into their technique, strengths, and weaknesses? What about just enjoying the game itself as a form of entertainment? If you can’t be bothered to do any of that, and would rather load up a movie while you’re just waiting your turn, I can’t draw any conclusion except that you’re no more than half-invested in the activity in the first place, and probably shouldn’t be bothering with it at all.

Still, at least in the basketball scenario the interaction between people is secondary. The same cannot be said of raising one’s child, which is a major part of the commercial. The ad returns to a mother’s narration, and she explains how Digital Copy allows her to get more accomplished while she entertains her child. As an illustration of this, we see her grocery shopping while her small child sits in the back of the shopping cart, staring at a handheld gaming device or some such. I can’t help but bristle at the woman indicating that she believes her job as a mother is to entertain her child, rather than to invest herself in raising it.

It seems to me a terrible parental attitude to think of your child as an obstacle that you have to overcome while you go about your daily routine. I still distinctly recall working on the floor of a retail store and hearing a child screaming at the other side of the aisle. It wasn’t crying, or screaming about anything in particular; it was just making a rhythmic, piercing noise that carried throughout the building. It went on for minutes, and as the child was in my line of sight, I could see that its mother was standing beside the cart in which the child was sitting, going about her shopping while plainly ignoring the noise. At one time, society might have faulted that mother for failing to intervene in her child’s bad behavior and teach it why what it was doing was wrong. Now it is apparently coming to be accepted that the solution to such a problem is not parenting, but technology. I wish it were better recognized that that alternative serves the parent, but never the child.

Ever since the advent of television, parents have apparently treated home entertainment as a way of ignoring their children. It’s flawed thinking that guides a parent to suppress her child’s impulse to act out with technological distractions, rather than correcting that behavior. But even if the child has no such impulse, it’s flawed thinking that guides a parent to offer distractions lest the child be bored. Your everyday interactions with your own children are perhaps more valuable than the activities in which you specifically intend to include them. There are a lot of things that kids need to learn about the adult world – the real world – as they’re growing. By instructing him to watch Finding Nemo for forty minutes while she shops for groceries, the hypothetical mother in the Digital Copy commercial is missing numerous important opportunities to teach her child about nutrition, about money and budgeting, about etiquette and social interaction. I would be surprised if the ascendant tendency to keep children’s attention distant from parental activities did not retard their social development over time.

But what’s retarded social development if the entire social structure is changing so as to no longer expect direct interaction? I find that with every passing year there is a larger proportion of people who are shocked, frightened, or personally offended by being spoken to by someone they don’t know personally. I see more people going out of their way to avoid eye contact with strangers on the street. I still don’t have an iPod, and remember being upset by seeing them gain prominence to such an extent that I came to naturally expect people to be walking around with their ears plugged at all times. And that doesn’t just bother me because it prevents people from hearing the voices of those who might otherwise have spoken to them. What really makes me pity the perpetually distracted is that it prevents them from hearing the entirety of the world’s day-to-day sound. To me, that remains an important part of human experience. It puts your life in context with where you are, and assures some measure of diversity of perception, beyond that which you personally seek out for entertainment.

I witnessed the ascent of the iPod and saw it as the end of natural hearing, and now, with growing access to television and film at all times and in all places, I feel that I’m witnessing human beings sacrificing the sense of sight as well. Amidst this constant change, it’s very easy for me to envision current trends leading eventually to some dystopian future, wherein human beings are constantly plugged into electronic distractions that assure productive complacence and see that nobody ever looks at the sky or listens to a bird’s song. Honestly, it’s gone so far in that direction that someone thinks the TV Hat is a good idea. Sure, the thing looks utterly laughable, but it also looks like something we would have laughed off as ridiculously over-the-top and implausible if we had seen it as part of a depiction of the twenty-first century in a science fiction film from the eighties.

I live a painfully dull life. Few things could be more tragic to me than the thought that in the future, my insular, impoverished existence may be richer in experience than that of most everyone else, as they’ll all be so accustomed to constantly having something to watch or listen to that they’ll never be fully present in anything they do in this enormously diverse world. The demand for constant entertainment passed the threshold of ridiculousness for me a long time ago. Will there ever come a breaking point, when the rest of society agrees that the demand for distraction has outstripped the number of things there are to be distracted from? Or will we keep following the same trends until distraction itself becomes the entirety of our experience?