
Sunday, February 19, 2012

The Tragedy of the Modern Library


I try to listen to A Prairie Home Companion each Saturday evening, in large part because, despite being politically and socially liberal, I am personally quite conservative and prone to nostalgia and wistfulness for a purer experience of things that it seems I was denied by the unrelenting progress of history. This week’s broadcast featured an episode in the adventures of Ruth Harrison, reference librarian, a character who is rather similar in that regard. She is educated, non-combative, socially permissive, but often silently critical of people’s tastes and a widespread loss of noble ideals.

In this latest episode she editorialized for a moment in conversation with her twenty-eight-year-old intern, Trent (not the other one, Brent, who is thirty-seven), after he had helped a patron find a thriller that showcased truly heinous crimes. Miss Harrison, voiced by the highly talented Sue Scott, commented: “In library school we were taught that the role of the library is to educate, to uplift, not to cater to every whim.” I didn’t even go to library school, but I have always had the same image of libraries.

On hearing that line of dialogue, I thought of the last couple of trips I have taken to the Central Library in the City of Buffalo. It has come a long way from the libraries that were so domestically familiar to me throughout elementary and high school. These days, when you walk around a library, you find that the stacks are deserted but that a sea of people stretches throughout the computer banks. On an occasion when I lost my internet connection, I had to carry my laptop to the library in order to borrow its wireless connection for a day. Doing so made me feel sort of cheap and disloyal, and it also gave me an opportunity to occasionally observe the behavior of the other patrons, which in turn made me feel worse.

I noticed a middle-aged couple sharing a long game of solitaire on one computer. Elsewhere, a man about my age was watching YouTube. My eyes have passed over various computer screens each time I’ve been back there, and I find that these are extremely commonplace activities. Many different kinds of games are played in the Buffalo library – first-person shooters, adventure games, Bejeweled and similar puzzles. A significant portion of the library patronage these days, perhaps the majority, is evidently poor people who have no access to such entertainment at home and utilize the library for the idle passage of time instead.

Oh, to be poor but also have such free time or the means of transportation to frequent the region’s most expansive library! I understand not reading because you simply don’t have the time amidst your exhausting and low-paying work, and I understand having little access to either books or technology, particularly in a town where everything is so spread-out. But here the people I’ve seen at the library have the opportunity to beautifully enrich their lives with the information and artistry that surrounds them in a variety of media, and they choose to play dull games. It is a tragedy that libraries are used this way, that they are little more than the low-rent internet cafes and LAN parties of the twenty-first century.

Even if people ventured away from the computers, I find that the most prominently featured books aren’t all that much better. I want to believe that there are a few librarians who work in that building and react to the public much as Ruth Harrison does, diligently pointing them towards the popular fiction with easily digestible plots and few themes, then lamenting that they could have recommended Hemingway or Faulkner. I’ve found that those sorts of lamentations often meet with comments along the lines of, “Hey, anything that gets kids reading.” That’s not the least bit persuasive to me. The mere act of allowing one’s brain to process typewritten words doesn’t in and of itself make for a richer intellectual experience than other alternatives. Is a child really better off reading Stephenie Meyer or Dan Brown than watching Carl Sagan’s Cosmos on DVD or listening closely to a Brahms symphony?

The sentiment of “as long as they’re reading” speaks to what I think is the underlying misconception that drives the degradation of libraries and of collective appreciation of art and literature. It also speaks to the difficulty that we face in reversing the trend. I resent what libraries have become, but I see no way of changing them back into grand temples of information and culture. In order to draw in the public and avoid closure, they have to provide the type of access that people want. And as a matter of principle, anything that qualifies as information or culture should have a place there, regardless of its intrinsic quality. So it’s not as if there is any cause for libraries to restrict people from being able to use them in such frivolous ways. But so long as easy escapism can be found there, the public will surely continue to gravitate toward it.

We need a collective breaking point to overturn the misconception, which drives both trends, that a greater quantity of information is effectively the same as a greater quality. I’m inclined to think that libraries think they are providing an adequate public service and that the public thinks it is adequately utilizing that service simply because, between the books and the high-speed internet, there’s a lot of information that’s directly accessible to the entire public. It doesn’t seem to matter how it’s utilized. But the danger to libraries is the danger to all of society – that as everything comes to be more and more at our fingertips, we will grow increasingly complacent about it and let the petty distractions dominate our attention. Since everything else is still there, such allowances seem to come at the expense of nothing, but in fact they come at the expense of our very minds.

Tuesday, December 20, 2011

Entertainment Without Experience

I still rent movies in the form of physical DVDs, because I like to feel personally engaged with the media that I consume. When I decide to watch a film, I settle myself in front of the television, usually with dinner on my coffee table. As it is now winter, a movie usually means swaddling myself in a blanket and seeing that a pot of hot tea is near at hand. Food and drink are my only distractions, and far from being genuinely distracting, they usually enhance my enjoyment of two hours or so of closely watching a film. I am perhaps too obsessed with small rituals, but many of my activities do require suitable circumstances, and I am rather proud of that fact. It makes me feel as if I am getting the fullest sense of fulfillment from whatever I am doing, even if it is something as banal as watching a television screen alone in a dim room.

Some of the DVDs that I rent begin playback with a commercial for “Blu-ray with Digital Copy,” and thus give me what I think is a glimpse of the exact opposite of valuing direct engagement with activities and their settings. Digital Copy is a service that allows you to download a copy of a Blu-ray disc you’ve purchased to your laptop, smartphone, or other electronic device, because apparently there is significant demand for high-definition entertainment on the go. The demand does not actually surprise me, but I thought such demand was already fulfilled by a product called everything that exists in the real world.

The commercial for Digital Copy includes a housewife addressing the audience and explaining that her family loves movies, but they just aren’t always home to enjoy them. Since she speaks directly to me through the fourth wall, I think it’s pretty unfair that I can’t talk back to her, because I have questions. If your family isn’t home to watch movies, it’s probably because they’re out doing other things, right? Why, then, would they perceive any need for electronic entertainment? Do you want to be able to keep up with the Kardashians when there’s a lull in your child’s recital and she’s not actually on stage? Is a basketball game not exciting enough if you can’t squeeze in a couple scenes from Die Hard between periods? If you’re not always home to watch movies, just wait. Movies are specifically for when you are at home.

If you think those aren’t the sort of circumstances to which the woman was referring, you haven’t seen the commercial, because one of the examples that it actually depicts of Digital Copy in use is a boy sitting on a bench outside at a basketball court, dressed in athletic wear, watching a movie while two other boys play basketball behind him. This scene is offered essentially without comment, and it frightens me to think that that might mean that other people are not baffled by it, as I am. I look at it and I see a product being advertised by showing something fun happening off in the background, where the product is specifically not being used.

The best possible explanation I can give for such a scene is that the advertisers are trying to convey that the solitary boy has something to do while he waits for one of his friends to rotate out of the game. But that’s hardly better than suggesting that the kid just watch a movie instead of participating in the other activity in the first place. Our participation in the world around us requires more than just phasing in when action is required of us. In the case of a basketball game, what about cheering on your teammates? It’s not irrelevant that there are other people on the court, and it’s easy to imagine that they may be offended to see that you need to delve into fantasy while they’re in the game. What about watching your opponents to gain some insight into their technique, strengths, and weaknesses? What about just enjoying the game itself as a form of entertainment? If you can’t be bothered to do any of that, and would rather load up a movie while you’re just waiting your turn, I can’t draw any conclusion except that you’re no more than half-invested in the activity in the first place, and probably shouldn’t be bothering with it at all.

Still, at least in the basketball scenario the interaction between people is secondary. The same cannot be said about raising one’s child, which is a major part of the commercial. The ad returns to the mother’s narration, and she explains how Digital Copy allows her to get more accomplished while she entertains her child. As an illustration of this, we see her grocery shopping while her small child sits in the back of the shopping cart staring at a handheld gaming device or some such. I can’t help but bristle at the woman indicating that she believes her job as a mother is to entertain her child, rather than to invest herself in raising it.

It seems to me that it’s a terrible parental attitude if you think of your child as an obstacle that you have to overcome while you go about your daily routine. I still distinctly recall working on the floor in a retail store and hearing a child screaming at the other side of the aisle. It wasn’t crying, or screaming about anything in particular; it was just making a rhythmic, piercing noise that carried throughout the building. It went on for minutes, and as the child was in my line of sight, I could see that its mother was standing beside the cart in which the child was sitting, and was going about her shopping while plainly ignoring the noise. At one time, society might have faulted that mother for failing to intervene with her child’s bad behavior, and teach it why what it was doing was wrong. Now it is apparently coming to be accepted that the solution to such a problem is not parenting, but technology. I wish it were better recognized that that alternative serves the parent, but never the child.

Ever since the advent of television, parents have apparently treated home entertainment as a way of ignoring their children. It’s flawed thinking that guides a parent to suppress her child’s impulse to act out with technological distractions, rather than correcting that behavior. But even if the child has no such impulse, it’s flawed thinking that guides a parent to offer distractions lest the child be bored. Your everyday interactions with your own children are perhaps more valuable than the activities into which you specifically intend to include them. There are a lot of things that kids need to learn about the adult world – the real world – as they’re growing. By instructing him to watch Finding Nemo for forty minutes while she shops for groceries, the hypothetical mother in the Digital Copy commercial is missing numerous important opportunities to teach her child about nutrition, about money and budgeting, about etiquette and social interaction. I would be surprised if the ascendant tendency to keep children’s attention distant from parental activities did not retard their social development over time.

But what’s retarded social development if the entire social structure is changing so as to no longer expect direct interaction? I find that with every passing year there is a larger proportion of people who are shocked, frightened, or personally offended by being spoken to by someone they don’t know personally. I see more people going out of their way to avoid eye contact with strangers on the street. I still don’t have an iPod, and remember being upset by seeing them gain prominence to such an extent that I came to naturally expect people to be walking around with their ears plugged at all times. And that doesn’t just bother me because it prevents people from hearing the voices of those who might otherwise have spoken to them. What really makes me pity the perpetually distracted is that it prevents them from hearing the entirety of the world’s day-to-day sound. To me, that remains an important part of human experience. It puts your life in context with where you are, and assures some measure of diversity of perception, beyond that which you personally seek out for entertainment.

I witnessed the ascent of the iPod and saw it as the end of natural hearing, and now, with the growing access to television and film at all times and in all places, I feel that I’m witnessing human beings sacrificing the sense of sight, as well. Amidst this constant change, it’s very easy for me to envision current trends as leading eventually to some dystopian future, wherein human beings are constantly plugged into electronic distractions that assure productive complacence and see that nobody ever looks at the sky or listens to a bird song. Honestly, it’s gone so far in that direction that someone thinks the TV Hat is a good idea. Sure, the thing looks utterly laughable, but it also looks like something we would have laughed at as ridiculously over-the-top and implausible if we saw it as part of a depiction of the twenty-first century in a science fiction film from the eighties.

I live a painfully dull life. Few things could be more tragic to me than the thought that in the future, my insular, impoverished existence may be more experience-rich than that of most everyone else, as they’ll all be so accustomed to constantly having something to watch or listen to that they’ll never be fully present to anything they do in this enormously diverse world. The demands for constant entertainment passed the threshold of ridiculousness for me a long time ago. Will there ever come a breaking point when the rest of society agrees that the demand for distraction has outstripped the number of things there are to be distracted from? Or will we keep following the same trends until distraction itself becomes the entirety of our experience?

Thursday, October 27, 2011

In Defense of William Castle

I’ve had to spend an unusual amount of time with my mother lately. Yesterday, when we were driving together back towards my home, my mind frequently returning to the thought that it’s almost Halloween, I asked her if she remembered any particularly noteworthy horror films from her generation that I might have missed.

I have no idea what it will be like to try to remember childhood when I’m fifty-six years old. She sort of struggled to drag some old memories to the surface and did a bit of free association. She mentioned The Pit and the Pendulum, and got to wondering about the entire arc of Vincent Price’s career. I mentioned that lately when I think of Vincent Price one of the first films that comes to mind is The Tingler. Mentioning the title evidently opened up a flood of memories for her, and though she didn’t give much detail she seemed to vividly recall having watched the film in her youth.

My mother claims to have seen The Tingler in a theater, complete with the William Castle promotional gimmick. The film came out in 1959, so either she is manufacturing the memory, or she saw it in some sort of re-release, or else her older sister took a four-year-old child to an interactive horror movie.

If true, I am delighted to know that my mother got to have that experience, which must have been exquisite fun – at least for people older than four. I recall, perhaps a year ago, tormenting myself by reading the events list in the New Yorker and seeing that an East Village cinema was going to hold a showing of The Tingler that restored the gimmick, installing joy buzzers in selected seats and, presumably, planting professional screamers in the audience. Oh God, how things like that make me desperate to be in New York again. I would have loved to be part of such a wildly interactive cinematic experience. No one promotes or executes anything with such originality.

Am I the only one who genuinely admires William Castle? He seems to be widely laughed at by people who are knowledgeable about the history of film, horror or otherwise. From everything I’ve seen, his promotional gimmicks are remembered as little more than cheap stunts aimed at practically tricking the audience into buying a ticket. But to say the least, I don’t understand why his cheap stunts don’t stand up in most people’s minds against modern studios’ cheap stunts of peddling the same garbage to their customers over and over again.

I don’t think that William Castle had any illusions that he was the next Orson Welles. He wasn’t out to create cinematic masterworks; he directed horror films. And his aim in so doing was seemingly to offer the audience a unique thrill and an hour or so of escapist excitement. Towards that end he was marvelously original, and frankly I wish the popular cinema had taken its cues from him going forward.

Castle directed several 3-D western films for Columbia Pictures in the fifties when 3-D was all the rage. He must have been impressed with the notion of having the film interact with the audience, because he adopted his own various takes on the idea in his horror films later on. The studios had made bank on a fad, and as far as I know only Castle had the personal conviction to take the underlying impulse and adapt it in fresh and creative ways.

Considering the recent surge of new 3-D movies, and my sense that it betrays a lack of creativity among film studios and a myopic devotion to fads and groupthink, I believe the film industry would benefit greatly from a new William Castle. As a horror fan, I, for one, would much rather go to see a film and find a hearse parked in front of the theater, as was part of Castle’s first effort, than go there knowing that some of the images are going to pop out of the screen at me. Neither may be particularly scary, but the more original alternative at least aspires to establish an atmosphere that reaches past the space between the screen and my eyes, and gives me a reason to believe that I’ll remember not just the content of the film but the actual experience of going to see it.

In an era of cheap, ubiquitous DVDs, studios ought to be interested in giving people new reasons to go see a movie in theaters. And in an era of increasingly diminished interpersonal relations, a few gimmicks might accomplish that aim and have a positive social effect as well. When I imagine what a thoughtful promotional gimmick might look like today, I think that it would have the potential to make an original film into the sort of shared experience of fandom that is usually reserved for huge franchises like Star Wars. When I was between the ages of twelve and eighteen, I went to midnight showings of each of the Star Wars prequels. What was especially exciting about that was not so much the films themselves (obviously), but the tailgate party in the parking lot, the sense of community, the unusual awareness that everyone was going into that theater to expand upon an experience that had already begun for each of us.

If a modern horror film had a physical skeleton swinging over the audience, or allowed different members of the audience to see different things on screen depending on which glasses they were wearing, or let the viewers choose how the film ends while still watching it, as were all William Castle gimmicks, those in attendance would be aware of their relationship with the audience as well as with the movie, and that might give them a reason not to wait for the movie to come out on Netflix.

William Castle single-handedly made movies more than just pictures on (and sometimes leaping off) a screen. Why do we make fun of that? Why did we decide that it was an idea not worth revisiting for forty-six years and counting?

Tuesday, September 13, 2011

Killing Not Just Newspapers, But News

Nielsen released its report yesterday on how Americans spend their time online, and most of the extensive media coverage seems to be focusing on how popular their research shows Facebook to be. Apparently there was some doubt about that prior to yesterday. This study tells a much larger story than that, however. Focusing on the social media aspect of it seems like a strange bit of rhetoric, and an impulse to exploit the angle that news outlets assume will generate the most attention. Social media and blogs together comprised almost a quarter of people’s time spent online, but that was not the largest category. The largest remains the miscellaneous category, but let’s not pull punches here: it’s porn. The smallest share of time online goes to news, at 2.6 percent.

That’s a significant piece of information at a time when the internet is said to be killing newspapers, with even television media having a difficult time keeping up with the changing landscape. But if society as a whole is devoting only one fortieth of its time spent online to learning about current events, I wonder if that calls into question the assumption that traditional news media are failing because of competition from convenient, cheap, high-volume online sources of news. Other analyses have indicated that overall readership of established news agencies is in decline, not just readership of their print formats. It seems that it has always been assumed that this readership was dispersing to other sources from which it gathered the same volume of information that it used to consume, but I expect that that would be difficult to prove empirically. To me, these new numbers support an alternative interpretation: that people are opting out of information-gathering altogether, and that established news media are losing ground not to competition, but to distraction.

The existing narrative reflects what I think is an unfortunate and all too common perspective that all change is positive change. Letting that perspective go unquestioned allows us to sacrifice the best of what is currently available to us, either because the best of what is emerging is thought to outweigh it or because preserving anything against the onslaught of social or technological change is deemed a lost cause. The optimistic outlook on current trends in news consumption is evidently that there is a greater volume of reporting, a greater diversity of opinion, and a greater ease of access. That’s hardly all there is to the story, though. A greater volume of reporting doesn’t mean much if the sources of that reporting are devoid of the resources that might otherwise encourage a fuller investigation and a higher quality of reporting. A greater diversity of opinion is hardly progress if it reflects a devaluing of objectivity and a tendency of people to choose the sources of their news based on a preexisting agreement with the outlet’s perspective. Greater ease of access is barely significant if fewer people are choosing to access the most significant information that is available to them.

Of course, I don’t know that any of these trends are truly dominant. I am confident, however, that there is far too much optimistic assumption about the character of American audiences, and far too much dismissiveness and acceptance of powerlessness among those who might be in a position to effect positive change in consumer behavior. Much of the media seems content to fawn over social networking sites, curve their reporting on topics of much broader significance around a sense of awe at their popularity, wrongly declare them to be the drivers of foreign revolutions, and so on. The cultural position of Facebook, Twitter, and the like is crucially important, but I would love to see a lot more analysis of its causes and effects, and a basic willingness to criticize and resist.

As far as I’m concerned, the story to be taken away from the Nielsen report is not that Facebook holds irreversible cultural dominance, but that an enormous portion of the American public enjoys masturbation in its multitudinous forms, and hates information and critical thinking. And as much as that drives frivolous use of social media and a resistance to hard news, it also may inform the existing news media’s response to such trends, so that their diminished quality and misplaced emphasis drives nails into their own coffins.

Sunday, September 4, 2011

"If You Need Mommy, Leave a Voicemail"

I witnessed another instance of questionable parenting today. Fortunately, this latest scene is less distinctly shocking to me than yesterday’s example of parental instruction in amorality, but it almost makes up for that by involving not an independent-minded teenager, but a very small child, who is so much more prone to minute influences.

Saturday, September 3, 2011

"Stop Helping, Son"

I was having a meal this afternoon at a diner (okay, it was a Denny’s – I live in Buffalo, NY and travel solely by bicycle), when a couple of people at a nearby table caught my attention. A middle-aged woman and someone I assume was her son, about fifteen years old, had finished their meal and were about to get up to leave. The young man had his back to me, while I could see the woman in profile. The kid’s hands moved on the table in front of him as he wiped crumbs from the surface or stacked the plates, or something along those lines. I know this from his mother’s reaction, which was to reach across the table, snatch something from the young man’s hands, and command him to stop cleaning up after himself. From what I could see, she appeared to actually be taking things that he had gathered together neatly and scattering them back into their prior positions.

“It’s called waitressing, or busing,” I heard her hiss with genuine derision. “They get paid to do that.”

The young man protested delicately: “I’m just cleaning up my own mess.”

The mother began to get up from the table, aggressively pitching a used napkin into its center and gesturing for her son to follow her out. “Don’t do their jobs for them,” she insisted, repeating that “people get paid for that.”

So focused was she on willfully leaving a mess behind that she didn’t ever seem to notice me, off at her side, glaring at her openly, with fury in my eyes. Her son got up as she began moving past him, still being scolded and thus compelled to defend himself against what I think was the single most irrational verbal attack I have ever heard a parent level against her child. “I like to clean up after myself,” he reiterated.

Here was this adolescent child taking it upon himself to demonstrate a bit of personal responsibility, and his parent was actively chastising him for it, endeavoring to instruct him that it’s wrong to make something easy for another person if they’re getting paid for it and you’re not. Never mind that in this case they’re presumably getting paid less than minimum wage and relying on tips that, given the neighborhood, the establishment, and the arrogant disregard on display among certain customers, probably just aren’t there. And never mind that all that you need to do to improve their shift working at such a shitty job is run a napkin over a table and move a few pieces of dinnerware six inches or so. They’re getting paid to do that shit that takes absolutely no effort on the part of the customer, but quite a bit when you’ve been doing it every ten minutes for ten fucking hours.

I have encountered this sort of attitude many times throughout my life, in numerous circumstances. I still vividly recall arguing with a good friend in high school who routinely tossed his trash onto the floors of the hallways after school, insisting that it was okay because there were janitors that got paid to clean it up for him. As a matter of fact, he argued that he was providing them with job security by being lazy and filthy. He was a smart kid otherwise, so I give him the benefit of the doubt by figuring that that was probably just an ironic way of justifying his own self-centeredness. Then again, he also self-identified as a Marxist, which added a whole further level of necessarily unintentional irony. Being the principal’s son, the kid was from a decidedly upper-middle class background, and his adolescence created in him an identity that thoroughly grasped the theoretical concepts of equality and social justice, but failed at the task of connecting that to the very simple idea of people actively helping one another.

To this day, there is a special loathing in my heart reserved for these kinds of people – people who applaud themselves when it comes to the vague pursuit of social and political causes, and can speak loudly about them, and build their self-perception around them, but are perfectly willing to leave all the work to others when it suits them, or blame the victim when confronted with individual instances of disenfranchisement and inequality, or drive past a person who’s being attacked on the street.

Of course, in the case of the woman at Denny’s, I have no idea what her social views are. She might just plain not like poor people. She may just think that whatever pittance they’re making to clean up her shit, it’s too much, so fuck them and make sure their job is as hard as it can be. She may be that lovely kind of conservative who thinks that “personal responsibility” is just a phrase that’s used to criticize people at the bottom of society. In that case, here’s hoping that her son’s act of teenage rebellion in embracing liberalism and actually behaving with personal responsibility is not just a phase.

Friday, September 2, 2011

Against the Common Wisdom

I received an e-mail forward last night from someone who is on a mass mailing list for some sort of inspirational website. She sent the message to me, however, not in an effort to share a motivating, joyful sentiment, but rather in pursuit of sympathy. It was accompanied by an indication that she felt unfortunate to have received such a message on that day. I’m not really sure whether that is because my friend felt as though the universe was taunting her for a personal failing through the e-mail, or just because she didn’t get the lift out of it that she so earnestly needed on a day when she learned that she had not gotten a job that she wanted very much and was rather confident she would get. I do, however, know my own response to the bit of ostensible inspiration, and I know that it would be constant, regardless of the circumstances of the day.

Saturday, July 9, 2011

Socializing Online Without Wanting To

In last week’s New Yorker, there was an article about online dating, exploring its origins, its multiple iterations, and its widespread relevance in the modern world. The author, Nick Paumgarten, points out that “For many people in their twenties, accustomed to conducting much of their social life online, it is no less natural a way to hook up than the church social or the night-club-bathroom line.” This is certainly true to my experience. I see nothing unusual, shameful, or frightening about meeting a person through online communication, and I can perceive some definite advantages to online dating. But at the same time, I despise an excessive reliance on the internet for social exploration and interaction. I do not have a Facebook or Twitter account, and I steadfastly refuse to get drawn into any such trend, even though it is increasingly clear that the virtual ubiquity of these sites threatens to put me at a distinct disadvantage in some contexts.

Not that any external factors are necessary to put me at such a disadvantage. I’m just no damn good at meeting, interacting with, and relating to most other people. That may seem like the sort of characteristic that ought to push a person straight towards social networking technology, but I think that my resistance to it and my own social impediments are both grounded in similar aspects of my personality. I have high standards for my personal relationships and for the sort of people I interact with. I do not seek out casual acquaintanceships, and the fact that I desire a strong element of earnestness and commitment in even the most basic friendships evidently makes me intimidating at the outset of any social interaction. It probably goes a long way towards explaining why people who are purportedly very fond of me and very interested in me never seem to call me on the telephone, even when they themselves bring up the subject of further contact. I think I appear inaccessible, and that that makes people uncertain of how to reach out to me and secure my interest when we are not meeting in passing. And when we are not, the difficulty is that one or both of us must put forth some serious effort at making a connection. Not so with online communication or text messaging.

Wednesday, June 15, 2011

Educational Recession

David Sirota had a piece on Salon yesterday, in which he claimed that two distinct camps seem to be emerging in the debate over future education policy. And Sirota firmly sides with one of these camps. The side that he privileges and takes to be uniquely supported by research data claims, in his words,
"that larger social ills such as poverty, joblessness, economic despair and lack of health coverage negatively affect educational achievement, and that until those problems are addressed, schools will never be able to produce the results we want."
Those on the other side, Sirota says, "want to radically change (read: charterize and/or privatize) public education under the premise that the primary problems are bad/lazy teachers and 'unaccountable' school administrators."

Tuesday, June 14, 2011

For Only....

I was cleaning up my apartment a little today, and I came across an old advertisement that I had received in the mail from Heifer International. Having ever in your life subscribed to a newspaper or expressed interest in a political cause seems to mark you as someone who probably has disposable income, so I have gotten a number of donation requests from charitable organizations. Virtually all of them strike me as worthwhile causes, and it upsets me each time that I simply do not have the money to donate to them. But Heifer International stood out very prominently among them.

Beneath the front cover, the pamphlet announced that a gift of five hundred dollars could buy a heifer for a third-world family. With compelling copywriting, it explained that one dairy cow could produce as much as four gallons of milk per day – enough for an entire family to drink and share with its neighbors and still have enough to sell. It continued to point out that by producing calves for other families, one cow could be instrumental in moving the whole of a community out of poverty. These extremely bold and shockingly plausible claims made me feel more shame about my persistent lack of financial security than I had ever felt before.

Five hundred dollars is no small sum of money, but it is very definitely an amount that is available at the end of the month for those people we generally conceive of as the default American citizen. The median household income in the United States in 2009 was over 50,000 dollars. I’m sure that to some people that does not seem like a tremendous amount of money, but I’m equally sure that some people have no idea how spoiled they can be. Considering that people around me routinely support themselves with annual incomes well under 20,000 dollars, there is no doubt that more than doubling that should leave a great deal left over, if only the household didn’t consequently strive to extend its standard of living well beyond what is prudent.

The awareness of what five hundred dollars could do for a third-world family, or even for a working-class American family, painfully reminds me that we collectively have the means at our fingertips to solve the problems that are surrounding us every day. Five hundred dollars is presently a huge sum to me, but apparently I am atypical of the American experience. Why, then, does that typical American not hold himself to a higher social standard? Am I deluding myself when I think that I would sooner buy a heifer for each of a hundred families than buy a house when I already have a rented roof over my head? Or does all of this just suggest that the wrong people are well-off?

Is there any way to bring people to a breaking point in their view of financial entitlement, short of standing them face-to-face with a malnourished family and asking them if they’d still rather hold onto that five hundred dollars for their next home theater upgrade? And would even that not sway some people?

We have the means to solve these problems. It’s in our hands. It’s in our hands, but we’re clutching it tight for ourselves, instead of putting it to good use.

Tuesday, April 12, 2011

Superimposed Image

I’m feeling drawn towards slightly superficial topics tonight. For someone who strives to be highly cerebral and to actually eschew superficiality, I have a strange fascination with fashion and beauty. I can justify some of that as philosophically grounded, because I think aesthetics is a truly intriguing subject of inquiry. But part of it has to do with the fact that I met a girl some years ago who was highly interested in fashion, mostly as an extension of her interest in art, and she single-handedly robbed me of much of my derision. Now it’s just another part of culture that I take pleasure in analyzing and picking apart.

Sunday, April 10, 2011

Not Always Black, but Always White

The front page of the Sunday edition of the Buffalo News prominently read “Minorities – the New Majority.” Now make no mistake, I don’t make a habit of reading that terrible little newspaper; I just happened across it on this occasion. I don’t read it precisely because it doesn’t take much more than one of their headlines to launch me into a diatribe about their thoughtless reporting, bias, or simple bad journalism. Speaking of which…

It’s not at all unusual for the Buffalo News to run a headline like the above, apparently without anyone on staff raising an objection about the obvious contradiction they’d placed top-center on the first page. But what’s altogether more frustrating than that is that exactly that same oxymoronic reference to “minorities” seems commonplace in the media in general, and in much of public discourse.

How powerfully consumed by our cultural biases do we have to be that we never pause and think, “Wait, if they constitute a majority of the population, why are we calling them by a term that means exactly the opposite?” It seems to me that that’s a natural question, but I’d emphasize that even if more people had the common sense to ask it, they still wouldn’t be asking the right question. A better question would be something along the lines of, “Wait a minute: why are we only calling non-white people minorities, if white people are now in the minority?”

If you think about it for a second, you realize that identifying minorities as a collective majority requires separating all of society into exactly two distinct groups: white people, and everybody else. The fact that the hasty editors of news outlets like the Buffalo News don’t bat an eye at such a move goes to show that much of media, and much of the public dialogue throughout white America identifies the default human being as white, and sets everything else in contrast to that.

There is no statistically valid reason for deciding that blacks, Hispanics, Asians, and Native Americans constitute one group, termed “minorities,” while Caucasians make up a second group, which is not labeled as being in the minority even if its share of the population is substantially under fifty percent. The only reason there is for such a move is an ingrained cultural bias. It’s the sort of well-intentioned, socially liberal racism and shortsightedness that leads people who are reflective, but not self-reflective, to champion causes of social justice and equality, without ever addressing the most crucial racial and cultural problem of all – the social tendency to actually look at one kind of people differently than one looks at absolutely everybody else.

Wednesday, February 2, 2011

Buffalo Winters

After weeks of news about severe winter weather throughout much of the east coast, Buffalo has finally been struck with the first snowfall that my high standards will accept as extraordinary for the region. It was about seven-thirty when my friend left my apartment tonight. A few minutes later, she called me to apologetically ask that I come help her get her car out from where it had been plowed in, a street away from me.

After I put on my coat and traction-less work boots and ran to the scene with my laughably small snow shovel, she would not stop either thanking me or apologizing. But she’s my friend. The idea that such a simple, obvious, and necessary favor would in any sense be a burden or inconvenience is just absurd. I had to remind her that it is truly my pleasure to help a friend in need. In fact, if she had been a complete stranger, and I’d just happened to be passing by while carrying a spatula, I would have been undeniably eager to leap to her aid. That’s the way we all are in this town, isn’t it? I spent every winter of my childhood hearing the reinforced narrative that the winter weather in Buffalo does wonders to bring to the fore the friendliness and compassion of the local population. When the going gets tough, we all pitch in and help one another.

Bullshit. When I finished digging out my friend, she put the car back into park for a moment and climbed out, saying, “I have to give you a kiss now.” As she came close, I looked over her shoulder and then turned to her to whisper before she drew back, “You know, they say Buffalo is the City of Good Neighbors?” She laughed and pointed out that she had been thinking exactly the same thing. While she had kept shifting gears and easing on the accelerator, I had scraped at the snow in a way reminiscent of scooping ice cream with a thimble, occasionally stopping to try to dig my piss-poor footwear into ice to singlehandedly push the car by its fender. All the while, she and I had both repeatedly taken note of the five or six people standing within twenty-five feet of us, evidently all together, some of them working with two colossal snow shovels, others just standing nearby or seated in a car across the street.

Another set of hands or a better tool, and we could have had my friend’s car out of its trap in thirty seconds flat. I didn’t mind undertaking the task alone, but it just seemed to me that there was something almost aggressive about the neglect of someone so near at hand with such an easily fixable problem. No one else even needed the least bit of assistance, my friend being the only one on the street who had not taken advantage of the local church to park in its lot. Is this the famous Buffalo neighborliness? Do I just happen to repeatedly run into the rare exceptions to the rule, or does the self-image that the city repeats like a mantra just not hold up to scrutiny?

I could springboard from here into pages upon pages about the empty-headed optimism that locals have about this area, and perhaps I will in fact go on about it in the near future. But for now, suffice it to say that on points of economics, cultural progress, and certainly the moral character of the population, proud Buffalonians often follow the same trend of convincing themselves that it is a terrific place by focusing exclusively on the positive. Frequently, it seems, people with such a doting perspective think the town’s virtues stack up nicely against those of other American cities for an equally ridiculous reason: they’ve never lived anywhere else.

I have. I’m not a well-traveled man, but I’ve lived in New York City and in Boise, Idaho, and I’ve spent sufficient time in Washington D.C., Arkansas, Oklahoma, and Seattle to have gotten a fair sense of the character of those places and their locals, and in the many years since leaving Buffalo and slinking back, I have never for a moment come under the impression that the residents of this city are somehow kinder, more generous, more outgoing, or otherwise better neighbors than those in even one of the other places I’ve experienced. Not even of Manhattan would I say that it is less neighborly than Buffalo, and Manhattan has a reputation for being full to the brim with rude bastards.

The stereotypes reported for both sides of the state are equally unreliable, and probably for interestingly contrasting reasons: Buffalo’s designation as the City of Good Neighbors was evidently a creation of local residents striving to build up the place in their own minds and in those of potential visitors. New York City’s reputation for being brusque and unfriendly is no doubt traceable to the reports of tourists who do not know how to integrate into the pace and local culture of the metropolis. I would hazard to guess that stereotypes describing virtually any city are deeply flawed, and for the same simple reasons. People have a habit of making the best of the situations that are familiar to them, and in some cases inescapable. We structure our sense of value around the place and the circumstances in which we were raised, and if we are not sufficiently socially adaptable, being confronted with the unfamiliar is functionally no different from being confronted with the immoral. Speed and directness can seem aggressive if you’ve always been familiar with a slower pace. And where we have our own sense of value, but also find ourselves tied to a particular place, we might be inclined to find examples of those values in our local circumstances, and conclude thereby that they are representative of the place where we live.

That tendency is a great coping mechanism, but it is not a great representative of reality. It is, instead, an endemic problem of self-delusion, which is present everywhere, but for which I see particularly poignant examples in the way people who love Buffalo talk about Buffalo. In that specific case, I eagerly wish for that breaking point wherein some lifelong Buffalonian sees one of his good neighbors look over at him as he struggles with something and say “Nah, fuck that guy, he can take care of himself.” But in broader terms, this topic speaks to the social breaking point that I will spend my life looking toward and trying to provoke at every turn. We must, all of us, stop accentuating the positive, stop looking on the bright side and convincing ourselves that, heck, things are really pretty great around here. I want to write a cynic’s manifesto to convince people to look into the shadows, acknowledge every flaw in the social fabric around you, and then begin agitating for change. There must come a breaking point at which we decide that we can no longer swallow any more saccharine, that we’re too dizzy from constantly turning away. For the simple fact is that there is no solving any of our problems when coping with them means forgetting they exist.