
Friday, October 12, 2012

Everyone Look at the Ignorant People!



I just happened upon a clip from Chris Matthews’s coverage of the supporter gatherings prior to the Vice Presidential debate.  It is not enormously significant, but it is a delicious bit of video, which I have an irresistible urge to comment upon.  The roughly one-minute clip begins with Matthews interviewing a random Obama supporter.  Just as he asks her about her health care situation, an old woman interjects from off camera by shrieking the word “communist!” in a voice that would have made it notably fitting if she had followed up with, “burn him!”

Everyone in frame reacts to the shout, but the woman being interviewed shakes it off and takes a few seconds to explain that she and her husband had recently lost health insurance for the first time in their lives.  Chris Matthews lets her finish her answer, but the speed with which he departs when she reaches the end of her sentence suggests an almost Pavlovian response to the shrill voice at the edge of the crowd.  He lowers the microphone immediately and says, “Okay let’s go over to this lady,” whereupon he seeks out the person who yelled communist, in order to ask her what she meant by it.

What follows is a stunningly awkward exchange in which Matthews asks the woman exceptionally unchallenging questions, essentially just repetitions of “what do you mean?” and she repeatedly fails to answer them, instead chiding the professional journalist and commentator to “study it out, just study it out,” derisively referring to him as “buddy,” and asserting that she knows what she means.  It would be painful to watch if I had any inkling that the woman had sufficient self-awareness to be embarrassed by it.  It would be hilarious if it weren’t such a tragic commentary on the state of political discourse.  Watch it if you like:



Obviously, our culture and systems of information need to be reformed enough to precipitate a breaking point whereby nobody can remain so self-satisfied in their own ignorance as this woman showed herself to be.  Her willingness to gather at a political rally and shout her views on national television suggests that she is firmly committed to them, but even in the space of a minute, her complete inability to explain or defend those views paints the image of someone who has absolutely no idea what she’s talking about, but also doesn’t care that she’s not informed and doesn’t think she has to be.

I watch this woman wag her head at Chris Matthews and pause at length before shooting back, “You don’t know?” when asked what she means by “communist,” and I see someone who believes that in the face of any challenge to their worldview, a self-righteous attitude eliminates the need for facts and rationality, every time.  It is indicative of a sociopathic mindset that takes confidence and strength to trump all else, and that mindset seems to be breeding extensively in the modern population.  That in turn is indicative of a serious cultural failure in America, though unfortunately one that is nearly impossible to overturn.

Far less difficult to attain is the personal breaking point that this clip seems to point to, though I must admit that I don’t know which side of it I ought to come down on.  In watching the clip, the thought almost immediately crossed my mind that maybe this woman was some sort of amateur satirist aiming to portray the Republican opposition as deluded and irrational, or even that she had been planted there by some group on the left.  I entertain those thoughts because, as with most conspiracy theories, they are simply easier to believe than the frightful reality, which in this case would be that America is long on individuals who form firm, aggressive opinions on the basis of little more than ether and bullshit.

I know that my skepticism about public ignorance is unsustainable.  Indeed, I know that it can be harmful, because it’s a sort of ignorance in itself.  Fundamental to my personal philosophy is the idea that you can’t hope to effectively solve a problem if you deliberately avoid recognizing the reality and extent of that problem.  Public ignorance is the problem at the root of all other problems, because it is that which allows people to avoid reality, and thus deny solutions.

The problem here is that I don’t know whether I should be pushing myself towards the breaking point of taking public ignorance for granted, or if instead I should find a way to keep from assuming that conspiracies are afoot while still giving individuals the benefit of the doubt as regards their level of information.  In other words, one might say that witnessing ignorance of the proportions on display in this clip challenges me to avoid two negative breaking points, which threaten to make me either overly cynical about human stupidity or overly cynical about political manipulations.

I’d venture to guess that not a lot of people have carefully-reasoned assessments of their fellow men, so this is a personal breaking point that others may have to contend with as well, but being personal, it’s of secondary importance.  What this video clip has brought to mind that could be addressed on a large scale right now is a question for the media about how to handle firm opinions voiced by the public.

I honestly can’t decide whether to praise or criticize Chris Matthews’ response to the political heckler.  Part of me wants to criticize just because I used to get a lot of enjoyment out of focusing my ire for the news media against Matthews, who, despite being a bright guy, was terrible at his job back when I considered MSNBC a news organization.  Now that his job is “partisan” rather than “journalist,” he doesn’t seem so bad.  Okay, it also helps that I don’t have a TV.  But in any event, even if Matthews remains professionally an idiot, the woman he had his brief exchange with is an idiot in much larger terms, and to an unquantifiably greater extent.

The relevant question, then, is, “Did Matthews have good enough reason to focus the attentions of the microphone and camera on this woman’s dimwittedly vociferous views?”  On the one hand, by giving her a voice once she’d asked for it, and contributing no commentary of his own, Matthews allowed the woman to provide her own refutation of her talking points.  The exchange conveyed the impression that extremist views are based on no information, which of course they often are.  That’s a good fact to put on display when the opportunity arises.

On the other hand, we have to remember the shamelessness with which the old woman held her ideas in absence of evidence or personal understanding.  Such shamelessness probably isn’t much affected by having a mirror held up to its own ignorance, and that fact threatens to let this incident stand as encouragement for other people like her.  As I said, the greatest breaking point involved here is also all but unattainable: the creation of a culture that prevents the embrace of ignorance.  For the foreseeable future, lack of information and presence of strong opinions will continue to go hand-in-hand among a sizable portion of the American public.  It will take generations of concerted effort to change that fact.  But that doesn’t mean that opinionated idiots will always be activists.

I estimate that much less comprehensive cultural changes could prevent people who hold uninformed opinions from being so vocal and so public with those opinions.  And one thing that probably doesn’t help is giving voice to those opinions, in all their self-righteous vacuity, on national television.  Viewers at home whose perspective on American politics doesn’t go much farther than “he’s a communist!” won’t be shamed or enlightened by their impromptu spokesperson’s self-defeat, just as she wasn’t shamed or enlightened by it.  To the contrary, the presence on the airwaves of uninformed declarations and accusations provides more fodder for lazy people to find something to parrot as they make the leap from uninformed citizen to armchair activist.

The opinions that are screeched from the sidelines are the ones that most need to be debunked once they’re present, but they’re also the ones that most need to be disallowed from taking the field.  Overall political discourse is cheapened not only by the ignorance behind them but also by their lack of decorum.  As regards ethics, I think I am so committed a deontologist that I have internalized Kant’s categorical imperative.  When I see things like this video clip and start wondering what ought to have been done in the situation, I find myself universalizing the act I witnessed and looking for its effect on the moral system.

In this case, what would the effect be if journalists always turned their attention to the loudest and most abrasive commenter on the scene, as Matthews seems to have done?  He even turned his attention away from the woman who was contributing relevant anecdotes to the public understanding, in order to give the shrill, ancient cold warrior a chance to explain her unexplainable views.  I fear that the current state of journalism is not far from always embracing the loudest participant in any debate.  Universalize that choice, and all of American politics becomes a shouting match, which is seemingly not far from the situation that we already face.

In light of that threat of a still more corrupted political and journalistic landscape, I’m tempted to say that although the woman’s response was rather satisfying, the better thing to do in that situation and all similar situations is to keep the person who’s shouting epithets off of our television screens.  But I’d be interested to know what readers think of the effects of either encouraging or discouraging uninformed speech.

Tuesday, September 4, 2012

At Cultural Attractions: Parents Don't Teach, Children Don't Learn

The Buffalo Zoo celebrated the traditional last weekend of summer by offering a ninety percent discount on admission on Labor Day. Since one dollar is something I can just about afford on a good week, I took a holiday-morning bike ride around Delaware Park and then queued up with the mass of people, mostly families with small children, who had just as readily sprung at the opportunity for a cheap cultural activity.

Considering the lines at the gate, I was surprised that the scene inside was not as claustrophobic as it could have been. It took a little jostling or waiting in the wings to get a proper angle, but everyone seemed to get their opportunity to look at the cute, or fearsome, or comic animals. I freely admit that I was mostly there just to take another look at some of my favorite creatures, to watch the polar bear swim in its artificial pond, far from the threatened environment of its natural-born fellows, to grin down on the docile capybaras lounging in the rainforest exhibit, to rediscover my respect for the vulture which I discovered when I wrote a report on the species in elementary school, to look for big cats pacing like in Rilke's description of the panther.

But even though this excursion wasn't exactly intended as a fact-finding field trip, I never go to a museum or zoo or aquarium without trying to learn something about the stuff I'm looking at. Not a heck of a lot changes at the Buffalo Zoo from year to year, and I think I had been there about a year ago, so it's not as if I could have expected to discover an animal whose existence I was altogether unaware of. But there's only so much I can commit to memory, so naturally I find myself rediscovering things on subsequent visits to the same places of learning. I always seem to forget, for instance, that the Rocky Mountain Bighorn Sheep are capable of running at up to fifty miles per hour. The upside of my disappointment at not retaining encyclopedic recollections – a failure that seems to become ever-worse as I age – is that I sometimes get to re-experience the joy of learning something interesting all over again.

Even if I don't read all of the wildlife facts, of which there aren't even that many at the Buffalo Zoo, I do at the very least try to get the names of the animals right. This is more than I can say of the vast majority of the other patrons that I encountered yesterday. It having been a year since my last visit, I found myself trying to actively identify each species, endeavoring to commit to memory the ones that escaped me this time around. This is natural to me, and I thought it was part of the essential purpose of going to the zoo. I always took it to be a place where you went not merely to look at animals as in a menagerie, but to find out something about the wider world by discovering what they are and from where they come. I especially thought that that was why parents took their children to the zoo. I'd always assumed that it was meant as a supplement to a child's primary education, a way to instill curiosity and gauge the direction of nascent scholarship. Apparently I was quite wrong about this as well.

Most any time that I go to places like zoos or museums and find myself crowded by children and their adult chaperones, I am downright shocked by the lack of interest that parents have in conveying any information whatsoever to their charges, or even in encouraging those children to learn anything on their own. I fear that my disdain paints me as a killjoy and that the average reader will see me as attaching far too much significance to the conduct of people who are on a simple, light-hearted family outing. But that's just the trouble. I worry that people attach entirely too little significance to such everyday opportunities to influence the character, values, and perspective of impressionable children.

As much as Americans today recognize and lament the widespread failure of education and the failure of modern children to live up to appropriate standards, I think commentators and individual parents are too much inclined to see that failure as institutional and too little inclined to consider it as social and cultural. If the behavior of parents at zoos and museums is indicative of their broader attitudes, it suggests that people have widely forfeited the recognition of personal responsibility for the education of their own children, instead handing that responsibility off to schools as if the process of raising an intellectually astute and ambitious child is something that can be consolidated into a specific set of hours in specific locales.

If that is indeed the view – if the need for education is recognized, but only recognized as being needed somewhere outside the home – then I can only conclude that people don't really value education at all. That is, they don't value education as it ought to be valued, for its own sake, as both a public and a personal good. You can't expect children to learn well and perform at a high level in school if the culture that they're coming up in is one that portrays education as a sort of obligation and something that brings good things to the learner, but is not good enough in its own right to be worth pursuing in absence of the social obligations of homework and exams.

What else can I conclude from regularly observing that perfectly middle-class parents, far from exhibiting much intellectual curiosity of their own, don't even respond to the intellectual curiosities of their own children? But perhaps that's a little unfair. At the zoo yesterday I did find one or two adults expressing curiosity to the extent that they pressed their faces to the glass and perplexedly asked of no one in particular, “What is it?” They just didn't express a great deal of interest in actually doing anything to satisfy their curiosity. They just couldn't be bothered to walk back two feet in order to read the damn nameplate.

This is entirely their own affair when the adults are on their own and solely responsible for their own edification or ignorance. But it gets under my skin when their own lack of care for finding answers threatens to be transmitted to a child who is still blessed by wide-eyed eagerness to comprehend the world around him, whatever aspects of it should set themselves before him.

Just a few exhibits down from where I heard one unresolved ejaculation of “What is it?” I found myself looking at another glass enclosure that housed three wallabies crouching at the back of their habitat, when a family walked around me to look at the same. It consisted of a couple with a daughter just barely of speaking age and a son perhaps six years old. The parents looked, glassy-eyed, into the scene while the boy excitedly called out “kangaroos!” I had started moving away from the exhibit, but noticing the boy being met with silence, I said simply “wallabies,” partly in hopes that his parents would hear me and realize, if they did not realize it on their own, that their son had made a reasonable but slightly mistaken assumption about what they were looking at.

However, I was essentially met with silence, too, except in that the boy, perhaps hearing me or perhaps just seeking acknowledgment from his parents, repeated “kangaroos.” Noticing that they weren't going to say anything and that their eyes had apparently still not passed over the signs that clearly stated the name of the species, I repeated, with the boy more specifically in mind, “wallabies.” Now looking squarely at me, and inquisitively, the boy again said “kangaroos.” It could not have been more obvious that the child was interested in being corrected. He wanted to learn, as most children do when simply presented with the opportunity. This child was young, but most likely old enough to sound out the word “wall – a – bye” if he knew where to look, and if he was made to realize that he didn't know the answer without looking. But to do that, he would need an example to follow, a pair of parents who had the tools to find out answers for themselves, and cared to give their children the same.

The child looking to me instead of his parents for that meager bit of instruction, I addressed him directly, explaining, “No, these are wallabies. Kangaroos are big; these are smaller.” And at that he turned to his parents and his younger sibling to repeat it to them: “These aren't kangaroos, the man says.” By then I was walking away, and I can only hope that their son's claim finally prompted them to look at the sign and sound out “wall – a – bees.” It was up to them to take an interest on their own, but it seemed to me that the child, being a child, not only wanted to know about these things in the zoo, but wanted others to know about them too.

I experienced the same thing elsewhere. In the crowded rainforest exhibit, I, being a nerd, spoke straight to the capybaras, telling them that I just wanted them to know that they are the largest rodents on Earth, and that that's awesome and they should be proud. A young girl just beside me asked, seemingly of no one in particular, "What are those called?" It could be that she heard me demonstrating some knowledge of them and figured that I had the answer, or it could be that she, like so many young children, thought her parents would have all the answers she sought.

She had not spoken straight to me, and that being the case, I would think that a scientifically interested parent, one familiar with zoos, would say something like, “I don't know, let me look at this information card over here so we can find out.” The parents did not move, of course, so I turned to the child and told her, “Those are called capybaras.” Naturally, she then looked back to her parents and sought to inform them of what they did not inform themselves: “They're called capee-bears.” The parents did not repeat the information; they did not move to confirm it or commit it to memory; they did not give her any indication that she should feel proud of having learned something, that she should be thankful for the knowledge, or that she should seek to learn other things as well.

The desire to learn is so natural and so passionate among children. How poorly we must regard it as a society that students evidently end up so thoroughly dissuaded from eager learning long before reaching the lower threshold of adulthood. What standards can we possibly expect students to meet if we handicap them in all the faculties that might prompt them to aim above the mark? If this culture persists, the most likely solution is simply to expect less of students, as has already been the defining feature of decades of devolution in higher education.

In the future of this culture, we may as well just rename familiar animals to match the absent understandings of parents and their children. Having been to a couple of zoos and aquariums in recent years I've found that as far as doting children and intellectually incurious parents are concerned, every lemur is called King Julian and every clownfish is Nemo. This really aggravates me. My best friend is terrifically fond of the Niagara Aquarium, so I have gone there with her on several occasions. Upon every visit, without fail, one can hear at least half a dozen parents exclaiming, “All right, let's find Nemo,” or, “There's Nemo.” I think I've heard the word “clownfish” used by a parent to a child exactly once.

I have no doubt that some of these parents are just lazy and find “Nemo” easy to remember, but I warrant that a number of them may have good intentions. They're probably trying to use pop culture as a way to facilitate their children's interest in the natural world. But there's more than one reason why this is misguided. For one thing, having been to the aquarium several times, it's clear that children don't need some secondary point of reference in order to take an interest in the natural world, because the natural world is terrifically fascinating. And that's especially obvious when you're a child.

So using an animated film as a way of connecting with an aquatic exhibit is extraneous, but far worse than that it obfuscates children's understanding of what they're actually looking at. It disregards the separation between fantasy and reality, it suppresses knowledge of the actual species name, and it encourages children to understand the creature through an individual depiction and not through objective facts. And then on top of all of this, for many families the fixation on something that is recognizable from fiction overrides the significance of everything else that's on display. People walk in the door and say, “Find Nemo!” and they breeze through ninety percent of the aquarium to get to something that won't teach a child very much that he doesn't already know. If they didn't immediately put that idea in his head, they might be astonished by how much he doesn't care about the clownfish once he's seen the solitary-social penguins, the balloonfish with their glittering eyes, the sharks skulking past viewing windows, the challengingly camouflaged rockfish, and so on and so on.

When parents almost thoughtlessly constrain the purpose of visits to zoos and aquariums and museums, they probably think, more often than not, that they are doing it for the benefit of their children, that they are moving to retain a young attention span and provide its owner a quick shot of enrichment while they can. In fact, I think such parents and caregivers should consider that they might have it all backwards and that the feelings of stress and impatience are all their own, and merely projected onto their children. They should concern themselves less with what their children are looking to get out of the experience, and more with what they themselves are after. If the answer isn't “knowledge, and lots of it,” they can probably expect much more of their children's interest in the moment. But they likely won't be able to go on expecting it as those children age in the presence of a society that doesn't care particularly much for learning.

Wednesday, August 29, 2012

Employer Culture

I recently applied for a job in Wyoming. It was an entry-level reporting position in a small town, and it was advertised via an unusual posting that seemed to encourage a unique cover letter from me. I delivered that, received a response that may or may not have been a form letter, and, on its request, replied with a confirmation of my sincere interest in the position.

The original ad put more emphasis on the setting of the job than on the job itself, and the response really drove that home, emphasizing that the remote location was “not a romantic getaway by any means,” which “might not suit everyone.” My cover letter clearly outlined how I had always hoped to live and work in a remote location after graduating from college in the big city, and that the job seemed perfect for me. In my confirmation of interest, I disputed the notion that it wasn't a romantic getaway, and made it clear that in any event it was a place I could see residing happily, especially if I had a career to build upon there.

The editor sent a form letter to all still-interested applicants to the effect that she would have more time to go over the applications after a specific date. A week after that date she wrote to me directly to confirm that I was not to be interviewed, and in that brief message, she emphasized yet again the apparent insecurities of her entire organization regarding its setting, and explained that she had found someone who she thought would bring a lot to the paper while also enjoying the surroundings.

When I actually hear back from no-longer-prospective employers these days, I am no longer shy about pushing them to the limits of their patience in pursuit of explanations, and in this case I was really confused. I wrote to ask her if I had somehow given the impression that I wouldn't have been able to tolerate living in the sort of remote region that I had just used two sincere letters to explain that I specifically wanted to live in. She kindly pointed to a specific line in my second message. This was the comment that sunk my application:

Speaking more generally, I'm not so concerned with what the job or its surroundings can bring to me, as with what I can bring to them.

Am I crazy for being nonplussed by her reaction? That line came after two solid paragraphs of explaining why the job and its surroundings appealed to me, which followed upon an entire prior letter of the same, and yet all of that was apparently wiped from this editor's short-term memory by my decision to make the point that my values make me more interested in doing a perfect job than having a job I consider perfect.

I can't interpret this in any other way than that I was refused an interview for yet another job that I would have done fantastically well because I was insufficiently selfish. The briefly-prospective employer has given me the distinct impression that the job went to somebody whose application placed more emphasis on how much he wanted someone to give him that job, and less on how well he would perform its duties.

It's another example of the seemingly backwards hiring practices that have been dogging me for six goddamn years, and I took the opportunity to press this person on it, writing back:

I've gotten a certain impression many times over from people responsible for hiring. In your capacity as such a person, which goal would you rank ahead of the other, if you had to choose between them? 1) Finding someone who will do the best job. 2) Finding someone who is least likely to leave the job.

I give her a lot of credit for having been so communicative with me overall, but her response to this question was pathetic:

It depends. I try to find a good balance between the two.

Did I not make myself clear? I know she tries to find a good balance between the two. What I asked was which one was more important, and she simply dodged the question, avoiding any acknowledgment that there is a fragile value system at play in hiring practices. And though I can't wrest a confirmation of this from anyone in a position to give it, I consistently get the impression that human resource departments and hiring managers are interested in finding people just good enough for the open position that the company won't have to do anything to keep that employee on board, because they'll probably never get a better offer.

Other people that I've known have been crippled in their job searches by this employer culture, as well. Acquiring more qualifications often seems to harm job seekers more than it helps – such as teaching at the college level when one is looking for a career in early childhood education. It's evidently not worth taking the risk on hiring a good educator, a good writer, a good anything, if there's a good chance that their ambitions extend beyond the position one is looking to fill.

Obviously no one has admitted to this outright, but this most recent editor rather distinctly suggested it. Her rejection of my application was phrased so as to directly contradict the line that sunk my application, the one in which I said it was most important to me that I bring value to the organization that hires me. She wrote, “The job and its surroundings are to me much more important.”

Much more important than what? Than the person you hire being a good worker, a talented writer, a committed journalist, a person of decent character? All of that takes a backseat to believing that the job and its surroundings are exactly what the applicant wants and that nothing will tempt him away from whatever you're to offer him?

Anecdotal evidence doesn't count for much – you can always find some example that supports what you believe about the world – but at the same time that I and others I have known seem to absorb the damaging effects of these employer practices, I know of one person who appeared to be decidedly on the good side of them.

My ex-girlfriend never graduated high school, having gotten a GED instead. When I met her, she had been out of work for longer than I had. During the time that I knew her, she routinely quit jobs without notice. I later found she took the same approach to relationships – find something better, sever ties immediately. Despite the fact that her resume didn't suggest impressive qualifications and the fact that she probably didn't have great references from prior employers, she had little problem walking out of one job and into another.

Why on Earth was she capable of being hired immediately, whereas if I applied for the same jobs my resume would be rejected without so much as a phone interview? The only logical conclusion I can come to is the same observation about employer culture. I can easily imagine hiring managers looking at her past history and deciding, “this girl doesn't have a lot of prospects in front of her; we'd be offering something that she should be truly grateful for.” They may have been wrong on both points, as to her gratitude and her future outlook, but her mediocre resume gave them good reason to believe that hiring her wasn't a gamble.

With every job I've had, my managers have regarded me as having a work ethic that exceeds that of my coworkers. My performance and responsiveness to training have been roundly praised. The one time in my life that I got to work in an office, I received a year-end bonus that exceeded that of the person who had been promoted out of my position, even though I had only been there for six months. Despite all of this, actually finding a job is damn near impossible for me. I don't have a bit of doubt that I would perform the responsibilities of any job that I applied for with more competence and conviction than just about anyone competing with me for it. But I'm nearly as confident that that's not primarily what employers are looking for.

Of course, it could be that I'm taking too positive a view of myself. It could be that I'm just a terrible applicant. But I'm not about to assume that explanation in absence of evidence for it, and I'm certainly not getting any from the sorts of employers from whom I'm seeking jobs.

Prior to applying for this job in Wyoming, I was rejected without an interview for another one that I was even better qualified for, and which was also out of my area. When I asked why, the editor did see fit to get back to me, but her response was utterly meaningless on the question of qualifications. She said only that the person she hired "had what she needed." But she also pointed out that he had grown up in the area of the job, so I rephrased my question and asked whether, if I'd had the same qualifications I do now but had grown up in that region, I would have been at least interviewed.

Her response still makes me angry, and I expect that it will for as long as I struggle to have a legitimate career before the end of my twenties. She wrote back with one line: “Ed, I'm sorry. I'm not going to break it down.”

I had asked a straightforward yes-or-no question. I was looking for some indication, even if perfectly vague, as to whether my inability to secure a simple interview was attributable to being underqualified, overqualified, or simply having qualifications different from those that match the sorts of jobs I apply for. I didn't ask her to answer to any of that, though. All she had to do was say “yes,” “no,” or even “maybe.” To do so would have taken less effort than it took to type what she did.

To date, I can't conceive of any reason why she would respond that way, other than to be deliberately rude. This is my entire life we're talking about, and all that a person like her needs to do to give me a little more insight into why it remains so far off the rails is to say either “yes” or “no,” and she couldn't even do that.

I guess in light of that I should feel very pleased with the Wyoming editor for putting forth the effort to dodge my question in a way that at least seemed like an answer. Maybe that counts as progress.

Monday, August 6, 2012

Aging in Buffalo: A Personal Invective


I turned twenty-seven on Friday.  I know that most everyone has the experience of reaching an age at which birthdays cease to be causes for celebration, but I don’t think so many people find them to be the cruel reminder of lost time that they have been for me roughly since I became a full-fledged adult.  That is, if I could ever be called that in the first place.  I’m sure that by some people’s standards, I never grew up.  I’m inclined to agree; I’m just not inclined to blame myself.  That’s why birthdays are so awful.  They remind me of the speed with which time is marching on even as I remain stuck firmly in my place.

It’s interesting to be a resounding failure starting in your very early twenties, and an educated, ambitious one, who simply never had the chance to even screw up an opportunity.  It’s interesting to see the evidence of that failure every time you look out your front door on a hateful city that you never thought you’d have to return to, but then were never able to leave.  The Buffalo that I see every day is a place where no one seems capable of living with purpose, achieving social mobility, or bettering their personal character.

It’s actually terrifying to be aging here, because everywhere I look I see reminders of all the different people I don’t want to become.  Yet in absence of evidence of any alternatives, it seems increasingly likely that I will become just like some of them if this environment continues to hold me so close to its rust-pockmarked bosom.

I used to have more fire in me.  Twenty-seven shouldn’t be associated with this kind of tiredness.  Often, I feel numb enough to tolerate the intolerable.  Honestly, there was a time, not long ago and yet too long ago, when I came close to vowing to kill myself if I wasn’t out of this town by a certain date.  The trouble now is that I can’t for the life of me remember when that date would have been.  Was it the start of this summer?  Next January?  The previous January?  My twenty-seventh birthday?  I can’t remember.  It doesn’t seem to matter anymore.  I am exceptionally well-distracted with the ceaseless struggle to find each day’s work and survive the week, and I am exceptionally well-deluded into thinking that therein, somewhere, lies a future change of life.

But then when I venture out of my home office, I see the change of life that comes over time, in absence of a transformative moment, a firm knock of opportunity, a breaking point.  Who shall I become, among these?  Perhaps by the time I’m in my mid-forties, my home business will be truly legitimate, and I can be like the shop owner around the corner, working irregular, overly demanding hours for a success so modest that in the fullness of middle age he is still living without health insurance.

Or maybe I need not look so far into the future, and instead I can aspire to be like my close peer and lifelong resident of the Blackrock neighborhood, who is consistently and profoundly more successful than I, which means at present that he’s been tasked with managing and fundamentally reorganizing a nearby gas station for eight dollars an hour.  Perhaps I can aspire to that without waiting to decay with age, though I doubt it.  Given my past history, it seems that even to be willfully exploited is too much for me to ask of prospective employers.

If, however, I could by some chance succeed in letting myself be exploited, then I can look forward to being like my brother, seven years my senior, slaving at management of a kitchen in exchange for a salary far short of the absolute minimum threshold for middle class, and too beat-down and molded into complacency to seriously seek a better way of life, while middle age looms.  Then, if I succeed in emulating that image, I can look forward to also becoming like my parents, both lonely people whose lives have apparently lacked any efforts at positive change for years on end.

I have to get away from these people, and I’m losing the capability to even imagine how that would happen, which is in turn inching me closer to the terrible outcome I want to flee from.  But it’s not family that I most fear becoming.  It’s all the little bearers of shattered lives or simple minds that shuffle about me day after day.  The few who possess the means for a decent life still seem either desperately adrift or else aloof and arrogant behind the bitterly ideological walls they’ve had to build for themselves to keep the tragic reality of this rust belt hellscape out of their emptily contented little lives.

The rest are a tragedy unto themselves.  Yesterday, I heard shouting outside my home and went out to make sure nobody was being hurt.  At the end of my street, a young woman was ranting and throwing things at who I presume to be her boyfriend.  I walked in that direction to make sure everything was all right, my phone in hand, ready to call the police.  For all I could tell, the woman was just throwing a tantrum, and the man was not returning the physicality, so I didn’t really know at what point to intervene.  In my uncertainty, I just ended up sitting nearby, next to a man who shook his head at the fighting couple and started talking to me as soon as I arrived.

If I spend time outside, I can generally count on finding half a dozen people in the course of an afternoon whose social status is wildly indeterminate.  I still remember my first encounter with the deplorable Eric Starchild, who wanders the streets of Buffalo selling single plastic beads on black strings for exorbitant prices.  When first he spoke to me, I thought for sure he was homeless and that that was his way of getting by and making the most of the hand he had been dealt.  Years later, I found out that he comes from an upper-middle class background, and after putting up with his attempts to advise me on how I could easily fix my life and have a career, I now have to restrain myself from punching him in the back of the head every time I see him walking somewhere ahead of me.

The fellow I encountered yesterday was of a similar sort.  He specifically described himself as coming from a wealthy family, but also as not being rich anymore.  That still left some doubt in my mind, as he sat there with his grocery cart and half-empty forty ounce bottle of beer, as to whether he was homeless, poor like me, or just another pretender who still holds the financial means to do something with his life, but chooses not to.

I had a pleasant enough conversation with the fellow, though I could tell from the start that he was just slightly crazy.  It took about thirty seconds of conversation for him to reference mechanisms of government control, and another minute to get to his pronouncements about chemtrails.  He was perfectly coherent by and large, even relatable, but he’d filled the gaps in his worldview with self-assured paranoia.  He quite reminded me of a fellow I met on a Greyhound bus once, who talked to me with great clarity about many things, but occasionally told stories about how the FBI had been sending agents to monitor him in the guise of such people as his ex-wife’s new boyfriend.  I quite like talking to these people.  It’s intriguing to see how a person creates a consistent mythology to explain the tragedies of their lives, and how in the best of cases, this can seemingly avoid seriously impairing the person’s perception of reality in other areas.

I am especially interested to talk to these people now, because I spent a solid couple of years cresting toward the edge of insanity, and communicating with people who have inched past the barrier is the only thing that suggests the possibility that I am not irreversibly headed towards the hideous outcomes that have been realized in so many of the people that surround me.  On the other hand, most of the people I’ve spoken to who have embraced such paranoia have been roughly twice my age.  How many times did they near the edge and draw back while they were still young?  How much longer do my inexplicable and inexpressibly crushing failures have to persist before I manufacture conspiracy theories to make sense of them?

I may have already been suffering under the weight of those failures for six years, but conceivably there could be decades still to come.  Nothing, after all, exists to give me confidence that it will ever change, unless I can count the fact that I’m feeling pretty stable in my advanced age.  But as it happens, it was actually the instability that served to make me feel like I had it in me to fight an intolerable situation, to literally run away from this town with thirty dollars tucked into my shoe if the future here began to look bleak enough.

I fear the sort of person I will become if I remain as invisible as I am for much longer.  I fear it all the more because I no longer have the same confidence in my resistance, yet I still see every bit as much to resist, everywhere I look.  In five horrendous years in Buffalo, I don’t think I’ve met a single person I genuinely respect.  Those older than I are chilling images of the things that this town does to a person.  They are vessels for the display of various unique admixtures of hopelessness, paranoia, ignorance, unjustified arrogance, complacency, prejudice, and greed.

To date, the only person I have seen with any regularity whom I can say does not make me intensely sad is, oddly enough, a toothless old woman who sits smoking near the bus stop by the old Showplace Theater.  She has been as vividly damaged by her own life as all the rest of them, but somehow she is wonderfully pleasant to everyone who passes by, and it is a pleasantness unpolluted by the relentless ego that motivates so many other local people to reach out to one another.

This woman, alone among all the others, seems to have found a way to inhabit this place with a character of quiet dignity, and I applaud her for it.  But still it is not good enough for me.  If, God forbid, I reside here when I am near to her age, I would never want my own dignity to be quiet.  I want it to rage against the systematic theft of lives.  I pray that this silence in me now is just a passing phase, and that age is not taking the fire from my blood.

Friday, May 11, 2012

The Fascinating Seventy Year-Old Virgin


The internet loves the news of the weird.  Apparently, a lot of web browsers are clicking over to a brief story about a seventy year-old virgin.  This isn’t that interesting at first blush, especially in light of the story from a few months ago about a woman who’d remained a virgin for over 100 years.  In the latter case, though, I got the impression that the woman may have simply been asexual.  She expressed an overall disinterest in sex and suggested that her longevity could be explained by her not concerning herself with that pursuit.  This seventy year-old woman who is in the news now, on the other hand, claims to have retained her virginity as a matter of moral commitment, as she doesn’t believe in sex before marriage, but never found a husband.

This story wouldn’t be that interesting except for the fact that by being limited to a one paragraph synopsis it opens up my mind to all sorts of speculation about the surrounding circumstances.  That speculation is made all the more intriguing by virtue of two unusual facts:  The woman, Pam Shaw, performed for many years as a cabaret singer, and she’s in the news now because she’s apparently hit her virginity’s breaking point at long last, being ready now to give it up to “a tall, dark, and handsome millionaire.”

This woman seems fascinating.  The image that I get is a tight bundle of lifelong contradictions.  I appreciate that because it’s something that I can relate to, even though there are aspects of it that I admire and aspects that I’m eager to criticize.  First the praise:  Good for her for maintaining her virginity amidst a career in which she was referred to as “The Sexational Pam,” in an industry in which loose attitudes about sex are presumably the recognized norm.

It’s a unique personality type that encourages a person to eschew particular experiences for herself at the same time that she flirts with the edges of those experiences and indulges an active curiosity about them.  As a deliberate virgin, and arguably an asexual, myself, I kind of want the life that she’s led.  I felt oddly comfortable when I had an opportunity to go to an art exhibit at an S&M parlor and when I followed a drunken friend into a pornography store.   So I applaud Ms. Shaw’s commitment to a strangely indulgent sort of chastity.

But here’s the thing that strikes me negatively about her story:  She spent, let’s say, fifty-five years maintaining a commitment to virginity on the basis of not believing in sex before marriage, and now she’s announced her readiness to “take the plunge” if the interested party has enough money?  That seems like freakishly inconsistent morality.  Doesn’t the decision to trade virginity for a cash-rich lifestyle sort of betray the very sentiment behind Ms. Shaw’s lifelong chastity?  I would presume that if she didn’t believe in sex before marriage, she felt that love was more important than physical pleasure.  Am I to conclude that now at seventy years old she’s determined that money is more important than both?

On the other hand, I can understand the impulse underlying her statement.  The longer you retain something that requires consistent sacrifice, the more valuable it becomes to you.  Thus, even if you have decided that enough is enough, it can take an awful lot of incentive to push you to an actual breaking point.  It may be that after years of working so close to sex, and now approaching the end of her life, Ms. Shaw has simply decided that she wants to experience something that she’s denied herself for so long.  She probably feels that it can no longer be on the terms that she’d set, so instead she’s changing the terms, compromising the rigid morality in order to cease compromising the physical indulgence.

The woman has evidently lived her life amidst contradictions.  What’s one more?

Wednesday, May 2, 2012

The Oxford Comma, Childhood Education, and Me


The coincidences that I encounter these days are not as profound as they once were.  Now it tends to be more along the lines of repeated references to a film I have yet to see, or some negative coincidence like some last-minute excuse of mine always coming up amongst friends.  A couple of recent, coincidental encounters have compelled me to make something out of a topic of grammatical concern.

I stumbled onto an online discussion recently about the Oxford comma and whether it is or is not grammatically correct, or required.  I later found that another online writer’s personal byline declared him to be “a fan of the Oxford comma,” and having already been given cause to reflect on it, I thought to myself, “Well, hell, me too!”

For those of you who are extremely casual grammarians or who pride yourself on a 1337 ability to avoid the conventions of written English, the Oxford comma is the comma that comes between the penultimate entry in a list and the word “and.”  Nouns, punctuation, and a conjunction make up a list, and there’s an Oxford comma in this sentence.  Some writers use it, some don’t.  Some style guides require it, some reject it.  Speaking quite generally, both its use and its non-use are acceptable.  It seems to me that many people, either because they haven’t thought about it or because they’re naturally committed to one or the other, don’t realize this.

That even goes for teachers of English.  The reason why I know about the controversy over the Oxford comma is that I remember it being a legitimate point of confusion in elementary school.  I’m fairly certain that when it first came up, the teacher of what I’m guessing was my third grade class told us quite explicitly that there was no comma between the second-to-last and last entries in a list.

I more clearly recall when it came up with a later English teacher, because she didn’t seem to know which was correct, but would not admit to that fact.  She was overseeing an assignment in which students had to add punctuation to an existing sentence, and when she gave the answer she listed the places where each of the commas belonged, paused, and added the Oxford comma to the mix.  Even among a group of nine year-olds, the class was bifurcated on that answer, so that I cheered to myself over my superior understanding, and my neighbor had to correct his paper.

At this point you may be asking what on Earth this has to do with breaking points.  Well, having thus had an opportunity to reflect on my personal relationship with the Oxford comma, I realize that the way I learned about it might represent something that’s essential to the development of an intelligent, independent child.

You see, regardless of what I’ve become, I was the picture of an upstanding, studious child who did with religious devotion what he was told to do by parents and teachers, and always followed the rules.  That contributed to a marvelously successful academic career, which paid off with a sense of pride for most of the time that I was in school but left me with nothing once I no longer had anyone to obey.

Now I seem to have such a contentious, anti-conformist mindset as to give me a rather hard edge, which acts as a social barrier.  Nevertheless, I remember well the child that always did his homework, developed an earnest rapport with authority figures, and never snuck out at night or dabbled with drugs or alcohol.  In many ways, I am still that child, even though I have a well-developed and eagerly maintained sense of self.  So I know that if I were to finally be injected into a corporate setting, or otherwise put low in a hierarchy that I’m wont to accept, I would still do what I am told to do at most every turn, and do it with sincere deference.

Knowing the kind of child that I was, I sometimes wonder just how I would fare in the Milgram experiments, which, in the early 1960s, demonstrated how easy it is for ordinary people to do monstrously unethical things when directed to by an authority figure.  My life has been unfortunately short on severe challenges to my own morality.  Mostly, there have just been instances where circumstances casually flirted with a scenario in which I might be called upon to either speak up or stand by as a witness to preventable wrongs.  And I’ve always been afraid of my apparent slowness and caution in responding to such situations.

In a lot of ways, I was quite unlike what one expects in a typical intelligent youth.  My aversion to drugs and alcohol, even to sex, has been lifelong, but psychological studies indicate that a curious willingness to experiment with such things is characteristic of a changeable, and thus intelligent, mind.  The saintly boy scout type might prove to be exceptionally good at reciting the rulebook, but that doesn’t demonstrate any real intellectual curiosity.  Rebellion is supposed to be a natural part of adolescent development, but I never experienced it.  My greatest act of rebellion came at twenty-one when I refused to apply to graduate school.

These sorts of contrasts make me wonder if I really have the firm, capable mind that I was always praised for, or if, instead, I am just a terrifically smooth-running machine.  All those subject areas that I was so good at in my primary and secondary schooling – did I really understand them, or did I just repeat what I was told at the same time that I repeated “don’t talk to strangers,” “don’t smoke,” “don’t skip class,” “don’t talk back”?

My worries about the authenticity of my own intelligence are modestly alleviated, however, by the knowledge that insecurity has been a characteristic of virtually everyone for whose intelligence I have had respect in the past.  Whenever I question my skill at or grasp of something, I take a little bit of comfort in remembering the Dunning-Kruger Effect – the tendency of skilled people to think that everyone else is as good as they while deficient people think everyone else is as bad.

Still, I’m not like the other mentally-capable people I know, and it leaves me with the worry that all along I’ve just been adeptly imitating them, saying the sorts of things they say, following the rules that are supposed to lead to where they are, and generally copying instead of thinking.  After all, the best and the worst of people are the ones who question authority.  The rest are just mediocre.

Of course, what I need to keep in mind is that an essential willingness to question authority doesn’t mean that it’s necessary to do so.  And yet it is necessary to have that willingness, because a constant follower is not one to form his own ideas.  That’s a problem when the ideas that you’re asked to follow are wrong, and it’s equally a problem when you have no firm idea to parrot.  Case in point, the Oxford comma.

I was probably eight years old when I learned how to separate items in written lists.  In retrospect, I take great pride in my reception of that lesson.  More to the point, I take pride in the fact that as a child I was not receptive to that lesson.  The absence of the Oxford comma in third grade English is the first memory that I can dredge up from my spotty personal history of an instance in which I actively, albeit silently, disagreed with a teacher.

I don’t know where I learned that skill so early in life, but I believe that it contributed in magnificent ways to the development of the person writing this today.  A year or so after that first lesson, I defied the prior teacher’s instruction and inserted a comma next to the conjunction, because that’s what made sense to me.  I felt then as I do now:  There’s no sense in excluding the comma from the last item in a list, because the conjunction doesn’t fully separate one noun from another.  There are situations in which you might pair two words as a compound noun linked by a conjunction, such as “salt and pepper,” or “soup or salad.”  If such a compound comes at the end of a list and it’s accepted that the writer omits the Oxford comma, the two nouns will be inappropriately divided from each other.

In a far less analytical way, I was aware of this at eight years old, and even though I wasn’t intellectually prepared to defend my opinion to an old woman in a position of authority, I at least had the fortitude to let the instruction pass through my ears unheeded.  When my later teacher hesitated over the question, I was vindicated, because I knew then that it was a legitimate area of uncertainty, and I was confident that I had resolved it correctly.

Children need the skill to resolve linguistic and explanatory puzzles on their own, if they are to become intelligent beings.  Knowing what I do about myself, I’m almost certain that if I hadn’t displayed that skill at an early age, I would in fact be the intellectual automaton that I sometimes fear I could be.  In light of that, early childhood education cannot be simply a matter of transmitting information; it must encourage children to resolve questions that the teacher has left uncertain, and even to challenge the claims of authority.

In many circles, this is something that’s explicitly rejected.  We often tend to value pure obedience in our children, discouraging them from questioning until they’re old enough to do so.  That, however, is not education.  The creation of loyal citizens is not the same as the development of clever, critically thinking youths.  The patterns that we establish as children can follow us throughout our lives, and a pattern of accepting things at face value then can make it difficult to pick up the skill of questioning later on.  When it is not deliberately fostered, I don’t know where the impulse to reject false information comes from, but it is enormously valuable to developing minds, and I thank god that I picked it up somewhere.

And I thank god for the Oxford comma.