
Friday, October 12, 2012

Everyone Look at the Ignorant People!



I just happened upon a clip from Chris Matthews’ coverage of the supporter gatherings prior to the Vice Presidential debate.  It is not enormously significant, but it is a delicious bit of video, upon which I feel an irresistible urge to comment.  The roughly one-minute clip begins with Matthews interviewing a random Obama supporter.  Just as he asks her about her health care situation, an old woman interjects from off camera by shrieking the word “communist!” in a voice that would have made it notably fitting if she had followed up with, “burn him!”

Everyone in frame reacts to the shout, but the woman being interviewed shakes it off and takes a few seconds to explain that she and her husband had recently lost health insurance for the first time in their lives.  Chris Matthews lets her finish her answer, but the speed with which he departs when she reaches the end of her sentence suggests an almost Pavlovian response to the shrill voice at the edge of the crowd.  He lowers the microphone immediately and says, “Okay let’s go over to this lady,” whereupon he seeks out the person who yelled communist, in order to ask her what she meant by it.

What follows is a stunningly awkward exchange in which Matthews asks the woman exceptionally unchallenging questions, essentially just repetitions of “what do you mean?”  She repeatedly fails to answer them, instead chiding the professional journalist and commentator to “study it out, just study it out,” derisively referring to him as “buddy,” and asserting that she knows what she means.  It would be painful to watch if I had any inkling that the woman had sufficient self-awareness to be embarrassed by it.  It would be hilarious if it weren’t such a tragic commentary on the state of political discourse.  Watch it if you like:



Obviously, our culture and systems of information need to be reformed enough to precipitate a breaking point whereby nobody can remain so self-satisfied in their own ignorance as this woman showed herself to be.  Her willingness to gather at a political rally and shout her views on national television suggests that she is firmly committed to them, but even in the space of a minute, her complete inability to explain or defend those views paints the image of someone who has absolutely no idea what she’s talking about, doesn’t care that she’s not informed, and doesn’t think she has to be.

I watch this woman wag her head at Chris Matthews and pause at length before shooting back, “You don’t know?” when asked what she means by “communist,” and I see someone who believes that in the face of any challenge to her worldview, a self-righteous attitude eliminates the need for facts and rationality, every time.  It is indicative of a sociopathic mindset that treats confidence and strength as trumping all else, and that mindset seems to be breeding extensively in the modern population.  That in turn is indicative of a serious cultural failure in America, though unfortunately one that is nearly impossible to overturn.

Far less difficult to attain is the personal breaking point that this clip seems to point to, though I must admit that I don’t know which side of it I ought to come down on.  In watching the clip, the thought almost immediately crossed my mind that maybe this woman was some sort of amateur satirist aiming to portray the Republican opposition as deluded and irrational, and even that maybe she had been planted there by some group on the left.  I entertain those thoughts because, as with most conspiracy theories, they are simply easier to believe than the frightful reality, which in this case would be that America is long on individuals who form firm, aggressive opinions on the basis of extracts of ether and bullshit.

I know that my skepticism about public ignorance is unsustainable.  Indeed, I know that it can be harmful, because it’s a sort of ignorance in itself.  Fundamental to my personal philosophy is the idea that you can’t hope to effectively solve a problem if you deliberately avoid recognizing the reality and extent of that problem.  Public ignorance is the problem at the root of all other problems, because it is that which allows people to avoid reality, and thus deny solutions.

The problem here is that I don’t know whether I should be pushing myself towards the breaking point of taking public ignorance for granted, or if instead I should find a way to keep from assuming that conspiracies are afoot while still giving individuals the benefit of the doubt as regards their level of information.  In other words, one might say that witnessing ignorance of the proportions on display in this clip challenges me to avoid two negative breaking points, which threaten to make me overly cynical about either human stupidity or political manipulation.

I’d venture to guess that not a lot of people have carefully reasoned assessments of their fellow men, so this is a personal breaking point that others may have to contend with as well, but being personal, it’s of secondary importance.  What this video clip has brought to mind that could be addressed on a large scale right now is a question for the media about how to handle firm opinions voiced by the public.

I honestly can’t decide whether to praise or criticize Chris Matthews’ response to the political heckler.  Part of me wants to criticize just because I used to get a lot of enjoyment out of focusing my ire for the news media against Matthews, who, despite being a bright guy, was terrible at his job back when I considered MSNBC a news organization.  Now that his job is “partisan” rather than “journalist,” he doesn’t seem so bad.  Okay, it also helps that I don’t have a TV.  But in any event, even if Matthews remains professionally an idiot, the woman he had his brief exchange with is an idiot in much larger terms, and to an unquantifiably greater extent.

The relevant question, then, is, “Did Matthews have good enough reason to focus the attentions of the microphone and camera on this woman’s dimwittedly vociferous views?”  On the one hand, by giving her a voice once she’d asked for it, and contributing no commentary of his own, Matthews allowed the woman to provide her own refutation of her talking points.  The exchange conveyed the impression that extremist views are based on no information, which of course they often are.  That’s a good fact to put on display when the opportunity arises.

On the other hand, we have to remember the shamelessness with which the old woman held her ideas in absence of evidence or personal understanding.  Such shamelessness probably isn’t much affected by having a mirror held up to its own ignorance, and that fact threatens to let this incident stand as encouragement for other people like her.  As I said, the greatest breaking point involved here is also all but unattainable: the creation of a culture that prevents the embrace of ignorance.  For the foreseeable future, lack of information and presence of strong opinions will continue to go hand-in-hand among a sizable portion of the American public.  It will take generations of concerted effort to change that fact.  But that doesn’t mean that opinionated idiots will always be activists.

I estimate that much less comprehensive cultural changes could prevent people who hold uninformed opinions from being so vocal and so public with those opinions.  And one thing that probably doesn’t help is giving voice to those opinions, in all their self-righteous vacuity, on national television.  Viewers at home whose perspective on American politics doesn’t go much farther than “he’s a communist!” won’t be shamed or enlightened by their impromptu spokesperson’s self-defeat, just as she wasn’t shamed or enlightened by it.  To the contrary, the presence on the airwaves of uninformed declarations and accusations provides more fodder for lazy people to parrot as they make the leap from uninformed citizen to armchair activist.

The opinions that are screeched from the sidelines are the ones that most need to be debunked once they’re present, but they’re also the ones that most need to be disallowed from taking the field.  Overall political discourse is cheapened not only by their ignorance but also by their lack of decorum.  As regards ethics, I think I am so committed a deontologist that I have internalized Kant’s categorical imperative.  When I see things like this video clip and start wondering what ought to have been done in the situation, I find myself universalizing the act I witnessed and looking for its effect on the moral system.

In this case, what would the effect be if journalists always turned their attention to the loudest and most abrasive commenter on the scene, as Matthews seems to have done?  He even turned his attention away from the woman who was contributing relevant anecdotes to the public understanding, in order to give the shrill, ancient cold warrior a chance to explain her unexplainable views.  I fear that the current state of journalism is not far from embracing the loudest participant in any debate.  Universalized, that practice would turn all of American politics into a shouting match, and that is seemingly not far from the situation that we already face.

In light of that threat of a still more corrupted political and journalistic landscape, I’m tempted to say that although the woman’s response was rather satisfying, the better thing to do in that situation and all similar situations is to keep the person who’s shouting epithets off of our television screens.  But I’d be interested to know what readers think of the effects of either encouraging or discouraging uninformed speech.

Tuesday, September 4, 2012

At Cultural Attractions: Parents Don't Teach, Children Don't Learn

The Buffalo Zoo celebrated the traditional last weekend of summer by offering a ninety percent discount on admission on Labor Day. Since one dollar is something I can just about afford on a good week, I took a holiday-morning bike ride around Delaware Park and then queued up with the mass of people, mostly families with small children, who had just as readily sprung at the opportunity for a cheap cultural activity.

Considering the lines at the gate, I was surprised that the scene inside was not as claustrophobic as it could have been. It took a little jostling or waiting in the wings to get a proper angle, but everyone seemed to get their opportunity to look at the cute, or fearsome, or comic animals. I freely admit that I was mostly there just to take another look at some of my favorite creatures: to watch the polar bear swim in its artificial pond, far from the threatened environment of its natural-born fellows; to grin down on the docile capybaras lounging in the rainforest exhibit; to rediscover my respect for the vulture, first kindled when I wrote a report on the species in elementary school; to look for big cats pacing as in Rilke's description of the panther.

But even though this excursion wasn't exactly intended as a fact-finding field trip, I never go to a museum or zoo or aquarium without trying to learn something about the stuff I'm looking at. Not a heck of a lot changes at the Buffalo Zoo from year to year, and I think I had been there about a year ago, so it's not as if I could have expected to discover an animal of whose existence I was altogether unaware. But there's only so much I can commit to memory, so naturally I find myself rediscovering things on subsequent visits to the same places of learning. I always seem to forget, for instance, that the Rocky Mountain bighorn sheep are capable of running at up to fifty miles per hour. The upside of my disappointment at not retaining encyclopedic recollections – a failure that seems to become ever worse as I age – is that I sometimes get to re-experience the joy of learning something interesting all over again.

Even if I don't read all of the wildlife facts, of which there aren't even that many at the Buffalo Zoo, I do at the very least try to get the names of the animals right. This is more than I can say of the vast majority of the other patrons that I encountered yesterday. It having been a year since my last visit, I found myself actively trying to identify each species, endeavoring to commit to memory the ones that escaped me this time around. This is natural to me, and I thought it was part of the essential purpose of going to the zoo. I always took it to be a place where you went not merely to look at animals as in a menagerie, but to find out something about the wider world by discovering what they are and where they come from. I especially thought that that was why parents took their children to the zoo. I'd always assumed that it was meant as a supplement to a child's primary education, a way to instill curiosity and gauge the direction of nascent scholarship. Apparently I was quite wrong about this as well.

Most any time that I go to places like zoos or museums and find myself crowded by children and their adult chaperones, I am downright shocked by the lack of interest that parents have in conveying any information whatsoever to their charges, or even in encouraging those children to learn anything on their own. I fear that my disdain paints me as a killjoy and that the average reader will see me as attaching far too much significance to the conduct of people who are on a simple, light-hearted family outing. But that's just the trouble. I worry that people attach entirely too little significance to such everyday opportunities to influence the character, values, and perspective of impressionable children.

As much as Americans today recognize and lament the widespread failure of education and the failure of modern children to live up to appropriate standards, I think commentators and individual parents are too much inclined to see that failure as institutional, and too little inclined to consider it as social and cultural. If the behavior of parents at zoos and museums is indicative of their broader attitudes, it suggests that people have widely abandoned personal responsibility for the education of their own children, handing that responsibility off to schools as if the process of raising an intellectually astute and ambitious child could be consolidated into a specific set of hours in specific locales.

If that is indeed the view – if the need for education is recognized, but only recognized as being needed somewhere outside the home – then I can only conclude that people don't really value education at all. That is, they don't value education as it ought to be valued, for its own sake, as both a public and a personal good. You can't expect children to learn well and perform at a high level in school if the culture that they're coming up in is one that portrays education as a sort of obligation and something that brings good things to the learner, but is not good enough in its own right to be worth pursuing in absence of the social obligations of homework and exams.

What else can I conclude from regularly observing that perfectly middle-class parents, far from exhibiting much intellectual curiosity of their own, don't even respond to the intellectual curiosity of their own children? But perhaps that's a little unfair. At the zoo yesterday I did find one or two adults expressing curiosity to the extent that they pressed their faces to the glass and perplexedly asked of no one in particular, “What is it?” They just didn't express a great deal of interest in actually doing anything to satisfy their curiosity. They couldn't be bothered to walk back two feet in order to read the damn nameplate.

This is entirely their own affair when the adults are on their own and solely responsible for their own edification or ignorance. But it gets under my skin when their lack of care for finding answers threatens to be transmitted to a child who is still blessed with a wide-eyed eagerness to comprehend the world around him, whatever aspects of it should set themselves before him.

Just a few exhibits down from where I heard one unresolved ejaculation of “What is it?” I found myself looking at another glass enclosure, which housed three wallabies crouching at the back of their habitat, when a family walked around me to look at the same. It consisted of a couple with a daughter just barely of speaking age and a son perhaps six years old. The parents looked, glassy-eyed, into the scene while the boy excitedly called out “kangaroos!” I had started moving away from the exhibit, but noticing the boy being met with silence, I said simply “wallabies,” partly in hopes that his parents would hear me and realize, if they did not realize it on their own, that their son had made a reasonable but slightly mistaken assumption about what they were looking at.

However, I was essentially met with silence, too, except in that the boy, perhaps hearing me or perhaps just seeking acknowledgment from his parents, repeated “kangaroos.” Noticing that they weren't going to say anything and that their eyes had apparently still not passed over the signs that clearly stated the name of the species, I repeated, with the boy more specifically in mind, “wallabies.” Now looking squarely at me, and inquisitively, the boy again said “kangaroos.” It could not have been more obvious that the child was interested in being corrected. He wanted to learn, as most children do when simply presented with the opportunity. This child was young, but most likely old enough to sound out the word “wall – a – bye” if he knew where to look, and if he was made to realize that he didn't know the answer without looking. But to do that, he would need an example to follow, a pair of parents who had the tools to find out answers for themselves, and cared to give their children the same.

With the child looking to me instead of his parents for that meager bit of instruction, I addressed him directly, explaining, “No, these are wallabies. Kangaroos are big; these are smaller.” And at that he turned to his parents and his younger sibling to repeat it to them: “These aren't kangaroos, the man says.” By then I was walking away, and I can only hope that their son's claim finally prompted them to look at the sign and sound out “wall – a – bees.” It was up to them to take an interest on their own, but it seemed to me that the child, being a child, not only wanted to know about these things in the zoo, but wanted others to know about them too.

I experienced the same thing elsewhere. In the crowded rainforest exhibit, I, being a nerd, spoke straight to the capybaras, telling them that I just wanted them to know that they are the largest rodents on Earth, and that that's awesome and they should be proud. A young girl just beside me asked, seemingly of no one in particular, "What are those called?" It could be that she heard me demonstrating some knowledge of them and figured that I had the answer, or it could be that she, like so many young children, thought her parents would have all the answers she sought.

She had not spoken straight to me, and that being the case, I would think that a scientifically interested parent, one familiar with zoos, would say something like, “I don't know, let me look at this information card over here so we can find out.” The parents did not move, of course, so I turned to the child and told her, “Those are called capybaras.” Naturally, she then looked back to her parents and sought to inform them of what they did not inform themselves: “They're called capee-bears.” The parents did not repeat the information; they did not move to confirm it or commit it to memory; they did not give her any indication that she should feel proud of having learned something, that she should be thankful for the knowledge, or that she should seek to learn other things as well.

The desire to learn is so natural and so passionate among children. How poorly we must regard it as a society that students evidently end up so thoroughly dissuaded from eager learning long before reaching the lower threshold of adulthood. What standards can we possibly expect students to meet if we handicap them in all the faculties that might prompt them to aim above the mark? If this culture persists, the most likely solution is simply to expect less of students, as has already been the defining feature of decades of devolution in higher education.

In the future of this culture, we may as well just rename familiar animals to match the absent understandings of parents and their children. Having been to a couple of zoos and aquariums in recent years I've found that as far as doting children and intellectually incurious parents are concerned, every lemur is called King Julian and every clownfish is Nemo. This really aggravates me. My best friend is terrifically fond of the Niagara Aquarium, so I have gone there with her on several occasions. Upon every visit, without fail, one can hear at least half a dozen parents exclaiming, “All right, let's find Nemo,” or, “There's Nemo.” I think I've heard the word “clownfish” used by a parent to a child exactly once.

I have no doubt that some of these parents are just lazy and find “Nemo” easy to remember, but I warrant that a number of them may have good intentions. They're probably trying to use pop culture as a way to facilitate their children's interest in the natural world. But there's more than one reason why this is misguided. For one thing, having been to the aquarium several times, it's clear that children don't need some secondary point of reference in order to take an interest in the natural world, because the natural world is terrifically fascinating. And that's especially obvious when you're a child.

So using an animated film as a way of connecting with an aquatic exhibit is extraneous, but far worse than that, it obfuscates children's understanding of what they're actually looking at. It disregards the separation between fantasy and reality, it suppresses knowledge of the actual species name, and it encourages children to understand the creature through an individual depiction and not through objective facts. And then on top of all of this, for many families the fixation on something that is recognizable from fiction overrides the significance of everything else that's on display. People walk in the door and say, “Find Nemo!” and they breeze through ninety percent of the aquarium to get to something that won't teach a child very much that he doesn't already know. If they didn't immediately put that idea in his head, they might be astonished by how much he doesn't care about the clownfish once he's seen the solitary-social penguins, the balloonfish with their glittering eyes, the sharks skulking past viewing windows, the challengingly camouflaged rockfish, and so on and so on.

When parents almost thoughtlessly constrain the purpose of visits to zoos and aquariums and museums, they probably think, more often than not, that they are doing it for the benefit of their children, that they are moving to retain a young attention span and provide its owner a quick shot of enrichment while they can. In fact, I think such parents and caregivers should consider that they might have it all backwards and that the feelings of stress and impatience are all their own, and merely projected onto their children. They should concern themselves less with what their children are looking to get out of the experience, and more with what they themselves are after. If the answer isn't “knowledge, and lots of it,” they can probably expect much more of their children's interest in the moment. But they likely won't be able to go on expecting it as those children age in the presence of a society that doesn't care particularly much for learning.

Wednesday, August 29, 2012

Employer Culture

I recently applied for a job in Wyoming. It was an entry-level reporting position in a small town, and it was advertised via an unusual posting that seemed to encourage a unique cover letter from me. I delivered that, received a response that may or may not have been a form letter, and, on its request, replied with a confirmation of my sincere interest in the position.

The original ad put more emphasis on the setting of the job than on the job itself, and the response really drove that home, emphasizing that the remote location was “not a romantic getaway by any means,” which “might not suit everyone.” My cover letter clearly outlined how I had always hoped to live and work in a remote location after graduating from college in the big city, and that the job seemed perfect for me. In my confirmation of interest, I disputed the notion that it wasn't a romantic getaway, and made it clear that in any event it was a place where I could see myself residing happily, especially if I had a career to build upon there.

The editor sent a form letter to all still-interested applicants to the effect that she would have more time to go over the applications after a specific date. A week after that date she wrote to me directly to confirm that I was not to be interviewed, and in that brief message, she emphasized yet again the apparent insecurities of her entire organization regarding its setting, and explained that she had found someone who she thought would bring a lot to the paper while also enjoying the surroundings.

When I actually hear back from no-longer-prospective employers these days, I am no longer shy about pushing them to the limits of their patience in pursuit of explanations, and in this case I was really confused. I wrote to ask her if I had somehow given the impression that I wouldn't have been able to tolerate living in the sort of remote region that I had just used two sincere letters to explain that I specifically wanted to live in. She kindly pointed to a specific line in my second message. This was the comment that sunk my application:

Speaking more generally, I'm not so concerned with what the job or its surroundings can bring to me, as with what I can bring to them.

Am I crazy for being nonplussed by her reaction? That line came after two solid paragraphs of explaining why the job and its surroundings appealed to me, which followed upon an entire prior letter of the same, and yet all of that was apparently wiped from this editor's short-term memory by my decision to make the point that my values make me more interested in doing a perfect job than having a job I consider perfect.

I can't interpret this in any other way than that I was refused an interview for yet another job that I would have done fantastically well because I was insufficiently selfish. The briefly-prospective employer has given me the distinct impression that the job went to somebody whose application placed more emphasis on how much he wanted someone to give him that job, and less on how well he would perform its duties.

It's another example of the seemingly backwards hiring practices that have been dogging me for six goddamn years, and I took the opportunity to press this person on it, writing back:

I've gotten a certain impression many times over from people responsible for hiring. In your capacity as such a person, which goal would you rank ahead of the other, if you had to choose between them? 1) Finding someone who will do the best job. 2) Finding someone who is least likely to leave the job.

I give her a lot of credit for having been so communicative with me overall, but her response to this question was pathetic:

It depends. I try to find a good balance between the two.

Did I not make myself clear? I know she tries to find a good balance between the two. What I asked was which one was more important, and she simply dodged the question, avoiding any acknowledgment that there is a fragile value system at play in hiring practices. And though I can't wrest a confirmation of this from anyone in a position to give it, I consistently get the impression that human resource departments and hiring managers are interested in finding people just good enough for the open position that the company won't have to do anything to keep that employee on board, because they'll probably never get a better offer.

Other people that I've known have been crippled in their job searches by this employer culture, as well. Acquiring more qualifications often seems to harm job seekers more than it helps – such as teaching at the college level when one is looking for a career in early childhood education. It's evidently not worth taking the risk on hiring a good educator, a good writer, a good anything, if there's a good chance that their ambitions extend beyond the position one is looking to fill.

Obviously no one has admitted to this outright, but this most recent editor rather distinctly suggested it. Her rejection of my application was phrased so as to directly contradict the line that sunk my application, the one in which I said it was most important to me that I bring value to the organization that hires me. She wrote, “The job and its surroundings are to me much more important.”

Much more important than what? Than the person you hire being a good worker, a talented writer, a committed journalist, a person of decent character? All of that takes a backseat to believing that the job and its surroundings are exactly what the applicant wants and that nothing will tempt him away from whatever you're to offer him?

Anecdotal evidence doesn't count for much – you can always find some example that supports what you believe about the world – but at the same time that I and others I have known seem to absorb the damaging effects of these employer practices, I know of one person who appeared to be decidedly on the good side of them.

My ex-girlfriend never graduated high school, having gotten a GED instead. When I met her, she had been out of work for longer than I had. During the time that I knew her, she routinely quit jobs without notice. I later found she took the same approach to relationships – find something better, sever ties immediately. Despite the fact that her resume didn't suggest impressive qualifications and the fact that she probably didn't have great references from prior employers, she had little problem walking out of one job and into another.

Why on Earth was she capable of being hired immediately, whereas if I applied for the same jobs my resume would be rejected without so much as a phone interview? The only logical conclusion I can come to is the same observation about employer culture. I can easily imagine hiring managers looking at her past history and deciding, “this girl doesn't have a lot of prospects in front of her; we'd be offering something that she should be truly grateful for.” They may have been wrong on both points, as to her gratitude and her future outlook, but her mediocre resume gave them good reason to believe that hiring her wasn't a gamble.

With every job I've had, my managers have regarded me as having a work ethic that exceeds that of my coworkers. My performance and responsiveness to training have been roundly praised. The one time in my life that I got to work in an office, I received a year-end bonus that exceeded that of the person who had been promoted out of my position, even though I had only been there for six months. Despite all of this, actually finding a job is damn near impossible for me. I don't have a bit of doubt that I would perform the responsibilities of any job that I applied for with more competence and conviction than just about anyone competing with me for it. But I'm nearly as confident that that's not primarily what employers are looking for.

Of course, it could be that I'm taking too positive a view of myself. It could be that I'm just a terrible applicant. But I'm not about to assume that explanation in absence of evidence for it, and I'm certainly not getting any from the sorts of employers from whom I'm seeking jobs.

Prior to applying for this job in Wyoming, I was rejected without an interview for another one that I was even better qualified for, and which was also out of my area. When I asked why, the editor did see fit to get back to me, but her response was utterly meaningless on the point of qualifications. She said only that the person she hired "had what she needed." But she also pointed out that he had grown up in the area of the job, so I rephrased my question and asked whether, if I'd had the same qualifications I do now but had grown up in that region, I would at least have been interviewed.

Her response still makes me angry, and I expect that it will for as long as I struggle to have a legitimate career before the end of my twenties. She wrote back with one line: “Ed, I'm sorry. I'm not going to break it down.”

I had asked a straightforward yes-or-no question. I was looking for some indication, even if perfectly vague, as to whether my inability to secure a simple interview was attributable to being underqualified, overqualified, or simply having qualifications different from those that match the sorts of jobs I apply for. I didn't ask her to answer to any of that, though. All she had to do was say “yes,” “no,” or even “maybe.” To do so would have taken less effort than it took to type what she did.

To date, I can't conceive of any reason why she would respond that way, other than to be deliberately rude. This is my entire life we're talking about, and all that a person like her needs to do to give me a little more insight into why it remains so far off the rails is to say either “yes” or “no,” and she couldn't even do that.

I guess in light of that I should feel very pleased with the Wyoming editor for putting forth the effort to dodge my question in a way that at least seemed like an answer. Maybe that counts as progress.

Wednesday, August 8, 2012

Towards the Shrinking of Education


Last week’s issue of the New Yorker included an article by Andrew Marantz in “The Talk of the Town” that I found unusually inspirational.  That article also included reference to a fact that I think is deplorably neglected and under-explored: “… the Chronicle of Higher Education recently reported that in the past few years ‘the percentage of graduate-degree holders who receive food stamps or some other aid more than doubled.’”  People who are relatively familiar with my views on institutional education will recognize this as fodder for my ire over the socially endemic assumptions about the economic value of college education.

(If you want to get acquainted with those views, please read this, and this, and this, and this.)

Marantz went on to connect this situation to what he says has been called the crisis in the academy, defined by the very situation that I have been watching develop for years, in which the academic labor market is so glutted with highly educated people that terrific scholars are sometimes shouldered out of any sort of employment.  Actually, Marantz – I think just by way of a slightly clumsy transition – identifies the two issues with each other, as if a need for public assistance and the absence of a high-profile academic post were equivalent.  There is a middle ground being needlessly excluded there.

Still, both issues desperately need to be addressed in their own right, and Marantz highlights two individuals who have taken steps to combat the lesser crisis among would-be academics.  Ajay Singh Chaudhary and Abby Kluchin recognized a demand for education among people who could not afford either the time or the money to take the relevant courses at universities, and they responded by teaching their disciplines in cafés over the course of several weeks, at a cost of a few hundred dollars.

Marantz calls their business venture, the Brooklyn Institute for Social Research, “a locavore pedagogy shop,” and I think that’s as good a term as any for what I expect is part of a trend in education which will increasingly challenge the large, money-driven institutions that so many students are finding deliver little in the way of outcomes aside from a crushing debt load.

I can still recall how excited I was years ago, when my disdain for institutional education was still in its childhood – not its infancy, mind you; that disdain actually predates my NYU enrollment – when I heard a story on the news about private genetic engineering labs that people were running in their basements.  After my graduation, I began to advocate with particular verve for the outright rejection of the formal institutions.  I wanted, and still want, people who legitimately care about education to show that commitment in their private lives by educating themselves and one another, and by exploring in private settings those new ideas which might be suppressed in the academy in favor of the status quo.

At the time that seemed like an easy thing to accomplish with the social sciences and humanities, but the idea of moving physical sciences out of the institution and into more intimate settings seemed quite challenging.  Seeing evidence that not only were people up to the challenge but that they were actually doing it thrilled me and gave me great hope for the future of smaller scale scholarly structures.

It’s been a long time, but Marantz’s article finally gives me hope that the trend is continuing, and that it’s embracing not only private experimentation and scholarship, but small-scale education.  With formal tertiary education demanding more and more financial investment from students and delivering smaller and smaller financial rewards, as well as questionable educational outcomes, I expect people to gravitate in growing numbers towards alternative forms of both teaching and learning.

There are others in addition to the Brooklyn Institute, of course.  The internet provides curious individuals with many opportunities to absorb lectures for free and in their own time through uploads of actual college courses, video channels designed for broad-based education, TED Talks, and so on.  At least one company that I know of sells entire college courses on DVD for students to acquire at a fraction of the cost of tuition.

I fully expect more competitors to join in this trend, and so I expect that education in the future will look much different from how it looks under the formal structures of today.  Unless the costs or the benefits of colleges and universities change dramatically, the schooling of the future will in large part be much more local and much more collaborative.  The alternatives that provide that character have about as much knowledge to offer as the status quo, given the volume of unemployed scholars.  The only thing that they decidedly lack is accreditation.  But if degrees from accredited schools continue to deliver such dubious prospects for employment and financial security, what value will accreditation really have?

Wednesday, August 1, 2012

Humans Made Better With Technology


It is often interesting to watch the future unfold in real time.  In many areas of human development, the small changes accumulate casually, soundlessly, but they add up to one grand spectacle when one takes the time to observe them and realizes that he is looking at something which just a little while ago would have appeared to be the exclusive domain of science fiction.  The recognition of that progress can be a subtle personal breaking point.  It can be either a negative breaking point, jarring one with the realization that the world is morphing by a series of huge steps into something virtually unrecognizable, or a positive one, welcoming a sense of exhilaration as one comes to gain a clear perspective on the lovely places his world seems destined to go.

I’ll be the first to admit that I am prone to feel threatened by change, including and especially technological change.  I am terrified of the myriad ways in which we seem not only willing but eager to throw our humanity away in an endless quest for convenience, and easy security, and ephemeral connections to an increasingly impersonal, electronic world.  But several recent events have given me a sense of the other side of that coin, the exciting promises that come of our eager, whole-body embrace of new technology.

As much as technology swaddles us with petty conveniences and frivolous distractions, the multiplication of those things goes hand in hand with the growth of technologies that demonstrate potential to really transform not just human experience, but human beings themselves.  While the trends have certainly been building for some time, with the rapid development of prosthetics and of personal electronic devices, and the social acceptance of a constant technological presence in individual lives, it rather seems to me that in the blink of an eye we have arrived on the verge of the widespread technological enhancement of human beings.

That trend is realized in ways that may lie anywhere on the spectrum from subtle to unmistakable.  On the side nearer to familiarity, there are the 2008 Olympics.  While the nation and the world were busy watching Michael Phelps make history by winning eight gold medals at the Beijing games, they may have missed the fact that it wasn’t just Phelps but also several of his competitors who were systematically shattering former world records in each event.

This wasn’t just a result of that year’s competitors having been a particularly exceptional crop of swimmers.  Advancements in swimwear technology, led by Speedo, effectively made times before and after 2008 incomparable by quite literally reshaping the actual competitors into something more hydrodynamic, squeezed tight in all the right places to make them glide through the water with reduced drag.

It might be hard to conceptualize mere garments as high technology, but however you look at it, the gear that modern industries have produced for their athletes has served to dramatically increase performance and raise the bar for “personal best.”  Technology doesn’t just aid natural abilities; it enhances them.  This is true in other events, as well.  Running shoes have steadily improved the ratio of strength to weight, with Adidas having developed a shoe that redirects power into the turn for long-distance track and field competitors.  In that same category, javelin design has both improved results and decreased the risk of injury, introducing innovative materials that limit the wobble of the shaft upon release without transferring that force into the thrower’s shoulder.  In every one of these instances, if the competitor is capable of performing at a higher level in a high-tech outfit than he could naked, or if different individuals can perform differently based on the design of the object they’re holding, then we’re effectively enhancing the natural capabilities of a human being, even without drugs.  In a way, we’ve been doing this for decades, but it has gotten far more dramatic very quickly, especially in light of the outcomes of the 2008 Olympic Games.

The current Olympic Games showcase something rather more interesting, albeit something that requires a little more speculation to see how it supports my thesis.  South Africa’s Oscar Pistorius has qualified and been permitted to run in the men’s 4 x 400 meter relay.  Pistorius is a double-amputee who will be running on two prosthetic legs against able-bodied opponents, and he purportedly has a real chance of taking a medal.

This probably comes across to many observers as a nice human interest story, but to my mind it is a terrific portent of things to come.  The significance of the story is arguably more social and technological than it is personal.  The Olympics are the ultimate testing ground for the quality of prosthetics.  If false legs are now capable of holding their own in direct competition with the real limbs of athletes in their prime, there can be little doubt that we’ve designed technology capable of replicating the fullest capabilities of the human body.  If we’ve managed that so early in the twenty-first century, how long can it really be before we have prosthetics that actually exceed human abilities?

Unless Pistorius’ prosthetics fail catastrophically, there is simply no way that he will stand alone for long as an example of a formerly disabled world-class athlete.  Again, with this development, technology has plainly shown itself to be capable of dramatically enhancing the abilities of unequipped human beings.  Granted, in the present case, it may only be serving to bring a disabled man’s abilities back up to the baseline, but that in itself is truly remarkable, and it makes it easy to imagine that the technology of the near future could amplify the performance of already able-bodied individuals in similar measure.

All right, let’s not mince words; there’s no more sane way to say this.  I’m talking about cyborgs.  We’re close to having cyborgs among us.  Depending on how you define the term, we may already have them.  I feel downright silly typing that, as the concept still seems far-fetched to me, but I have to reconcile that with the fact that I read a week or two ago about the human cyborg Steve Mann, who has apparently been experimenting with wearable technology since the early 1980s.

Mann made headlines in mid-July after he was assaulted by employees of a Paris McDonald’s for wearing his EyeTap Digital Glass camera, which is permanently attached to his head and cannot be removed without special tools, leading the website io9 to brand the attack as the “world’s first cybernetic hate crime.”  Overstated or not, that puts the immediacy of such seemingly futuristic technology into sharp focus.  Technological enhancement of human beings is a definite reality, and not just as a one-off experiment in a distant government lab.  There is at least one individual who is living with such enhancements on a daily basis, blended in with mainstream society.

Taking all of these indicators together, I can’t help but wonder what the future holds and when it will reveal itself to us denizens of the present.  Perhaps it will still be a generation or more.  Then again, Steve Mann’s developments went on in society’s background for thirty years, and when I became aware of them I felt they had snuck up on me.  And while I knew how well prosthetic technology was developing, I never anticipated seeing an amputee run in the Olympics.  That, too, snuck up on me.  The pace of change is stunning.  If you’re not paying close attention to the patterns, it’s easy to underestimate what the future holds and how soon it arrives.

Thursday, May 10, 2012

Endorsing Tribalism and Gay Rights


I submitted a brief editorial to AND Magazine regarding the liberal reaction to President Obama’s endorsement of marriage equality.  Hopefully it will go up tomorrow.  Having thought about the topic a little more, I feel I would like to use this space to post something of a supplement to my previous comments.  In my AND piece, I pointed out that there was a tumblr blog launched almost immediately after Obama’s television interview, which consists entirely of animated gifs emphasizing celebration of the newfound vocal support for gay marriage.

My first criticism of this sort of reaction is that it makes a celebration out of something that doesn’t really warrant it.  It shouldn’t have taken this long to get President Obama to make a basic statement of support for the gay community, and even now that he has, a statement is not what they need; they need legislative and judicial action, which the President can push for and support.

But apart from the fact that their singing and dancing is an overzealous response by some liberals to a very modest change, what may actually be more significant is that it demonstrates a hideous tendency in private citizens’ engagement with the political process.  The people making the gifs for tumblr and otherwise celebrating yesterday’s announcement must be aware of the fact that nothing has substantially changed.  The celebration, then, isn’t about progress; it’s about popularity.  The sad fact is that in the modern political landscape, we are so caught up in the excitement of the process that we consider high-profile endorsements to be tantamount to actual political victories.

The most damnable feature of our typical approach to social issues and governmental procedure is the impulse towards tribalism.  There are few better examples of such tribalism than widespread rejoicing over the affirmation that our ideas have a place among the powerful and the popular.  That is something much different from cheering over the affirmation that our ideas are correct.  But the more we indulge this impulse to gloat over demographics rather than substance, the less clear that distinction will be to us.

I hope that as gay activists continue to express this misplaced pride in who is coming over to their side, they will approach a breaking point whereby they realize that the fallacies of appealing to popularity and authority only serve to make them more like their irrational political opponents.  Hell, anti-gay activists largely believe that they have Jehovah and most of human civilization on their side.  Even if that were true, it wouldn’t make them any more correct, and it wouldn’t prevent progress towards equality.  That kind of certitude provides nothing other than a sense of self-congratulation, which has no place in politics if politics is to be a rational, productive endeavor.

Of course, it is thoroughly at home amidst the sort of politics that we actually do have in this country.

Wednesday, March 7, 2012

Watch More! Do Less!


When last I watched something on Hulu, I was treated to an advertisement for Hulu Plus, which almost seemed like a thematic sequel to the commercial with Will Arnett that ran during the Super Bowl. I didn’t mention that one in my post reviewing the Super Bowl ads, but I remember now just how puzzled I was by the message that it presented. Arnett played a space alien who presented himself as a member of a vast conspiracy among people in entertainment and broadcasting, observing with malicious glee all of the people around him who remained glued to television programs on their mobile devices while they sat in cafes or just walked down the street.
Obviously, the ad was intended to be tongue-in-cheek, but simply watching it, it’s hard to see any acknowledgement of the joke. I would expect an absurdist portrayal of the would-be criticisms of a brand to show the salutary kernel of truth under the surface, but I don’t see that in the Will Arnett Super Bowl ad. Instead, it presents the criticisms of television viewing habits in an over-the-top way, but it also presents those actual habits in an over-the-top and markedly negative way. Arnett explains the evil plot that is modern television, and everyone around him sits in utter obliviousness, staring obsessively, vacantly into their screens. There is no point of contrast; there’s nothing that encourages viewers to both laugh at the absurdity and recognize the appeal of the product.
With the new commercial that I’ve seen run on Hulu itself, the company seems to have stripped the joke out of the equation altogether, leaving only a negative portrayal of their own product. I’m not sure what’s going on here. Either Hulu is engaged in some bizarre campaign of parodying itself, or my values are so hugely out of step with those of much of the culture that these advertisers see certain images as edifying while I see instead as disturbing.
That dichotomy is seen in the images of the new commercial alone, but it’s really driven home by Hulu Plus’ latest tagline: “Make the most of everything.” The ad shows a man on what I presume to be a StairMaster at a gym. I can only guess at the machine he’s using, because we don’t see it. The shot remains tight on his face and upper body, perhaps deliberately restricting our visual awareness of the fact that he is even doing anything. The man holds the handle of the machine with one hand while holding a mobile device in front of his face with the other. And as the camera lingers on the image of his distracted, staring expression, the voiceover says, “Make the most of your workout.”
How? I assume he means by doing the exact opposite of what this man is doing, seeing as he doesn’t appear to be aware of the fact that he’s working out at all. Now, I’m no fitness expert, but I’m pretty sure that if you don’t feel anything and you don’t have to concentrate on your exercise in any measure, you’re doing something wrong. The visual presentation really doesn’t give me the impression that he’s making the most out of his workout by adding television to it. It gives me the impression that he’s not getting much out of either activity.
“Make the most out of your lunch break,” the voiceover says next, as the scene changes to an image of a woman in business attire sitting on a bench outside and staring at an iPad on which she is watching an episode of Lost. The expression on the actress’s face is marvelously discomforting, and it can’t be unintentional on the part of the advertisers. I wonder what the director said to her. Perhaps, “Try to look as if you’ve just dropped acid and you’re watching a dragon tenderly make love to a unicorn on a bed of rainbows.” They even have her raise a fast food beverage cup into frame and clumsily place the straw in her mouth without so much as moving her eyes. It’s an exceptionally unsophisticated image.
Nobody should look as rapturously mindless while watching television as Hulu has the subjects of its ads look. This is doubly true if the person is outside at the time. With the professional woman as with the man at the gym, the camera stays pretty close, but by all appearances it is a nice day outside. And yet Hulu’s concept of making the most of a lunch break on that day is to focus completely on an escapist fantasy and to never, ever glance for a moment at the sun. There’s no joke behind this as with the alien conspiracy ad; they’re actually saying that.
When I started to notice the popularity of watching television on DVD, I thought that there was something very positive about the changes in the way we consume media. As much as I missed the unifying experience of knowing that the rest of the country was watching the same thing at the same time, I considered it a worthy trade-off, knowing that programs themselves were coming to be seen more as things to be sold directly, rather than just as means of delivering advertisements and thus things to be controlled by advertisers. I liked the idea that Mad Men could make money because it was appreciated by its audience, and not just because it sold products. I realize, though, that that idea isn’t entirely accurate; the advertising is still primary, and it still affects the progress and direction of shows.
Now, not only does secondary advertising still hold sway over good media, but the idea of entertainment as a product unto itself has proven to have a dark side. With companies now profiting not just from the consumption of their own media but from the consumption of media in general, there are advertisers whose job it has become to sell us on the very idea of watching television and movies, and to try to convince us that it’s better for us if we consume more, even as much as possible.
It’s only natural for a company to present its product as eminently beneficial to the consumer, especially in contrast to its competitors. It’s just less familiar, and quite unfortunate, that in the case of products like Hulu Plus, the major competitor is the entire outside world. Consequently, the vision of such products’ ultimate benefit to your life is a situation in which you no longer have a life at all. “Make the most of everything” is a powerfully, and dangerously, disingenuous slogan. Given the haunting images of media addiction presented by such products as Hulu Plus and Digital Copy, about which I’ve written before, a far more fitting tagline would be, “You may as well not leave the couch.”