
Tuesday, September 4, 2012

At Cultural Attractions: Parents Don't Teach, Children Don't Learn

The Buffalo Zoo celebrated the traditional last weekend of summer by offering a ninety percent discount on admission on Labor Day. Since one dollar is something I can just about afford on a good week, I took a holiday-morning bike ride around Delaware Park and then queued up with the mass of people, mostly families with small children, who had just as readily sprung at the opportunity for a cheap cultural activity.

Considering the lines at the gate, I was surprised that the scene inside was not as claustrophobic as it could have been. It took a little jostling or waiting in the wings to get a proper angle, but everyone seemed to get their opportunity to look at the cute, or fearsome, or comic animals. I freely admit that I was mostly there just to take another look at some of my favorite creatures: to watch the polar bear swim in its artificial pond, far from the threatened environment of its natural-born fellows; to grin down at the docile capybaras lounging in the rainforest exhibit; to rediscover my respect for the vulture, which I first discovered when I wrote a report on the species in elementary school; to look for big cats pacing like in Rilke's description of the panther.

But even though this excursion wasn't exactly intended as a fact-finding field trip, I never go to a museum or zoo or aquarium without trying to learn something about the stuff I'm looking at. Not a heck of a lot changes at the Buffalo Zoo from year to year, and I think I had been there about a year ago, so it's not as if I could have expected to discover an animal of whose existence I was altogether unaware. But there's only so much I can commit to memory, so naturally I find myself rediscovering things on subsequent visits to the same places of learning. I always seem to forget, for instance, that the Rocky Mountain Bighorn Sheep are capable of running at up to fifty miles per hour. The upside of my disappointment at not retaining encyclopedic recollections – a failure that seems to become ever-worse as I age – is that I sometimes get to re-experience the joy of learning something interesting all over again.

Even if I don't read all of the wildlife facts, of which there aren't even that many at the Buffalo Zoo, I do at the very least try to get the names of the animals right. This is more than I can say of the vast majority of the other patrons that I encountered yesterday. It having been a year since my last visit, I found myself actively trying to identify each species, endeavoring to commit to memory the ones that escaped me this time around. This is natural to me, and I thought it was part of the essential purpose of going to the zoo. I always took it to be a place where you went not merely to look at animals as in a menagerie, but to find out something about the wider world by discovering what they are and where they come from. I especially thought that that was why parents took their children to the zoo. I'd always assumed that it was meant as a supplement to a child's primary education, a way to instill curiosity and gauge the direction of nascent scholarship. Apparently I was quite wrong about this as well.

Most any time that I go to places like zoos or museums and find myself crowded by children and their adult chaperones, I am downright shocked by the lack of interest that parents have in conveying any information whatsoever to their charges, or even in encouraging those children to learn anything on their own. I fear that my disdain paints me as a killjoy and that the average reader will see me as attaching far too much significance to the conduct of people who are on a simple, light-hearted family outing. But that's just the trouble. I worry that people attach entirely too little significance to such everyday opportunities to influence the character, values, and perspective of impressionable children.

As much as Americans today recognize and lament the widespread failure of education and the failure of modern children to live up to appropriate standards, I think commentators and individual parents are too much inclined to see that failure as institutional and too little inclined to consider it as social and cultural. If the behavior of parents at zoos and museums is indicative of their broader attitudes, it suggests that people have widely forfeited the recognition of personal responsibility for the education of their own children, instead handing that responsibility off to schools as if the process of raising an intellectually astute and ambitious child is something that can be consolidated into a specific set of hours in specific locales.

If that is indeed the view – if the need for education is recognized, but only recognized as being needed somewhere outside the home – then I can only conclude that people don't really value education at all. That is, they don't value education as it ought to be valued, for its own sake, as both a public and a personal good. You can't expect children to learn well and perform at a high level in school if the culture that they're coming up in is one that portrays education as a sort of obligation and something that brings good things to the learner, but is not good enough in its own right to be worth pursuing in absence of the social obligations of homework and exams.

What else can I conclude from regularly observing that perfectly middle class parents, far from exhibiting much intellectual curiosity of their own, don't even respond to the intellectual curiosities of their own children? But perhaps that's a little unfair. At the zoo yesterday I did find one or two adults expressing curiosity to the extent that they pressed their faces to the glass and perplexedly asked of no one in particular, “What is it?” They just didn't express a great deal of interest in actually doing anything to satisfy their curiosity. They just couldn't be bothered to walk back two feet in order to read the damn nameplate.

This is entirely their own affair when the adults are on their own and solely responsible for their own edification or ignorance. But it gets under my skin when their own lack of care for finding answers threatens to be transmitted to a child who is still blessed by wide-eyed eagerness to comprehend the world around him, whatever aspects of it set themselves before him.

Just a few exhibits down from where I heard one unresolved ejaculation of “What is it?” I found myself looking at another glass enclosure that housed three wallabies crouching at the back of their habitat, when a family walked around me to look at the same. It consisted of a couple with a daughter just barely of speaking age and a son perhaps six years old. The parents looked, glassy-eyed, into the scene while the boy excitedly called out “kangaroos!” I had started moving away from the exhibit, but noticing the boy being met with silence, I said simply “wallabies,” partly in hopes that his parents would hear me and realize, if they did not realize it on their own, that their son had made a reasonable but slightly mistaken assumption about what they were looking at.

However, I was essentially met with silence, too, except in that the boy, perhaps hearing me or perhaps just seeking acknowledgment from his parents, repeated “kangaroos.” Noticing that they weren't going to say anything and that their eyes had apparently still not passed over the signs that clearly stated the name of the species, I repeated, with the boy more specifically in mind, “wallabies.” Now looking squarely at me, and inquisitively, the boy again said “kangaroos.” It could not have been more obvious that the child was interested in being corrected. He wanted to learn, as most children do when simply presented with the opportunity. This child was young, but most likely old enough to sound out the word “wall – a – bye” if he knew where to look, and if he was made to realize that he didn't know the answer without looking. But to do that, he would need an example to follow, a pair of parents who had the tools to find out answers for themselves, and cared to give their children the same.

The child looking to me instead of his parents for that meager bit of instruction, I addressed him directly, explaining, “No, these are wallabies. Kangaroos are big; these are smaller.” And at that he turned to his parents and his younger sibling to repeat it to them: “These aren't kangaroos, the man says.” At that I was walking away, and I can only hope that their son's claim finally prompted them to look at the sign and sound out “wall – a – bees.” It was up to them to take an interest on their own, but it seemed to me that the child, being a child, not only wanted to know about these things in the zoo, but wanted others to know about them too.

I experienced the same thing elsewhere. In the crowded rainforest exhibit, I, being a nerd, spoke straight to the capybaras, telling them that I just wanted them to know that they are the largest rodents on Earth, and that that's awesome and they should be proud. A young girl just beside me asked, seemingly of no one in particular, "What are those called?" It could be that she heard me demonstrating some knowledge of them and figured that I had the answer, or it could be that she, like so many young children, thought her parents would have all the answers she sought.

She had not spoken straight to me, and that being the case, I would think that a scientifically interested parent, one familiar with zoos, would say something like, “I don't know, let me look at this information card over here so we can find out.” The parents did not move, of course, so I turned to the child and told her, “Those are called capybaras.” Naturally, she then looked back to her parents and sought to inform them of what they did not inform themselves: “They're called capee-bears.” The parents did not repeat the information; they did not move to confirm it or commit it to memory; they did not give her any indication that she should feel proud of having learned something, that she should be thankful for the knowledge, or that she should seek to learn other things as well.

The desire to learn is so natural and so passionate among children. How poorly we must regard it as a society that students evidently end up so thoroughly dissuaded from eager learning long before reaching the lower threshold of adulthood. What standards can we possibly expect students to meet if we handicap them in all the faculties that might prompt them to aim above the mark? If this culture persists, the most likely solution is simply to expect less of students, as has already been the defining feature of decades of devolution in higher education.

In the future of this culture, we may as well just rename familiar animals to match the absent understandings of parents and their children. Having been to a couple of zoos and aquariums in recent years I've found that as far as doting children and intellectually incurious parents are concerned, every lemur is called King Julian and every clownfish is Nemo. This really aggravates me. My best friend is terrifically fond of the Niagara Aquarium, so I have gone there with her on several occasions. Upon every visit, without fail, one can hear at least half a dozen parents exclaiming, “All right, let's find Nemo,” or, “There's Nemo.” I think I've heard the word “clownfish” used by a parent to a child exactly once.

I have no doubt that some of these parents are just lazy and find “Nemo” easy to remember, but I warrant that a number of them may have good intentions. They're probably trying to use pop culture as a way to facilitate their children's interest in the natural world. But there's more than one reason why this is misguided. For one thing, having been to the aquarium several times, I can see that children don't need some secondary point of reference in order to take an interest in the natural world, because the natural world is terrifically fascinating. And that's especially obvious when you're a child.

So using an animated film as a way of connecting with an aquatic exhibit is extraneous, but far worse than that it obfuscates children's understanding of what they're actually looking at. It disregards the separation between fantasy and reality, it suppresses knowledge of the actual species name, and it encourages children to understand the creature through an individual depiction and not through objective facts. And then on top of all of this, for many families the fixation on something that is recognizable from fiction overrides the significance of everything else that's on display. People walk in the door and say, “Find Nemo!” and they breeze through ninety percent of the aquarium to get to something that won't teach a child very much that he doesn't already know. If they didn't immediately put that idea in his head, they might be astonished by how much he doesn't care about the clownfish once he's seen the solitary-social penguins, the balloonfish with their glittering eyes, the sharks skulking past viewing windows, the challengingly camouflaged rockfish, and so on and so on.

When parents almost thoughtlessly constrain the purpose of visits to zoos and aquariums and museums, they probably think, more often than not, that they are doing it for the benefit of their children, that they are moving to retain a young attention span and provide its owner a quick shot of enrichment while they can. In fact, I think such parents and caregivers should consider that they might have it all backwards and that the feelings of stress and impatience are all their own, and merely projected onto their children. They should concern themselves less with what their children are looking to get out of the experience, and more with what they themselves are after. If the answer isn't “knowledge, and lots of it,” they can probably expect much more of their children's interest in the moment. But they likely won't be able to go on expecting it as those children age in the presence of a society that doesn't care particularly much for learning.

Wednesday, May 2, 2012

The Oxford Comma, Childhood Education, and Me


The coincidences that I encounter these days are not as profound as they once were.  Now it tends to be more along the lines of repeated references to a film I have yet to see, or some negative coincidence like the same last-minute excuse always coming up amongst friends.  A couple of recent, coincidental encounters have compelled me to make something out of a topic of grammatical concern.

I stumbled onto an online discussion recently about the Oxford comma and whether it is or is not grammatically correct, or required.  I later found that another online writer’s personal byline declared him to be “a fan of the Oxford comma,” and having already been given cause to reflect on it, I thought to myself, “Well, hell, me too!”

For those of you who are extremely casual grammarians or who pride yourselves on a 1337 ability to avoid the conventions of written English, the Oxford comma is the comma that comes between the penultimate entry in a list and the word “and.”  Nouns, punctuation, and a conjunction make up a list, and there’s an Oxford comma in this sentence.  Some writers use it, some don’t.  Some style guides require it, some reject it.  Speaking quite generally, both its use and its non-use are acceptable.  It seems to me that many people, either because they haven’t thought about it or because they’re naturally committed to one or the other, don’t realize this.

That even goes for teachers of English.  The reason why I know about the controversy over the Oxford comma is that I remember it being a legitimate point of confusion in elementary school.  I’m fairly certain that when it first came up, the teacher of what I’m guessing was my third grade class told us quite explicitly that there was no comma between the second-to-last and last entries in a list.

I more clearly recall when it came up with a later English teacher, because she didn’t seem to know which was correct, but would not admit to that fact.  She was overseeing an assignment in which students had to add punctuation to an existing sentence, and when she gave the answer she listed the places where each of the commas belonged, paused, and added the Oxford comma to the mix.  Even among a group of nine-year-olds, the class was bifurcated on that answer, so that I cheered to myself over my superior understanding, and my neighbor had to correct his paper.

At this point you may be asking what on Earth this has to do with breaking points.  Well, having thus had an opportunity to reflect on my personal relationship with the Oxford comma, I realize that the way I learned about it might represent something that’s essential to the development of an intelligent, independent child.

You see, regardless of what I’ve become, I was the picture of an upstanding, studious child who did with religious devotion what he was told to do by parents and teachers, and always followed the rules.  That contributed to a marvelously successful academic career, which paid off with a sense of pride for most of the time that I was in school but left me with nothing once I no longer had anyone to obey.

Now I seem to have such a contentious, anti-conformist mindset as to give me a rather hard edge, which acts as a social barrier.  Nevertheless, I remember well the child that always did his homework, developed an earnest rapport with authority figures, and never snuck out at night or dabbled with drugs or alcohol.  In many ways, I am still that child, even though I have a well-developed and eagerly maintained sense of self.  So I know that if I were to finally be injected into a corporate setting, or otherwise put low in a hierarchy that I’m loath to accept, I would still do what I am told to do at most every turn, and do it with sincere deference.

Knowing the kind of child that I was, I sometimes wonder just how I would fare in the Milgram experiments, which, in the early 1960s, demonstrated how easy it is for ordinary people to do monstrously unethical things when directed to by an authority figure.  My life has been unfortunately short on severe challenges to my own morality.  Mostly, there have just been instances where circumstances casually flirted with a scenario in which I might be called upon to either speak up or stand by as a witness to preventable wrongs.  And I’ve always been afraid of my apparent slowness and caution in responding to such situations.

In a lot of ways, I was quite unlike what one expects in a typical intelligent youth.  My aversion to drugs and alcohol, even to sex, has been lifelong, but psychological studies indicate that a curious willingness to experiment with such things is characteristic of a changeable, and thus intelligent, mind.  The saintly boy scout type might prove to be exceptionally good at reciting the rulebook, but that doesn’t demonstrate any real intellectual curiosity.  Rebellion is supposed to be a natural part of adolescent development, but I never experienced it.  My greatest act of rebellion came at twenty-one when I refused to apply to graduate school.

These sorts of contrasts make me wonder if I really have the firm, capable mind that I was always praised for, or if, instead, I am just a terrifically smooth-running machine.  All those subject areas that I was so good at in my primary and secondary schooling – did I really understand them, or did I just repeat what I was told at the same time that I repeated “don’t talk to strangers,” “don’t smoke,” “don’t skip class,” “don’t talk back”?

My worries about the authenticity of my own intelligence are modestly alleviated, however, by the knowledge that insecurity has been a characteristic of virtually everyone for whose intelligence I have had respect in the past.  Whenever I question my skill at or grasp of something, I take a little bit of comfort in remembering the Dunning-Kruger effect – the tendency of skilled people to assume that everyone else is as capable as they are, while unskilled people assume that everyone else is just as bad.

Still, I’m not like the other mentally-capable people I know, and it leaves me with the worry that all along I’ve just been adeptly imitating them, saying the sorts of things they say, following the rules that are supposed to lead to where they are, and generally copying instead of thinking.  After all, the best and the worst of people are the ones who question authority.  The rest are just mediocre.

Of course, what I need to keep in mind is that an essential willingness to question authority doesn’t mean that it’s necessary to do so.  And yet it is necessary to have that willingness, because a constant follower is not one to form his own ideas.  That’s a problem when the ideas that you’re asked to follow are wrong, and it’s equally a problem when you have no firm idea to parrot.  Case in point, the Oxford comma.

I was probably eight years old when I learned how to separate items in written lists.  In retrospect, I take great pride in my reception of that lesson.  More to the point, I take pride in the fact that as a child I was not receptive to that lesson.  The absence of the Oxford comma in third grade English is the first memory that I can dredge up from my spotty personal history of an instance in which I actively, albeit silently, disagreed with a teacher.

I don’t know where I learned that skill so early in life, but I believe that it contributed in magnificent ways to the development of the person writing this today.  A year or so after that first lesson, I defied the prior teacher’s instruction and inserted a comma before the conjunction, because that’s what made sense to me.  I felt then as I do now:  There’s no sense in excluding the comma before the last item in a list, because the conjunction doesn’t fully separate one noun from another.  There are situations in which you might pair two words as a compound noun linked by a conjunction, such as “salt and pepper,” or “soup or salad.”  If such a compound comes at the end of a list and the writer omits the Oxford comma, the reader may mistake the compound’s conjunction for the list’s final “and,” and the two nouns will be inappropriately divided from each other: “chips, salt and pepper” reads as three items rather than two.

In a far less analytical way, I was aware of this at eight years old, and even though I wasn’t intellectually prepared to defend my opinion to an old woman in a position of authority, I at least had the fortitude to let the instruction pass through my ears unheeded.  When my later teacher hesitated over the question, I was vindicated, because I knew then that it was a legitimate area of uncertainty, and I was confident that I had resolved it correctly.

Children need the skill to resolve linguistic and explanatory puzzles on their own, if they are to become intelligent beings.  Knowing what I do about myself, I’m almost certain that if I hadn’t displayed that skill at an early age, I would in fact be the intellectual automaton that I sometimes fear I could be.  In light of that, early childhood education cannot be simply a matter of transmitting information; it must encourage children to resolve questions that the teacher has left uncertain, and even to challenge the claims of authority.

In many circles, this is something that’s explicitly rejected.  We often tend to value pure obedience in our children, discouraging them from questioning until they’re old enough to do so.  That, however, is not education.  The creation of loyal citizens is not the same as the development of clever, critically thinking youths.  The patterns that we establish as children can follow us throughout our lives, and a pattern of accepting things at face value then can make it difficult to pick up the skill of questioning later on.  When it is not deliberately fostered, I don’t know where the impulse to reject false information comes from, but it is enormously valuable to developing minds, and I thank god that I picked it up somewhere.

And I thank god for the Oxford comma.

Friday, March 9, 2012

Horses, Lambs, Children, and Conflicting Ethics

In the March issue of The Atlantic, Darcy Courteau writes about the consequences that have been faced by the horse market in the years since the last slaughterhouse that produced horse meat in the United States was forced to close. I remember that story well, as I felt at the time that I was at odds with what I perceived as a widely shared instance of irrationality in American culture. It seemed absurd to me that the Department of Agriculture should place a value judgment upon the production of horse meat that differed from the one applied to all other livestock.

I disagreed in no uncertain terms with that bit of interference with free enterprise. I disagreed with it on rational grounds despite being a vegetarian and a person highly concerned with animal rights. I simply don’t see how the overwhelming public support for the removal of horse meat and only horse meat from the American market could have stood up to any measure of introspection. It relies on a false distinction between one type of animal and all others.

Being a vegetarian and an animal rights advocate, I want to see that society avoids the mistreatment and slaughter of all animals, not just the ones that I like. How I feel about the creatures is irrelevant; right and wrong are never contingent upon personal attitudes. They may be contingent upon the objective nature of different things, but this doesn’t seem to apply to the situation of horses and other livestock. I don’t see how anyone could realistically argue that horses possess personalities that, for instance, cows or lambs lack, or that horses are better able to experience pain, discomfort, or fear.

Unless one earnestly believes that horses are intrinsically different from other animals, a belief they would have to hold in the absence of real evidence, I can only assume that their impulse to suppress the slaughter of horses while allowing it for other animals rests on the personal relationships people sometimes have with horses.

But that doesn’t really make a difference when we’re talking about just the concept of slaughtering them for meat. It’s not as though opponents of horse meat had personal relationships with this or that particular horse. Some people have personal relationships with particular rabbits, or snakes. It’s not unheard of for someone to keep a pig as a house pet, or to feel affection for a cow that is kept solely for dairy production. Rarely is any of this used as grounds to argue that the entirety of society ought to disallow the killing of or production of meat from any animals of a certain species.

If one recoils with horror at the very thought of horse meat, but never bats an eye when filling his shopping cart with pork and beef, he is wedging an artificial dividing line into the application of his principles. Such selective defense can only be irrational. And if one is concerned with consistency of his own beliefs or ethics, instances like that ought to lead to one of three outcomes: a change in attitude leading to universal application of the principle, even if potentially inconvenient; abandonment of that principle; or production of a satisfactory account of why the dividing line is not artificial.

If a person utterly opposes the production of horse meat but neither opposes the slaughter of all other creatures nor truly believes that the mental lives of horses are significantly and objectively different from those of all other creatures, then that person is trying to hold two contrary views at once: that killing a sentient, autonomous being that’s called a horse is wrong, and that killing a sentient, autonomous being that’s not called a horse is okay.

Cognitive dissonance is the enemy of breaking points. When you give yourself license to hold views that are in opposition to one another, you strip yourself of the crucial motivation for intellectual or moral growth. Breaking points arise out of conflict, and sometimes it is a conflict between two opposing ideas that you yourself maintain. A person who is concerned with rational consistency will keep an eye out for such conflicting views, and his breaking point will entail a sudden realization that either one of his ideas is wrong, or he doesn’t actually know why each of them is right.

In the case of the ranching of horses and the slaughter of them for meat, the tension between views goes well beyond the simple difference in perception of horses and other animals. Cognitive dissonance is easy when you’re operating on pure intuition. When those intuitions are directly challenged by pragmatic concerns, it’s much more difficult to make glib pronouncements that a certain action is simply wrong. Courteau writes of the fallout from the closure of the last US horse meat producers:

“In states across the country, reported cases of equine abuse, neglect, and abandonment skyrocketed. And the kill buyers of yesteryear aggregated into rarer but still more haunting boogeymen, purchasing for the abattoirs of Canada, or, worse, Mexico, where horses at some slaughterhouses are reportedly subject to torturous conditions.”

Consequentialism makes for complex ethical calculations, and if one wishes not only to release the United States from the stigma its citizenry attaches to the slaughter of horses, but to actually reduce the suffering experienced by American horses, then such a person’s intuition that it was good to force closure of the slaughterhouses is probably in error. But that error and the larger error of deliberate cognitive dissonance are both based on the same mistake of thinking that your knee-jerk intuition is sufficient grounds for all moral judgments.

When one really starts to analyze the consequences of people’s intuitive moral pronouncements, one sees that cognitive dissonance is quite easy to come by once all the nuance of principle and pragmatism is taken into account. In other words, what a person thinks is wrong often fails to align perfectly with why he thinks it is wrong. We cannot permanently avoid the moral burden of having to occasionally choose the lesser of two evils. Unfortunately, this doesn’t seem to occur to many people who have non-inquisitive, black-and-white views of morality.

The other night, I was watching the documentary Sweetgrass, and the depictions of some of the operations at the sheep farm brought to mind these same questions of ethical complexity. The opening scenes of the film largely focus on the beginning of lives for sheep on that farm, and I was somewhat shocked by the ranchers' dismissive treatment of both newborn lambs and nursing mothers. But if one watches with a measure of objectivity, one quickly comes to realize that given such a high volume of sheep, the farmers are doing what they can to promote the survival of the highest number possible.

Some years ago, I had a good friend who was a devout, even zealous Buddhist. He was exceptionally sensitive to implications of animal mistreatment, and aggressively, immediately judgmental of perceived wrongdoing. It occurred to me while watching Sweetgrass that he certainly would have found the farmers’ behavior to be unforgivable, but that any alternative behavior that would have resulted in the survival of fewer sheep would have elicited just as much disdain from him. While their rationalizations were grounded in Buddhism instead of Christianity, this friend’s social and political views were decidedly conservative, and probably didn’t differ very much from those of his Christian parents.

His moral judgments, like those of many conservatives, and indeed like those of many people of any political leaning, were severely averse to nuance. I recall discussing abortion with him on one occasion and using the word “complex” to describe the breadth and seriousness of the associated ethical questions. That evoked fiery indignation from him, and he said, “No. You can kill or you can not kill. It’s actually really simple.”

And it would be simple if that’s all it came down to, if there weren’t any genuine questions about what qualifies as killing, if there weren’t any other ways of being responsible for another creature’s suffering. What my friend believed seemed simple on the surface, but at a deeper level of analysis it becomes clear that he was keeping it simple by ignoring the hard questions.

No doubt he would have agreed that his moral concern was with decreasing the suffering of sentient beings, a utilitarian concern. That view means it is reprehensible to do anything that promotes or permits the death of, say, sheep or horses. But it must also make it reprehensible to do anything that promotes or permits the hunger or severe discomfort of the same creatures.

In the case of the sheep in Sweetgrass, keeping all of the lambs alive meant separating them from their mothers immediately upon birth, forcibly compelling ewes to nurse lambs to which they had no connection, and hastily handling the creatures as if they were inanimate objects. The alternative would have been to handle them more delicately, more compassionately, but chances are that in light of the enormous numbers of sheep that needed to be handled by just a few farmers, that would have resulted in some of the lambs being neglected, and thus starving or being killed by competing sheep.

Both alternatives may well be similarly unethical, but it’s unhelpful to reject whichever alternative is currently in practice simply on the basis of its perceived wrongness. The choice of one wrong action is, in cases like this, the direct consequence of the rejection of another.

It may strike some people as hideously dehumanizing to draw such a parallel, but the pragmatic circumstances surrounding the abortion debate can be elucidated by thinking of the entire human race as a correlate to a herd of livestock. As a population increases, the rate of survival within that population, or at least the average utility available to each individual, naturally decreases. Mandating the birth of more young is tantamount to mandating the provision of more suffering. A person who opposes either abortion, or the neglect of newborn lambs, or the slaughter of horses doesn’t have to accept that fact as a justification of the contrary position, but he does have to acknowledge the consequences of what he’s advocating.

In fact, I find that most people refuse to do this. They are, instead, happy to embrace cognitive dissonance, presumably because it is easier to live in a fantasy world in which right actions never have unintended consequences than it is to willfully struggle with moral dilemmas. That perception, however irrational, may help an individual to remain admirably committed to his own ethical obligations, but it also results in unfair judgments passed upon others.

It’s not rational to demand that a creature with little access to resources must both birth its child and feed it. The acceptance of cognitive dissonance results in dissonant demands and no-win situations. That is the cognitive dissonance of, for instance, anyone who repudiates abortion without compromise, but also rejects the provision of welfare. Essentially, the two views in concert pronounce that it’s wrong both to terminate a pregnancy and to have a child while poor.

Again, a rational person whose views are at odds with one another must apply the relevant principle, abandon it, or explain how they can be reconciled. In the given case, if a person claims the principle of defending the lives of innocents, he must apply that principle by providing material support to unsupported children. If that is too inconvenient, he must rethink his stance on abortion, or else explain why it’s worth defending an unborn child but not one who has truly entered the world.

It’s not easy to decide upon coherent ethical theories as to what constitutes right and wrong, but even once you have, it’s not easy to determine how to apply those theories. If you want children to have both a chance at life and at least basic comfort once they’ve begun that life, you’ll eventually have to confront a situation in which those desires stand in opposition. If you want the lambs to avoid both starvation and mistreatment, you’ll be horrified, when you look closely enough, to realize that it sometimes takes one to avoid the other. You can save the horses from the abattoir, but you may thus doom them to a stable that does them even greater harm.

There is a certain sense in which my Buddhist friend’s pronouncement is still correct. It’s very simple: you can either kill or not kill. But the operative word there is “you.” The individual often has privileges that are absent to society at large. You can choose to carry your own unintended pregnancy to term, but if you can then feed that child without fail, you’d better thank God that you never really had to face the choice between depriving a child of life and subjecting it to exquisite hardship. And you can’t conflate either situation with the broader hypothetical in which the nation is inundated with a million additional young lives that must be supported and defended.

If you raise horses and you’re uncomfortable with them being either slaughtered or abused and underfed, you can do as Ms. Courteau’s father had always done and refuse to sell them to kill buyers. But when such sales are no longer an option and the reduced demand causes the prices of horses to fall, lowering your revenue to the point where it is no longer possible to take adequate care of the horses you have, the dual ideals of defending all life and defending against all suffering are no longer sustainable.

This has been the situation of horse farming in the United States for the past four or five years. I remember it being mentioned by some as a possible consequence at the time that the last slaughterhouse dealing in horse meat was closing. But mostly I remember objecting to the irrationality of it all. I remember this very well, but somehow I missed the fact that Congress resumed funding for these slaughterhouses in November, which may result in some reopening this year.

I won’t be happy to see domestic horses go back to slaughter. Indeed, I hope that someday in the far-distant future they all close again, but that they do so then right along with those that deal in every species of animal, and that it be on the basis of the universal application of moral principles, not on the basis of an absurd double-standard.

But despite the fancifulness of that hope, I’m not naïve about the implications. I know that many animals will suffer and die from lack of care during any possible transition away from their slaughter and consumption. But if I could be alive when that time comes, I would say that that is the unhappy consequence of doing right in a way that is more crucial to our future moral standing. It is a great tragedy of the social aspect of moral existence that we sometimes have to prioritize our values against one another. But our collective morality gains not a bit from pretending that there is no such problem.

The nuanced demands and consequences of collective ethics are discomforting, in that they may require us to accept things that don’t feel right to us. Intuition is a powerful tool in making moral judgments, but it can only lead us so far. If it guides a situation towards less obvious but more serious harms, we’ve probably made the awful mistake of eschewing rationality in order to appease the short-sighted demands of immediate perception. Only reason, and not intuition, is capable of handling nuance and recognizing indefensible cognitive dissonances.

Rationality is a skill that must be learned for the sake of coherent, far-reaching moral behavior. It draws the dividing line between those who merely think they are doing the right thing and those who truly are, even if they appear not to be.

Courteau writes of the reversal of the double-standard regarding horse meat, “Many pet lovers are furious, but PETA actually supports the reversal, arguing that the suffering of unwanted horses increased after the demise of the kill plants.” If PETA, which is often so prone to over-the-top displays of self-righteous, black-and-white morality, can learn the value of nuance and circumstance, anyone can.

Monday, March 5, 2012

Immorality of and for Children

Moving about my town this weekend, I made two markedly unpleasant observations, which were quite distinct from each other, but also meaningfully connected. They both spoke to the deplorable effect that many adults have upon the children growing up around them, in the one case through the influences they indirectly exert upon them, and in the other through what they willfully do to them.

When I had just gone out of my home to catch a bus and go to meet a friend, I was walking down the principal street of my neighborhood and I saw a ten year-old boy turn to stare openly and at length at the backside of a seven year-old girl who had walked past him. Now, it could be that there was some other context that I was missing – he may have recognized her from elsewhere but been too shy to call out to her – but to my eye his behavior was indistinguishable from that of the appallingly many men I have seen stop in their tracks and follow with their eyes the receding course of a woman they find attractive.

The young boy didn’t appear to be simply looking; he appeared to be leering, and I know all too well what that looks like. It’s been so commonplace in recent years that there’s no longer any deluded part of me that’s willing to pass it off as an anomaly when I see another man doing it. It’s become a social trend, and in turn I’ve become pretty consistent in reacting to it in some fashion when I see it. That reaction only rises to the level of staring crazily at the unabashed lecher, but my hope is that by thereby calling attention to the fact that he’s not invisible to the world just because the object of his ogling has her back turned, I can help to instill a slight sense of shame.

To do so seems like an even stronger imperative now that I’ve seen a young boy exhibiting the same brazen rejection of self-restraint. After all, the boy was about ten years old, and the object of his leering about seven. Unless human biology has changed far more than I realize, there’s no way that he has sufficiently developed sexuality to provide him with a strong instinctual desire to look. Even if there was, that instinct would direct his attention toward a woman with fully developed secondary sexual characteristics, not a child like himself.

The logical conclusion, as I see it, is that the boy was showcasing an environmentally learned behavior. The vulgar social trend of open displays of unchecked lust is probably self-progenitive, like many social behaviors, and will grow and worsen in communities where it is not combated. What I observed was a ten year-old boy having learned lecherousness before he ever learned about sex, and perhaps before he’d so much as heard the word “hormones.” It is a truly hideous culture that allows its youth to inherit vices before they inherit any reasons for indulging them. And that is a trend that is only interrupted by adults within the culture being mindful of the behaviors that they put on display to their children, and seeing that the indulgence of common vices never outstrips the reasons for them.

While the incidental corruption of youth by the action of a collective culture is awful, at least there is a plea of ignorance to be made. What is worse still is putting the worst of oneself on display in full awareness of the fact that a child is the main witness or the direct object of it. I was coming back with my friend from where we had met up, and we had to wait a few minutes in the Metro Rail station before transferring to a bus. Other passengers emerged from the tunnel with us, and most of them headed straight out to the street. A boy who may have been as young as four, accompanied by a woman whom I presume to be his mother, was among them, but the two stopped short inside the door at the behest of the woman’s sudden and exceptionally severe shouting.

“Are you serious?! Tie your goddam shoe! I’m fucking sick of this shit! Tie your goddam shoe! And it better fucking stay tied this time, or I’m gonna beat your ass!”

She delivered these commands and threats with lengthy pauses and with repetition, so that the entire affair lasted thirty seconds or so as the boy sat on a bench and tied his shoe while she stood imposingly over him, doing nothing but staring down with a fury that never relented. I stood nearby and glanced repeatedly in their direction with a similar, but I think righteous, fury in my eyes. But that was it; I reacted in the same way that I tend to react to lechers on the street, which I steadily realized was not good enough as I watched them go.

As always seems to be the case when there is a subtle but significant opportunity for me to stand up for something, I found myself regretting my prolonged silence for a long time after the fact. When these demonstrations of immorality spring themselves upon me, it tends to take me time to process what I am witnessing. And in this case, I wrestled silently with the situation for too long. It’s one thing when someone is harassing a stranger, but another when some public conflict is between friends or among family. The lack of known circumstance makes me reluctant to insert myself into a situation that does not concern me. Perhaps there are issues involved that I don’t understand.

In this case at the rail station, my moral compass wobbled terribly because of the fact that it was the woman’s own child at whom she was directing her aggression. I’ve always found that there is a common but flawed cultural assumption that people have special rights and privileges in dealing with their children, and that it’s almost never the place of the community to insert itself into another person’s parenting. But recognizing the common assumption as flawed doesn’t mean that I entirely avoid being influenced by it. The effect is evidently that I feel I must be quite sure that a situation rises to the level of unjustifiability, as by involving physical violence, before I confront wrongful actions against one’s own child.

Unfortunately, when the aggression doesn’t cross the line from threats to physicality, I’m compelled to make moral, rational, and probabilistic calculations before my perception of the situation reaches a breaking point at which my mind exclaims, “of course there’s no justification for that!” Of course there was no justification for this woman screaming at her four year-old child because his shoelaces had come undone. He’s four. He probably learned how to tie a bow just months or weeks prior, and clearly he wasn’t getting any help from his mother in perfecting the craft. Her assistance took the form only of demeaning criticism and public humiliation, and even if that isn’t the normal dynamic between them when the child is struggling with something, her response isn’t justified even in an isolated case.

I wanted to defend the child against the maternal onslaught he was absorbing, and it would have been worth doing so not just for the sake of protecting his fragile emotions, but perhaps more so for the sake of protecting his malleable mind from being warped into the image of the insanely hot-headed, irrational woman who is raising him. The aggression hurts the child in the short term, but he’ll get over it. Kids are resilient. But at the same time, dealing with his problem by doing nothing more than shouting at him to fix it or suffer the consequences gives the impression that that’s the best – perhaps the only – way to solve further problems. One day, that child will grow into a man who holds power over someone else in some situation, and if his mother’s treatment of him is indicative of the overall environment that he’s living in, there’s a definite risk that he’ll command that power without reason or restraint.

At a higher level, there’s a terrible social consequence to the message that’s sent by the parenting techniques that the woman put on public display that night. The black mother and son, being in Buffalo, were almost certainly from a background of low socio-economic status. A cycle of enforcement that says “solve your problem or suffer the consequences” is indicative of a tragic victim-blaming tendency that even operates inside of disadvantaged communities. Rather than doing anything to help the boy become more practiced at tying his shoes, his mother merely insisted that he do it better, implying that worse consequences of failure would be as good as greater opportunities for success. One wonders if she will offer the same message when he needs help on his homework, or when he’s looking for a job, or when he needs a social support system. There is an implied resignation there, accepting the assertion that there’s something wrong with the individual, or the race, or the community, and that until such time as that changes, there’s little point in trying to help them to better outcomes.

Every moral choice that we make – with respect to our children, our neighbors, within ourselves – begins the alteration or supports the preservation of the way things are at the level of the family, of the community, and throughout the culture. I failed to decide quickly to step up to the woman and insist that she stop screaming expletives at her child and start actually raising him, in hopes that he’ll be even better than she is. And in that failure, I missed an opportunity to put a new nick in the structure of the world as it is. I feel as though had she stayed around another moment I would have been past my breaking point, but as it was she stalked off quickly enough that I barely raised my voice before she was through the door. However, her child trailed behind her, and I saw that he looked squarely back at me as he was going out. In the absence of having truly stood up against an example of horrible stewardship of our children, I comfort myself with the hope that the boy himself recognized my indignation for what it was, and that even as he followed his raving mother, he realized that not everybody is the same, that there are other sorts of people that he can grow into.

Friday, February 17, 2012

Simplistic Thinking from Educated People: Arne Duncan


Every time a representative of the government goes on the television or radio to talk about higher education, my blood boils a little at my recognition of the simple-mindedness that governs policy in that area. On last night’s Daily Show, Jon Stewart’s guest was Secretary of Education Arne Duncan. At the very end of the portion of the interview that went to air (the entire thing is available in three parts on the web), Duncan made the most indefensibly black-and-white assessment of the outcomes of education that I have yet encountered.
First, though, he pointed out that the United States is now ranked 16th in number of college graduates, whereas a generation ago it was in first place. He further explained that our rate of graduation hasn’t fallen, but has leveled off, allowing fifteen other countries to surpass us. Now, after a good deal of research, I’ve found that different reports come to different conclusions on the exact ranking, and they base those rankings on different criteria applied to different countries, so I can’t pin down exactly which countries beat out the US on this subject, or even whether Duncan is quite correct with his statistics. But it’s certainly the case that we’re far from the top, and some countries can be pretty conclusively identified as exceeding us in provision of tertiary education.
Duncan’s point is apparently that our achievement of benchmark standards for secondary education is insufficient to prepare students for college and university. I’ll eagerly agree that that’s true, but it is unhelpfully presumptuous to assume that that’s the only important factor contributing to low levels of higher education attainment. What of the steadily climbing costs of college tuition and the dearth of public funds to compensate for the out-of-pocket expense for students and parents? Might that not hold back some perfectly capable students from actually obtaining the education that they’re intellectually, but not financially, suited for?
Among countries in the Organisation for Economic Co-operation and Development, the United States is 29th out of 34 in terms of how much funding for educational institutions comes from public funds. Not only is this situation accepted by US society, it is lauded by some elements thereof. Private institutional dominance of tertiary education, and indeed of all segments of society, increases competition and improves outcomes, they say. But with the US ranking somewhere around 16th in educational attainment, it’s clearly not working that way. In fact, among the nations that are fairly reliably ranked well ahead of the US on this point, many are classed as those nations that conservative Americans tend to envision as socialist hellscapes.
Several Northern European countries are variously placed in lead positions on the list, including Iceland, Finland, Denmark, Norway, and the Netherlands. What’s more, an Economic Policy Institute study of the affordability and accessibility of higher education in various countries concludes that “Finland and the Netherlands should be models for the international community” when it comes to both of these factors.
The correlation between cost and completion rates is not overwhelming, but it is sufficient that it needs to be explored as a factor, rather than being discounted among the ongoing repetition of the claim that if kids are smarter, they’ll always do better. There are other factors, and to deny that is to accept such unforgivably single-minded approaches to solving our problems as will only worsen some aspects of the situation. It’s not just that we’re failing at educating our children, though certainly we are doing that. It’s also that we’re failing to provide our children with suitable opportunities, access, and incentives.
Duncan seems to be under the mistaken impression that the problem underlying our trend of slipping behind the rest of the post-industrial world is just that students are failing at an alarming rate. But it’s not just dropouts that account for the low completion rate; the US ranks behind most of the OECD countries in terms of actual enrollment in higher education. And that fact is specifically ascribed in part to rising costs. That should be fairly obvious, especially to a Harvard-trained economist like Arne Duncan. As opportunity costs rise, the rational motivation for people to invest in something goes down.
The response to this would probably – nay, certainly – be that the opportunity costs of not attending college are unquantifiably higher than the material costs of attending. To that I would offer the simple challenge: prove it. The claim is repeated in the media constantly, always asserted, always assumed, but never adequately proven. And it would be one thing if the assertion was just that, on average, people with higher education backgrounds tend to do better than those without them. But that’s not what representatives of the administration say. Instead, they spread the hideously uncritical idea that if you get a college degree you are guaranteed success, and if you don’t get one you are guaranteed failure.
Do you think I’m mischaracterizing their claims? Arne Duncan said it on the Daily Show: “We have a million young people dropping out of school every year. A million. There are no jobs. None. They are guaranteed poverty and social failure.”
Guaranteed, he says. That there are any guarantees in life is an odious and socially detrimental lie. Virtually nobody would argue that people aren’t better off overall if they’re educated. For my part, I think that education is the most important thing that a person can pursue in life, though I am careful to emphasize that there are different ways of pursuing education, some far less expensive than others, and that education can serve a variety of ends, from vocational training to living a richer, fuller life, even in poverty. But the universal economic benefit of higher education is a baseless assertion so long as there are other explanations for a portion of the correlation between education and earnings, and other alternatives as to how hiring and job training might take place.
Now, Arne Duncan wasn’t very specific when he said “a million students dropping out.” If he was referring to students who drop out of high school, sure, they have their work seriously cut out for them if they want to be materially or socially successful. However, I’d still consider it irresponsibly closed-minded to say that both poverty and social failure are absolute guarantees for every child who has dropped out of high school in recent years.
Even working at a fast food restaurant can eventually allow a person to make a living wage, as long as he or she doesn’t rush to have children or otherwise climb into a hole that can’t be escaped through years of earnest work and eagerly sought promotions. What’s more, I’ve known people who’ve dropped out of high school and then obtained GEDs earlier than when they would have theoretically graduated. Hell, my ex-girlfriend never finished high school, and she leapt easily from job to job, quitting without notice and being hired for positions with higher pay, more responsibilities, and better titles, all at a time when I, with my fancy NYU degree, couldn’t so much as secure an interview for anything more than an eight dollar per hour retail job. Some people are just lucky; some just aren’t.
Regardless, I don’t think Duncan was referring to high school dropouts. The only statistics that I could find on short notice were from the 2004-05 school year, at which time 540,382 students dropped out of school between grades nine and twelve. Unless that number has doubled in seven years, I think Duncan was referring to any student who has dropped out at any level, primary, secondary, or tertiary. If so, some of the Americans who have been guaranteed poverty and social failure according to Arne Duncan include billionaires Bill Gates, Paul Allen, Steve Jobs, Ralph Lauren, Dean Kamen, and Mark Zuckerberg, as well as a pretty extensive list of other highly successful individuals in a variety of fields.
This repetition of a shockingly simplistic set of talking points about higher education has got to stop. Is learning good? Christ, yes! That part is perfectly simple. But it’s not a purely economic good, and to whatever extent it does improve your income potential, that’s not the only factor. There is something to be said for the influence of social connections, environment, work ethic, opportunity, investment capital, employer bias, and plain old luck. Amidst all of that, what I want to see happen is that kids start going to school not because they want to make money, but because they want to learn. Is it really too much to ask that we encourage education on those grounds, rather than trying to deceive every young person into pursuing something that he’s not interested in and at which he’s no good?