Still Whistling ‘Dixie’

As the United States approaches the 150th anniversary of the end of the Civil War, it has become increasingly common for relics from the old Confederacy to recede from public view.

While there are undoubtedly certain corners of America in which warm feelings toward the slave-owning Deep South still burn, as a general rule, a given locale or organization today has precious little to lose—and often much to gain—from abandoning whatever residual Confederate loyalties it might yet possess.  Particularly when it is under public pressure to do so.

But what happens when the entity in question is so deeply and inextricably tethered to a component of the Confederacy itself that to renounce such ties would be to hollow out its own soul?

It looks like we’re about to find out.

Down in the sleepy Virginia town of Lexington, there lies a small liberal arts college called Washington and Lee.  Founded in 1749, the school assumed George Washington’s name in 1796, following a hefty donation from the man himself.  When the Civil War ended in 1865, the school recruited Robert E. Lee, the former general of the Confederate Army, to be its president.  Lee accepted, and held the post until his death in 1870.

So mighty was Lee’s impact in transforming Washington College into a serious and respected institution of higher learning that the place was swiftly rechristened Washington and Lee University in his honor.

To this day, W&L defines itself by the general’s personal code of conduct from his days as chief administrator.  “We have but one rule here,” said Lee, “and it is that every student must be a gentleman.”  (The school has been co-ed since 1985; today, the women outnumber the men.)

From this credo, W&L maintains an honor system that most American students would find both odd and terrifying, and the result is a university that ranks in the top tier of just about every “best colleges” list and, according to at least one survey, boasts the strongest alumni network in all the United States.

(Full disclosure:  My younger brother is one such alumnus, and, in point of fact, has become as much of a gentleman as anyone I know.)

Against the clear benefits of a university adhering to the values of this particular man, there is at least one equally obvious drawback:  the fact that this same Robert E. Lee spent four years fighting in the defense of slavery in the United States.

Whatever his personal views might have been about America’s peculiar institution—they were complicated, to say the least—Lee functioned as the seceded states’ rebel-in-chief during the climactic months of the war, thereby endorsing the proposition that the holding of human beings as property was a principle worth fighting, dying and killing for.

If a university is prepared to assume the totality of a man’s strengths as part of its core identity, must it not also be prepared to answer for that man’s most unattractive faults—not least when they involve the trafficking and torture of people he would otherwise wish to be educated?  Can this wrinkle in Lee’s makeup really be so easily glossed over?

Undoing such an intellectual compromise is, in so many words, the primary intent of an intriguing new list of demands, submitted last week to the board of trustees by a group of seven W&L law students calling themselves “The Committee.”

To be precise, these stipulations are for the school to remove the Confederate battle flags that adorn the inside of Lee Chapel, where the late general is buried; to prohibit pro-Confederacy groups from demonstrating on school grounds; to suspend classes on Martin Luther King Day; and, perhaps most dramatically, to “issue an official apology for the University’s participation in chattel slavery and a denunciation of Robert E. Lee’s participation in slavery.”

Doth the Committee protest too much?  Does W&L have a moral obligation to the whole story of Robert E. Lee, and not just the bits that serve its interests?

It is critical to note that, in its official policies and practices, the school today cannot credibly be accused of harboring neo-Confederate or anti-black biases.  (In its letter, the Committee refers to “racial discrimination found on our campus,” but does not cite specific examples.)

The town of Lexington, which has historical ties to Stonewall Jackson as well as Lee, naturally contains many citizens who hold such repugnant views, and who sometimes express them through marches or other forms of public demonstration.  However, this is not, as it were, Washington and Lee’s problem.

It is precisely because W&L makes no formal overtures toward the pre-war South’s view of civilization that it could seemingly afford to differentiate its latter-day founding father’s virtues from his vices.  The university’s president, Kenneth P. Ruscio, suggested as much in a magazine article in 2012, writing, “Blindly, superficially and reflexively rushing to [Lee’s] defense is no less an affront to history than blindly, superficially and reflexively attacking him.”

So why not put real muscle behind this plea for historical nuance by acceding to the Committee’s fourth and final demand (if not the first three)?  What does W&L stand to lose by looking reality in the eye and acknowledging a few unpleasant facts?

Wouldn’t that be the gentlemanly thing to do?

Back to Normal

When I was in college, Marathon Monday simply meant getting drunk and having a great time.

The Boston Marathon begins in Hopkinton at 10 a.m.  For us in our dorms near Kenmore Square, that meant waking up at 9, breaking out a 30-pack of Bud Light a few minutes later, and eventually hobbling over to Beacon Street to see the race’s leading men, women and wheelchair racers pass by, followed soon thereafter by 20,000 or so runners-up.

It’s a perfectly sensible tradition in our fair city.  Drinking beer with friends is a joyous experience, and cheering on thousands of ironclad runners is a joyous experience as well.  To do both things simultaneously—well, the word “orgasmic” would not be too far off.

The theory is that the third Monday in April—Patriots’ Day, as it is officially known—is the one day when Boston cops don’t bother citing people for public intoxication.  (Note:  This is not necessarily true.)  For spectators, the Boston Marathon is such a merry, mellow and family-friendly event that any outbreak of inebriation is of a decidedly harmless and good-natured sort.  (Patriots’ Day is also understood as the one time in which you can drink in the morning and not be considered an alcoholic.)

A decade and a half earlier, the relevant marathon liquids were water and orange juice, which my short schoolboy self would help distribute to runners along the course.  This was an especially high honor in 1992, when my father and uncle ran the marathon together.  I remember it well:  I had lined up two plastic cups along the curb—one for Dad, one for Uncle Roy.  As soon as we spotted the duo coming up the hill, I dashed for the first cup, handed it off to Dad with perfect precision, then dashed back for the second.  When I returned to the edge of the street, expecting Roy’s outstretched arm, he and Dad were both long gone.  I guess they had some place to be.

There are plenty more Marathon Monday stories I could recount, and they are all indicative of the Boston Marathon’s core cultural purpose, which is to bring together virtually every resident of the Boston metro area in a display of total, unadulterated gaiety.  If you aren’t an actual participant, you attend the marathon for no reason except that it’s so goddamned enjoyable.  For all the boozing and tomfoolery, it’s just about the most innocent mass gathering in all of the United States.

And now, of course, it’s not.

As the city executes its final preparations for Monday’s race—the first since the moment when last year’s went horribly, horribly wrong—we are forced once again to deal with this concept known as the “new normal.”  Like boarding an airplane or sending an e-mail, the act of watching (let alone running) the Boston Marathon is no longer as innocuous or carefree as we long assumed it to be.

As a consequence of last year’s madness, this year’s festivities will be subject to extraordinary security provisions, including new restrictions on bags and other personal items, random searches by police, and prohibitions on strollers, large bottles and costumes.

Some of these regulations are perfectly reasonable; others seem needlessly excessive.  In any case, they illustrate how the Boston Marathon has joined the ever-growing list of public spaces subject to uncommonly intense scrutiny by the authorities, in the interest of keeping the peace and ensuring that nothing goes awry—a task that is ultimately impossible, since any marathon is an inherently open event.

In essence, the “new normal” is about the tension between security and freedom, with the implication that the former has taken precedence over the latter.

On better days, I take the view that safety in America has always been something of an illusion, that one assumes a million and one risks the moment one steps out the front door, and that the notion of national “innocence” is an absurdity that never existed and never will.

And yet when it comes to the Boston Marathon, I prefer the old normal.  I wish we could have it back.  I wish we didn’t have to think about the possibility of terror and violence at such an otherwise happy occasion, and I think it’s an obscenity that it took just two people with one bad idea to force us to think otherwise.

But our hand has indeed been forced, and there is no turning back.  As such, we are left with the second-best course of action, and that is to descend upon Monday’s race in record numbers and have the time of our lives.  Just like we always used to.

Traditions Passed Over

The Passover Seder is among the most sacred, enduring and universal of all Jewish traditions.

It is worth noting, then, that no two Seders are ever exactly alike.

When I was younger, for instance, my family’s service would be led by my grandfather, who had us read very solemnly from an ancient edition of the Passover Haggadah, replete with arcane, sexist language that we kids could not begin to understand.  Our recitation of the Exodus story and its implications left no detail unuttered.  Including the meal, a Seder begun at 6 o’clock could be expected to carry on until well past 9.

In more recent years, my folks, my brother and I have sometimes joined close friends of ours in their more modern, “family-friendly” event, featuring a homemade, illustrated version of the Haggadah that abbreviates and clarifies the text, eliminating the dull, sluggish bits while emphasizing the songs and encouraging audience participation—not least in flinging plastic frogs at each other while recounting the Ten Plagues.

This year, our clan was graciously included in a large-ish gathering that took the “do-it-yourself” approach several additional steps.  The “Four Questions” were asked not only in English and Hebrew, but also in Spanish, Polish, Latin and sign language (even though no one present was deaf or foreign-born).  The singing of “Chad Gadya” became a competition as to who could complete the most verses in a single breath.  (The eventual winner nearly fainted in the process.)  The hidden afikoman, or middle matzo, was found not by the children, but by one of the host’s teddy bears.

This is a mere sampling of the Seders I have personally experienced here in my own tiny corner of Judaism.  How the world’s other 14 million or so Jews conduct their annual Passover observances, I can only guess, but I suspect that they, too, are all over the map.

Admittedly—crucially, in fact—all of the disparate spins on Passover described above adhered to the same general rubric, and all contained the same essential elements:  the Exodus narrative, the Seder plate, the cup for Elijah, and so forth.  You might say the differences from year to year were ones of style more than substance.  And as many would argue, you can alter a holiday’s details without destroying its essence.

Except that for many Jews, the details are the essence, whether during festivals like Passover or a typical Shabbat service.  In the minds of folks like my late grandfather, one must never stray from the original script; it would be an insult to our ancestors if we did.

This mindset looks upon alternative approaches to Judaism with a mixture of sadness and contempt, viewing them as acts of cultural and religious effacement.  Owing to the Jewish people’s history of being nearly exterminated over and over again, historical continuity is essential—a means of bridging one generation to the next.

And yet I, for one, have drawn far more meaning from our recent “revisionist” Seders than from the old-school, rabbinically sanctioned ones of my upbringing.  They are more enjoyable, yes, but also more adept at communicating Passover’s actual significance, thereby imparting to us why we bother to observe it in the first place.

Is tradition-for-tradition’s-sake really more important than ensuring that the basis of the tradition is widely understood?  Don’t let anyone tell you this is an easy question to answer.  It most assuredly is not.

The tension between old customs and new sensibilities is real, and it assumes many forms.  Further, we can probably abandon any hope that such a clash will ever completely go away.

To wit:  We young people can pooh-pooh the “we’ve always done it this way” argument all we want—as a supporter of gay marriage rights, I do this quite often—but what happens when we’re faced with people for whom the very fact of an act’s infinite and unchanging repetition is what gives the act its meaning?

What happens when it’s our own sacred traditions that fall under scrutiny?  Will we be as susceptible to change as we demand others to be?  What makes us so special?

The Seder I attended this week was as memorable and entertaining as any I can recall, organized and led by people who take their faith seriously but also aren’t afraid to defy certain conventions for the sake of setting a lively table.

Yet as I shot a plastic green frog into my brother’s wine glass, I could faintly hear my grandfather’s harrumph of disapproval in a back corner of my mind, and I had to concede that his view of what constitutes a proper Seder is as valid as anyone else’s.

What is more, he could rest assured that, even at this table, at least it was Manischewitz in the glass.

Muhammad vs. Ali

Meet Ayaan Hirsi Ali. Born in Somalia in 1969, she was subjected to genital mutilation at age 5—among other physical abuse—and, later on, forced into an arranged marriage to a distant cousin. That is, until she escaped to the Netherlands in 1992, where she worked various jobs that eventually led to a seat in the Dutch parliament and a career as a campaigner for women’s rights—particularly within Islam, her religion of birth that she would ultimately renounce.

In 2004, she penned the screenplay for Submission, a short film criticizing the treatment of women in Muslim culture. The movie’s director, Theo van Gogh, was subject to an especially harsh critique in the form of being murdered in the street by a member of a Dutch terrorist organization called the Hofstad Network. What is more, attached to the knife that killed van Gogh was a note to Hirsi Ali, informing her that she was the next person on Hofstad’s hit list.

As a result, Hirsi Ali briefly went into hiding, before resuming her work as an advocate for the empowerment of women, including by founding the AHA Foundation in 2007, “to help protect and defend the rights of women in the West from oppression justified by religion and culture.”

For these efforts, Hirsi Ali was to be given an honorary degree from Brandeis University next month. However, last week the distinguished Waltham, Mass., institution opted to un-invite her from its commencement exercises, following on-campus protests by students, faculty and others.

What was their grievance? It was that this woman, who had spent the balance of her adolescence being tortured by practices ordained and justified by a particular wing of Islam, has had a few disparaging things to say about Islam.

Case in point: In a now-infamous 2007 interview in Reason Magazine, Hirsi Ali asserted, “I think that we [in the West] are at war with Islam,” that the religion is inherently violent and extreme, and the only way for Islam to “mutate into something peaceful” is for it to be “defeated.” In a separate interview in the same year, she called Islam “a destructive, nihilistic cult of death.”

Naturally, the wide dissemination of these contentions led many to tar Hirsi Ali as hateful, bigoted, Islamophobic and all the rest. Petitions were circulated across the campus, demanding the school rethink its decision to honor Hirsi Ali.

Last Tuesday, it did exactly that. I greatly wish that it hadn’t.

In examining this whole brouhaha, we probably need not expend much time on the question of rights. To wit: No one has the “right” to an honorary degree from Brandeis or any other great American university. An institution of higher learning has the full freedom to make such decisions however it deems fit.

What happened here, however, is that Brandeis specifically chose Hirsi Ali for the privilege of addressing its graduating class, only to then rescind the invitation when it became clear that too many members of the Brandeis community were afraid to hear what she might have had to say.

I say “afraid” because that’s what they were. They couldn’t handle facing an opinion about the world around them with which they do not agree—a sentiment that might force them to turn their brains on and exercise some critical thinking. And a university is no place for that.

A highly prudent question in this case is whether Brandeis was aware of Hirsi Ali’s more inflammatory statements when it first tapped her as an honoree. In an official statement, the school says it was not—that the administration regards her as “a compelling public figure and advocate for women’s rights,” but that it “cannot overlook certain of her past statements that are inconsistent with Brandeis University’s core values,” adding, “we regret that we were not aware of these statements earlier.”

One would like to be able to take Brandeis’s powers-that-be at their word—namely, to accept that Hirsi Ali’s stridently anti-Islam comments came as a surprise.  However, in order to swallow this, one would necessarily need to believe that no one on this degree-granting committee thought to peruse Wikipedia or Google or any other source of basic biographical information on a woman whom this university evidently valued enough to honor with such a distinction in the first place.

Does this sound plausible to you? If conferring an honorary degree truly is “akin to affirming the body of a recipient’s work,” as the New York Times put it, why did Brandeis perform such apparently shoddy research on this particular would-be recipient?

No, I think it’s perfectly reasonable to conclude the administration knew that it was making a gamble—an admirable one, in my view—and then when it realized it had a small mutiny on its hands, the school panicked and bowed to the will of the mob.

Is this what constitutes “Brandeis University’s core values” nowadays? I dearly hope not.

Mozilla’s Fired Fox

Let’s do a bit of supposin’, shall we?

Suppose, for instance, that a flourishing technology company hires some guy to be its CEO, and shortly thereafter it is revealed that this man once donated $1,000 to the Ku Klux Klan—a contribution he does not regret.  Following a public outcry from both within and without the company, the CEO finds the pressure too great for him to continue, and he resigns.

Nothing wrong with this, right?  In the world we now inhabit, to express white supremacist views—and financially support white supremacist groups—is perfectly legitimate grounds for the face of a large (or small) corporation to be effectively hounded from his post.

Yes, one has the right to say anything one wants and to spend one’s cash as one sees fit.  However, this does not prevent a company from concluding that such an official holding such views could yield catastrophic economic consequences (read: a massive exodus of customers) and thus, out of prudence, getting rid of this cretin as swiftly as possible.

In short:  Freedom of speech does not guarantee freedom from consequences of that speech.

Now suppose, however, that instead of having given $1,000 to the KKK, our hypothetical CEO had contributed to some anti-abortion group, such as National Right to Life or Pro-Life Action League.  Were this disclosure to lead to the same series of events described above—outrage, cries for dismissal, and actual dismissal—would it not be considered a scandal?

To express any opinion whatever about abortion might be destined to cause controversy, but since the American public is divided on the question, it would be absurd to contend that any particular opinion is effectively “out of bounds” in the national discourse.

As such, to dismiss or otherwise ostracize the head of a company that has nothing to do with abortion on the basis of his views on abortion smacks just the slightest bit of totalitarianism, does it not?  Do we really want to be a country that simply gets rid of people who say things that might make us uncomfortable?

I float these hypothetical scenarios in response to the non-hypothetical occurrence last week, in which a CEO named Brendan Eich was forced to resign from the Mozilla Corporation because of his $1,000 donation in 2008 in support of California’s Proposition 8.  You know, the one that outlawed gay marriage.

In this kerfuffle’s wake, the central question—duly hashed out across the blogosphere for the past week—is whether Eich’s opposition to same-sex marriage was, all by itself, a valid reason for him to be induced to abandon his position atop Mozilla, which makes the Firefox web browser.

In other words, is the view that gay marriage is a bad idea now among those thoughts that a person can no longer express without fear of losing his or her job?

I must confess that I am conflicted.

On one hand, I put tremendous stock in my position as a First Amendment absolutist.  I would prefer that everyone be able to say exactly what they think, and that everyone else allow them to do so.  As a supporter of gay marriage rights, I am rather horrified by the possibility that anyone with an opposing view would decline to state it for reasons of political correctness or outright fear of persecution.

Frankly, I think some people make too big a deal of what a particular CEO thinks when making decisions as consumers.  As blogger Andrew Sullivan so crisply observed, it is not a little ironic that the very people who have long demanded “tolerance” from their rhetorical foes are, themselves, now acting so very intolerantly toward those who refuse to toe the party line on the matter of gay rights.

There is just one thing preventing me from vocalizing the principle of open discourse at the absolute top of my lungs, and that is my unadulterated delight that this party line on gay rights—and specifically gay marriage—is now the majority view in the United States.

To be clear:  At this point in time, to say that gay people are not entitled to marriage is not as horrific as saying black people are not entitled to any and all rights accorded white people.  However, our culture is plainly, steadily and irreversibly moving in that direction, and I am pleased as punch that this is the case.  So far as I’m concerned, the day that anti-gay words and actions become utterly verboten in polite society is one that cannot possibly come soon enough.

But until that day dawns, let us resist the urge to impose it by force, like some politically correct mob.  It’s a highly unattractive means of getting one’s way, whatever the issue might be, and we Americans are supposed to be better than that.

Suppose, in the future, we make more of an effort to prove it?

One and Done

For whatever reason, April 2014 has swiftly become Living Ex-Presidents Awareness Month.  Whether by chance or design, the former occupants of the Oval Office are eating up newsprint everywhere you look.

You’ve got Jimmy Carter popping up on talk shows from coast to coast, promoting his new book, A Call to Action: Women, Religion, Violence and Power.

There was the gathering in College Station, Texas, over the weekend to observe the 25th anniversary of George H.W. Bush’s taking office, and to honor the legacy of Bush’s administration, on which there are several books in the works.

Over in nearby Dallas, there was a gallery opening at the George W. Bush Library, featuring oil portraits of foreign leaders by our most recent former commander-in-chief.

And Bill Clinton?  Well, since when has he ever taken a day off?

For my money, the most interesting of our living ex-presidents’ exploits at this moment concern Carter and the elder Bush—two men who, for all their political differences, share the dubious distinction of having lost their bids for re-election.  Carter was defeated by Ronald Reagan in 1980 amidst the Iran hostage crisis and a lousy economy.  Bush lost to Clinton in 1992 in a three-way contest that also featured H. Ross Perot, in a campaign centered, again, on a lousy economy.

As duly noted by most people, to be a one-term president is axiomatically to be a failure.  Whatever one might have accomplished in four years as America’s chief executive, if one fails to be re-elected—for whatever reason—then nothing else really matters.  Sure, forging a lasting peace in the Middle East is all well and good, but if you can’t then secure 51 percent of the vote here at home, what have you really brought to the table?

Accordingly, most of these electoral rejects spend a great deal of their post-presidential years in a kind of defensive crouch, having to underline their successes against a chorus that seems only interested in reciting their faults.

Of the ten highest-ranked presidents in U.S. history—based on the average of 17 scholarly polls dating back to 1948—the first nine were elected to a second term.  Today, let us attempt to draw some wisdom from the tenth, James K. Polk.

The nation’s 11th president, serving between 1845 and 1849, Polk is periodically cited as among America’s most underrated chief executives.  Probably his biggest “legacy” concerns his gift as a land-grabber:  In the quest for expanding the official borders of the United States, Polk essentially picked up where Thomas Jefferson left off.  Following the annexation of Texas, the war with Mexico, and the Oregon Treaty with Great Britain, the United States under Polk secured more than one million square miles of new territory—an expansion greater even than the Louisiana Purchase.

You may well ask:  With such a titanic accomplishment to his name, why did Polk not get elected to a second term?

Answer:  Because he didn’t run for a second term.  In fact, he never intended to.  At the hotly contested Democratic convention of 1844, Polk made it abundantly clear that, if nominated and if elected, he would be a one-term president.  Period, full stop.

Whatever the political calculus was at the time, Polk made good on this campaign promise.  According to legend, he outlined four specific policy goals upon taking office and accomplished all of them within his four-year tenure.  As such, he could then depart the White House in March of 1849 with his head held high, his mission having been accomplished.  Polk retired to private life, died three months later, and that was that.

I wonder:  If Jimmy Carter and/or George H.W. Bush had announced at the outset that they would not seek a second term, and if their presidencies had otherwise played out exactly as they did, would we view their tenures differently than we now do?  Do we not lay far too much emphasis on winning re-election as an indication of presidential fortitude, compared to what one actually accomplishes while one is in office?

In the future, might the country be better served if more candidates took Polk’s lead by pledging a single term with a short, but clear, list of goals?  Such an approach would surely take most of the guesswork out of assessing whether a particular leader is a success, and it would lower the impossibly high expectations branded upon even our most modest commanders-in-chief.

Most important of all, self-imposed term limits would concentrate the mind and workload of the president in question, freeing him to tackle a specific, narrow and realistic agenda, rather than attempting—inevitably in vain—to solve all problems at all times.

It sure seems like an experiment that would be worth trying.  It’s not like it hasn’t been done before.

Dead Idol Worship

Roger Ebert has now been dead for a year.  That means I have spent the past 365 days having to decide for myself whether a particular movie is good or bad, rather than simply logging on to his website to find out.

I exaggerate, but only slightly.

In point of fact, Ebert likely had a greater and more direct impact in shaping the course of my academic career than any other person I have never actually met.  Indeed, very few within my actual sphere of acquaintance could claim as large an influence over my decision to go to college to be a writer (if not a critic).

From the beginning of high school onward, I would visit the website of the Chicago Sun-Times every Friday, digesting every word Ebert wrote.  I read his reviews of all the week’s new releases—whether or not I intended to see them—as well as his “Great Movies” series and periodic essays on any subject he pleased.  On multiple occasions, I watched the DVDs of Citizen Kane and Casablanca with his audio commentary track in the background, which gave me as thorough an appreciation for those movies—and movies in general—as any written analysis ever could.

This is all to say that when Ebert died on April 4 of last year, it felt as if a small bit of me had died along with him.  I didn’t know him personally, but he might as well have been a member of my immediate family.  Through his writing, I came to think I knew his mind and his character as well as those of most people I interact with on a regular basis.

As such—and because I continued to read his stuff until the very end—his death left a proverbial hole in my daily routine that could not be filled by anyone else.  Over the past 12 months, I have mourned his passing more forcefully than those of folks I actually, properly knew, whose funerals I actually attended.

Presumably, most will understand what I mean, and have likely felt the same at some point about some personal hero or other.  Yet I nonetheless feel wrong in saying this, as it risks cheapening that very solemn act of grieving.

We Americans are renowned for our celebrity worship, and have been since time immemorial.  Through magazine tabloids, TV shows and the Internet, we follow the lives of famous strangers with an obsession that borders on creepy.  We talk about our favorite celebrities as if we are good friends with them, and when an especially prolific one shuffles off to the great beyond, the entire country has one big nervous breakdown.

We are all well aware of this behavior of ours, but we frown upon it at the same time, as perhaps we should.  Even when our idol worshipping doesn’t spin out of control, it is odd and slightly pathetic to live our lives vicariously through others in this manner.  Why devote so much love and attention to someone who will never give it back?  Someone who, indeed, does not even know you exist?

What I would suggest is for us to differentiate among various types of role models.  Or at least to give it the old college try.

I have said how much I admired Roger Ebert.  As well, I held near-equal esteem for Philip Seymour Hoffman, the film and theater actor who died of a drug overdose on Super Bowl Sunday.  In his relatively brief career, Hoffman managed to star or co-star in practically every movie worth seeing, and when he died I, like many others, felt as if I had been robbed of countless great performances yet to come.

Yet Hoffman’s death has not ultimately proved as significant to me as Ebert’s, because for all that the former’s work enriched my life, it did not come at the consistent, regularly scheduled intervals of the latter’s.  Yes, one could depend upon two or three excellent Hoffman performances in a year, but that loss is far easier to swallow than the loss of someone with whom one checked in every week, without fail.

In other words, if we are to allow ourselves to grieve for fallen stars with an intensity otherwise accorded close family and friends, let us do so in proper proportion.  All men and women are not created equal in our own eyes, and we should reserve our fullest attention and deepest gratitude for those who truly deserve it.


Batterer Up

Here in eastern Massachusetts, there lives a fellow named Jared Remy.  I have never met him, but from what I hear, he is precisely the sort of person from whom one should run as fast and as far as one possibly can, should one happen to cross his path.

Now 35 years old, Remy is a hulking blob of steroids and idle rage.  For the past decade and a half, his life has essentially been a series of restraining orders from ex-girlfriends and others, who have alleged every sort of physical and verbal abuse imaginable—and who, in many cases, have subsequently withdrawn such requests or otherwise declined to testify against Remy in court, either because of his promises to mellow and reform or out of fear of further torment.

One could hardly say such fears were unjustified.  As is made abundantly plain in an exhaustive recent article in the Boston Globe by reporter Eric Moskowitz, Jared Remy’s penchant for unleashing holy hell upon practically everyone he has ever met is matched only by his curious, near-superhuman ability to never be adequately punished for it.

For all the arrests and criminal charges he has accumulated over the years—the total number of such incidents is 20—Remy has been convicted on a mere two occasions and didn’t serve even a day in prison for either.  Rather, his history is one of wrist slaps:  Probation, counseling, curfews and the like, but nothing close to what his alleged behavior would seemingly deserve.

Now, it looks as though that will finally change.  Since August 15 of last year, Remy has resided at Middlesex County Jail in Cambridge, awaiting a trial that is to begin in the fall.  On that late summer day, you see, Remy got into a scuffle with Jennifer Martel, his then-girlfriend and the mother of their four-year-old daughter.  The fight might have been like every other such incident in Remy’s past, except that in this particular act of aggression against the current woman in his life, Remy had the bad luck of actually killing her.

(Remy has pled not guilty, calling the murder charge “ridiculous.”  However, his history and the physical evidence suggest otherwise.)

What makes this saga interesting—as most New Englanders well know—is that Jared Remy’s father happens to be one of the most beloved figures in all of Boston sports.  Jerry Remy played second base for the Red Sox from 1978 to 1984, and has broadcast the team’s games on the New England Sports Network since 1988.  As a television personality, he comes across as an utterly lovely guy—hardly the sort of man one would suspect of raising, and arguably enabling, a psychopathic killing machine.

The region now asks:  Is Jared Remy’s position as Jerry Remy’s son the reason he has evaded legal retribution all this time?  Would the average Joe with an identical record, but without Jerry Remy’s fame and high-priced attorneys, be afforded such an extraordinary string of second chances?

Did you need to wait for the end of that question before formulating an answer?

Justice is supposed to be blind.  Here is but the latest high-profile demonstration that it’s not.

Perhaps the most vexing question of all—legal considerations aside—is why all these women were willing to take Jared Remy back.  After all the pushing and shoving, the death threats, the property damage, the belittling comments about their weight, and the out-and-out assaults, what exactly was there left to admire?  Were Remy’s charms and powers of persuasion really that irresistible?  We know that love is irrational, but is it that irrational?

Then again, in broaching this subject, we might as well ask why any woman would stay in an abusive relationship for any reason.  Why, indeed?

Lacking a simple answer or an easy solution, we depend upon our justice system for protection against lunatics like Jared Remy—protection governed by a reliable protocol that, for instance, would not allow a man to roam free after violating restraining orders on multiple occasions and thereby proving he does not take the law seriously.

The Globe report notes that, in addition to the numerous allegations against Remy that were taken back, there were several other incidents in which the victim considered seeking a restraining order but ultimately didn’t, because he or she feared retaliation.

This is not right.  A person should never be afraid to seek protection from the authorities because he or she thinks, in effect, that the authorities will not do their job.  And, more to the point, the authorities should not foster such mistrust by failing to do that job in the first place.

In this case, it appears they have failed, and with murderous consequences.  Shame on them, and shame on us.


The Right to Hate

I have no evidence that the Westboro Baptist Church is secretly a pro-gay rights organization masquerading as a gang of religious extremists in order to make anti-gay groups look ridiculous.

However, if such a cheeky cabal were formed, I suspect it wouldn’t look a heck of a lot different.

For the past many years, the Westboro Baptist Church has served two essential purposes in American public life.  First, to be arguably the most universally detested organization in our 50 states united.  And second, to ensure, beyond all doubt, that the First Amendment to the U.S. Constitution is as healthy and muscular now as ever it has been.

To review:  The WBC are the folks who shuttle from place to place wielding signs with such heart-warming messages as “God Hates Fags,” “God Hates America” and “Thank God For Dead Soldiers.”  Most of its members are related, either by blood or marriage, to its founder and patriarch, Fred Phelps, who died on March 19, at age 84.

The group is perhaps most notorious for its practice of picketing the funerals of U.S. soldiers, who it claims were killed as a consequence of America’s tolerance for homosexuality, among other things.  This ritual led to a Supreme Court case, Snyder v. Phelps, in which the Court ruled in 2011 in favor of the church, holding that protesting a funeral is a form of free expression protected by the First Amendment.

While the death of Fred Phelps does not necessarily mark the demise of the Westboro Baptist Church itself, it may well hasten the group’s retreat from the public eye.  As such, we might entertain the notion of referring to the WBC in the past tense, if only for its cathartic effects.

On this subject, I have but one question:  On balance, has the Phelps family been good for America?

My answer:  Yes, but it’s complicated.

I say the WBC is the most hated organization in America—a fairly uncontroversial sentiment—but we might also say it has come by this distinction rather lazily, as far as generating mass hatred goes.

After all, what could be more of a “slam dunk” in the quest for amassing public scorn than to spit on the graves of fallen soldiers and to craft placards with the sort of radioactive language that leads even those who otherwise agree with you to recoil in disgust?

The WBC can be accused of being any number of things, but subtle is not one of them.

Quite to the contrary, they are cartoon characters—hysterical, childish, simplistic, ideologically absolutist to an extent previously not thought possible, and—surprise, surprise—completely convinced of their moral rightness on all fronts.

Indeed, the more time one spends reading the WBC’s various statements on matters of public import, the more one feels precious seconds of one’s life draining irretrievably away.

In other words, the WBC seems to incite the world’s rage and indignation for their own sake, as if it were all one big piece of performance art.  As such, the church can hardly be taken seriously in the first place.  As the saying goes:  Its antics are not worth dignifying with a response.

Yet we have done exactly that, be it through satire and counter-protests, or in the case of people like Albert Snyder, through lawsuits alleging the infliction of deep emotional distress.

And we cannot blame some folks for taking WBC at face value, since its views do not exactly come from nowhere.  In point of fact, the church’s basic beliefs about homosexuality are drawn directly from the Old Testament, and its musings that God kills Americans as punishment for homosexuality are an almost word-for-word plagiarism of Jerry Falwell’s infamous explanation for the attacks of September 11, 2001.

In any case, the church’s flagrant ridiculousness has proved exceedingly useful in reminding us that enforcement of the First Amendment can be a very nasty business, since the right to free expression must be extended even to those whose views no one else on planet Earth wishes to hear.

In this way, the Phelps family’s victory at the Supreme Court was a great relief, because it demonstrated that—at least in this case—our federal institutions still take the Bill of Rights seriously.  That our most sacred liberties apply even to those who probably don’t deserve them.  Yes, even organizations like the Westboro Baptist Church, which expresses nothing but scorn toward the very country in which these liberties are practiced.

For better and for worse, that is what America is all about.


Malaysian Mystery

It’s hard to ignore a story about a commercial airliner that went missing nearly three weeks ago and still has not conclusively been found.

But I’ve been trying my best to do exactly that.

It’s not that I’m incurious about how a large pile of metal carrying some 239 human beings could vanish into thin air.  With all the technology now at the world’s disposal, it seems downright impossible for a loaded airplane to disappear without a trace.

And yet I have opted to mostly tune out the ongoing saga of Malaysia Airlines Flight 370, keeping the coverage of its possible whereabouts at arm’s length.  The reason for this concerns the narrow but crucial difference between what we’d call a “search for answers” and a “search for meaning.”

To wit:  Since the Malaysia Airlines plane first lost radio contact with officials on the ground, it has become increasingly clear that we have no earthly idea what the hell happened.  America’s news networks have conjured one hypothetical scenario after another, but these seem to be based on a combination of shoddy information and pure speculation.  Actual, verifiable facts have come in fits and starts, leaving jittery would-be analysts to fill in the blanks.

Indeed, the media’s imagination in this story is commendable for its comprehensiveness:  It was terrorism!  No, it was suicide!  No, it was an asteroid!  No, it was a government conspiracy!  And so forth.

For most onlookers, the Malaysian jet disappearance serves as a good old-fashioned mystery, sparking the natural human tendency to gather clues and eventually arrive at a resolution.  In other words, a search for answers.  The who, the what, the where and the how.

What’s missing is the why—the element that can only be assessed once all the other questions have been answered, and the element without which everything else is worthless.  The why is what tells us what is so interesting about all these disparate facts in the first place, paving the way for a greater understanding about the world around us.  In other words, the search for meaning.

On the matter of the missing plane, I would very much be interested to find out whether Flight 370 had been taken captive by a hijacker, or instead had merely fallen victim to bad weather or faulty equipment.  I would be intrigued by the prospect of a government cover-up in the investigation of the plane’s trajectory and final resting place, and the ways the passengers’ family and friends have been left in the dark.

But here’s the thing:  Thus far, we have been presented limited, if any, evidence that any of the above is actually the case.  And we can’t interpret the facts until we know what the facts are.

For instance, if it turns out Flight 370 was hijacked by terrorists, we can have an international conversation about terrorism.

If it was a simple (or not-so-simple) mechanical malfunction, we can talk about mechanics.

If the Malaysian and/or Chinese governments have behaved improperly in the course of this ordeal, let us hold the culpable officials to account.

And if we never find out what really happened somewhere over the Indian Ocean on March 8, let us accept that not all mysteries can be solved and not every tragedy yields a take-home message.

In short:  Let us calm the heck down.

Admittedly, this is not an easy thing for us humans to do.  For mysteries big and small, we want answers and we want them now.  We want to know that everything happens for a reason, and that even the most horrific events can be redeemed, however slightly, by the dissemination of the Truth.  Nothing irritates us quite so much as a cold case, and the possibility that we may never find the answers we seek.

What makes this compulsion problematic is our national media, which prefers to report false information rather than no information at all, effectively jerking us around from one conspiracy theory to another for no good reason.

While America’s cable news networks may have jumped the shark permanently when it comes to hysterical coverage, we still have the option to ignore them.  To resist the urge to draw meaning where there is none, and to glean answers from sources who don’t even ask the right questions.
