The Wrong Rationale Down Under

Today in Slate, Chloe Angyal has a piece on the sexist media coverage of Australian Prime Minister Julia Gillard.

This argument is familiar, of course, because it bears some relation to recent charges of sexism in the media coverage of other female political figures, be they Hillary Clinton, Sarah Palin, or Elena Kagan. I understand that when confronting foreign political circumstances, American audiences typically demand some kind of governing metaphor, or at least some relevant metonymy, that makes the circumstances “over there” more closely analogous to those at home. It’s for this reason that the lede to this story attempts to universalize, and holds that “the Australian media have hit all the same sexist notes about Gillard that the American media played in their coverage of women in politics like Hillary Clinton, Elena Kagan, and Sarah Palin.”

But what evidence are we given in Angyal’s piece that this is in fact the case? Is her wardrobe subject to continual, banal, and bizarre speculation? Does she mysteriously lose credibility for looking haggard, overly “masculine,” or simply unattractive? Is the possibility of her undergoing breast enhancement given as a potential analogue for her inexperience? Is that very inexperience perhaps too the consequence of having sacrificed ambition in order to raise a family?

While one can quibble over the significance (or believability) of any one of these media narratives, surely they at least suggest a standard of feminine political character different from the one applied to these women’s male colleagues (although, likewise, if John Edwards looked like Joe Lieberman, I doubt he’d make it to the cover of the National Enquirer). The point is that each is a valid issue that, if not necessarily as inflammatory as it might first seem, at least creates a space for questioning dominant paradigms of appearance, personal conduct, and parenthood.

But what this article attempts is to build a baseless case upon the reverse paradigm: that Gillard is the subject of misogynist scorn because we’re already familiar with the treatment of Clinton, Kagan, Palin, et al. What evidence are we given that this is the case?

Foremost there’s the accusation that the relationship between political allies of opposing genders is sexualized because they’re painted as being “in bed together”… an expression which 0.33 seconds of Googling would reveal to apply equally to David Petraeus and the Karzai brothers, the entire federal government and the firm of Goldman Sachs, and Google, Inc. and Verizon Communications as personified by their CEOs Eric Schmidt and Ivan Seidenberg. Unless we’re meant always to think that the relationship between Petraeus and the Karzais is an entirely homoerotic overture, it’s fair enough to say that the expression ought to stand as a commonplace metaphor for entangling alliances (and is likewise a byproduct of how we categorize these alliances when they seem ill-fitting: “strange bedfellows,” a characterization apparently common enough to bear a distinct dictionary definition).

We’re likewise given the assertion that the media, blindly, can only characterize the dissolution of a partnership as “divorce”…and yet the article in question is actually about Kevin Rudd, not Gillard, and moreover is in reference to the end of his “honeymoon period,” another common expression which certainly forgives the extension of the metaphor.

Surely there’s a case to be made for the speculation regarding Gillard’s living arrangements, but here we’re not even given evidence that said arrangements remain an issue: the most recently dated article we’re given is Gillard’s own statement that “decisions in [her] personal life, [she]’ll make for personal reasons.” It’s difficult to suppose that a politician is being “dogged” by speculation when her curt avowal of personal privacy appears to be the final word on the matter.

Again then, rather than simply using recent precedents in American politics as a point of comparison, Angyal’s article exploits their very prominence in order to magnify an otherwise entirely trivial case. The reader of such an article is only meant to think, “Oh, yes, it was so terrible how Kagan was treated,” and to then transpose that instinctual horror onto the particular case in question, irrespective of its merits. I am positive that women in positions of authority are subject to closer, more varied, and more arbitrary scrutiny than many of their male peers, but to judge that this is so in every instance simply because it is true elsewhere does no one any favors.

Muckraking – Why Good Intentions Aren’t Good Enough

Following on the heels of my earlier post on the David Paterson affair, the New York Times breaks a new development today (from the same journalists as the original investigative report) which finds, by way of two anonymous sources, that Paterson did instruct two staff members to contact the woman accusing one of his aides of domestic violence. The article is laid out beautifully, except for this gem buried in its own paragraph midway through:

These accounts provide the first evidence that Mr. Paterson helped direct an effort to influence the accuser.

This is why I don’t feel compelled to recant the previous post even though Paterson’s relative guilt is, at this point, pretty much moot. By floating the previous three articles – the same articles that led, directly, to Paterson’s resignation – the Times unquestionably laid the groundwork for the two sources paraphrased here to come forward. Without those articles, the sources say nothing. But if this is, quote, the “first evidence” that Mr. Paterson ever directed anyone to do anything, on what grounds are the other articles justified?

I am not saying that domestic violence is not a crime. I am not saying that Paterson is innocent. This, however, is a procedural question, and procedural questions are endemic to how institutions like the press, and like the law, are ordered; it’s the kind of thing that, not coincidentally in a show called “Law & Order,” goes wrong during the police investigation and leaves Sam Waterston glaring. No one likes to see guilty people go free; and no one likes to see crooked politicians abusing their position of power to exploit the innocent, let alone the victimized. But to build reporting on mere supposition – as the new article admits, without direct evidence – is not journalism: it’s muckraking, of the foulest, most generic kind.

Maybe the Times is feeling unloved, given all the hype – even in the Times! – surrounding the National Enquirer’s attempts to earn Pulitzer approval for their coverage of the John Edwards scandal, but whereas the Enquirer carried the day by actually implementing conventional journalistic practice, the Times here is only aping the worst habits of the tabloid press.

What’s with the New York Times and Albany?; or, We Are Pathetic

I’d considered writing about this earlier, then figured that adding fuel to the speculative fire would only run contrary to my conviction. Lee Siegel sums up my own thoughts on the David Paterson affair today on the Daily Beast. Money quotes:

Conscientious and admirable as the Times’ reporting was, the paper’s investigation could not even determine one essential point: Whether the woman Johnson is accused of attacking was called by Paterson after the incident, or whether she called Paterson herself. In a case where wrongdoing turns on the possibility that Paterson tried to intimidate the woman into not pressing charges, who made the call is essential to know. But we don’t know.

Yet so intense is our need for an outrage-fix that we turned innuendo into the instrument of a massive high and drugged ourselves into certainty that Paterson had traduced his office in the most thrilling and intolerable way. And—voila! Paterson announces he’s terminating his campaign, and The New York Times has the scalp of a second consecutive New York governor hanging from its belt.

For weeks – it seems like years – now the Times has percolated anticipation as to what Paterson’s sin might be. Online outlets even claimed that the charges were so severe that Paterson had called the paper himself to strike some kind of a compromise (how this would ever work, once the mere existence of a transgression had become public knowledge, who knows). The paper managed all of this adroitly, first running a profile on Paterson’s driver-turned-confidant (my god, he doesn’t have a Master’s!) to yawns, and only gradually turning up the heat. Their restraint alone is admirable; it’s the kind of move that one would never expect to see in the 21st-century information marketplace, where anyone wanting to sell papers (correction: online advertising space) would do well to lay out as much dirt as quickly as possible.

But still, the entire exercise seems somewhat pointless. The Times hinted its way towards another gubernatorial resignation, but it wasn’t the quality of its information, or even the thoroughness of its reportage, that warranted any real mention. Rather, it was the fragility of the public psyche and the gullibility of the newshounds who followed the paper’s forty-seven-part (okay, three-part) installment. Such gamesmanship lowers journalistic standards rather than raising them, but still, because of Paterson’s resignation, the Times is empowered to call it a win. Maybe it is. But not for us.

Correction of the Moment

This from Charles Blow’s NYT op-ed about “Tyler Perry’s Crack Mothers,” which uses Mo’Nique’s likelihood of winning an Academy Award for her performance in “Precious” as a jumping-off point for discussing the demonstrably false stereotype of black crack mothers prevalent in Perry’s films, and in the culture more generally:

Correction: An earlier version of this column incorrectly described Mo’Nique’s character in the movie “Precious.” She was not a crack addict.

Now it’s common enough practice for op-ed writers to use a particular cultural moment as a springboard to broader sociological inquiries. I get that. But if Blow’s central premise is faulty, if he’s apparently never seen the film in question, why does the article exist in the first place? Why wouldn’t the Times just pull the online version entirely? Besides his analysis of “Precious,” Blow offers but one example of how the crack-mother trope is prevalent throughout Perry’s oeuvre:

In the last five years, he has featured a crack-addicted black mother who leaves her children in two of his films and on his very popular sitcom, “House of Payne.” (In one of the films, the character is referred to but never seen.)

I’m going to guess that the editor who made the correction didn’t likewise go back and correct Blow’s arithmetic, which would mean that Perry has featured a crack-addicted black mother who leaves her family in exactly one film in five years. In that time, Perry has directed, written, produced, and generally starred in ten separate feature films, counting “Precious.” On television in that time, Perry has produced just under 50 episodes of his other very popular sitcom, “Meet the Browns.” “Tyler Perry’s House of Payne” was recently renewed by TBS for an additional season, giving it an unprecedented 126-episode run within two years – the equivalent of a new episode every 2.896 days. In all that air time, Perry has apparently produced only two black crack mothers? Is there really nothing else to write about?

Tyler Perry is an outright scourge when it comes to the perpetuation of racial stereotypes; that his cultural products find a home with the very audience he is typecasting makes his message all the more powerful and, to most eyes, pernicious. But Blow’s argument uses Perry in all the wrong ways. He sets the director up as a straw man, and Perry’s the one left laughing when Blow swings and can’t quite knock him down.

In Defense of Buzz

Here again comes one of those topics so exhaustively masticated by the blogosphere that, at this point, linking to individual commentaries would require dissertation-like concentration. To instead summarize, the evolution of commentary on Google’s new (yet integrated) app, “Buzz,” looked something like this:

  1. Release Day: “Google has a new product. It’s pretty much like Twitter. Or Facebook. But with mail.”
  2. Release Day +4 hours: “Google’s new product is boring.”
  3. Day Two: “Google’s new product exposes nominally private information to the public? This is a travesty!”
  4. Day Two+: Outrage, outrage, outrage

To this allow me to add Step 5: “Google’s Buzz launch wasn’t flawed; Google’s intentions are.” As we’ve noted before, this kind of synecdoche – magnifying something minor into an allegory for the entire operation – is pretty much the last refuge of journalism, the penultimate refuge of tech journalism, and the first refuge of bestselling literature (actually, I’ve never said this last part before, but it can be inferred). In an effort to avoid hypocrisy, however, allow me to take umbrage at a specific aspect of this “Kontra” person’s formulation, which is notable primarily because it was picked up (read: published) by the All Things Digital wing of News Corp by way of the Wall Street Journal. To wit:

In its urgency to offer a me-too product, Buzz confuses the read/unread email paradigm with real-time messaging stream like Twitter. It adds insult to injury by co-mingling various cognitive spheres like blogs, photos, videos, status, etc into thin soup delivered through an unceasing firehose. The final blow is the embarrassingly unfocused layout: the complete absence of visual hierarchy and progressive disclosure, overabundance of visual cues/links for action, and clumsiness in using white space to strip away meaningful information density.

Comparing Google to Microsoft, as the article does, is fine. Insofar as we’re discussing corporate strategies, and corporate entities, it’s an entirely reasonable formulation, especially if you truly believe that their short and long term strategies are mutually ruinous, or at least, uninspired. That Microsoft’s entry into the smartphone marketplace is being lauded by the tech press should render part of that interpretation dubious, but whatever. That Apple is pretty much the only tech company not lambasted by this post, except indirectly (ie, “there are those” – that is, not the author – “who would call Apple ‘evil'”), should be even more suspect, but fine, I’ll get over it. I’m an adult. I read things on the internet. Disassembling bias is part and parcel of what I do.

But this is a rhetorical blunder, and one that is deeply, intuitively, and, dare I say, intentionally flawed. Even if Google’s design choices are staid, the company ought not to be decried for actively attempting to unite a variety of concepts in interesting ways. The stark, absolute demarcation between “cognitive spheres like blogs, photos, videos, status” is intellectually unambitious, for it assumes that each of these spheres needs, inherently, to remain separate, and it cloaks the baldness of the assumption with pseudo-intellectual vocabulary (cognitive spheres?) obscure enough to actually reward the faulty assumption. Such a mindset fails to acknowledge that so many of the forms of media life we now encounter are themselves hybridized representations of other, earlier forms. What are blogs? Etymologically, “web logs,” with all the Livejournal-esque solipsism that that entails, but the form has now come to accommodate a separate, distilled way of accessing news. Blogs can, in principle, win the Pulitzer Prize. Ross Douthat’s NYT blog is about a thousand times more compelling than his actual editorial content (and I mean that in the nicest way possible). Would it make sense for me to throw up arbitrary divisions between my “cognitive sphere” of newsreading and that of blog perusal? Of course not. So why should the same standard be applied to Google?

Personally I find this sort of conceptual amalgamation compelling, since I have no need for Twitter, and yet a solid portion of my day is spent either reading news articles or formulating correspondence. At least within my circle of acquaintances, I can tell that the level of discourse via Buzz is different than it is over my Facebook feed (or in my brief flirtation with Twitter). For whatever reason, people seem to respond to the material in a different way, or are more willing to parse what other people are saying at length when they aren’t corralled into a narrow character limit, or aren’t cognitively (a ha!) coached that their comment is just about as relevant as the fact that someone from high school became a fan of Dos Equis. What I’m interested in is the level of discourse, and how it sustains itself. This would inevitably prove more revolutionary – more innovative – than some fresher appeal to my real-time, micro-vlog paradigm, or whatever other need to communicate runs so deep that I didn’t even know that I had it yet. But my enthusiasm is only partial; it’s qualified by the fact that time, and the public, need to enact my private convictions before they can be rendered true. That’s the sort of intellectual openness that the notion I’m fighting back against can’t admit.

You Lost Me at “Hello”

It’s an unfortunate side effect of overly coy titling that critics of the show “Lost” can frequently point to its mere name as at once cause and symptom of the utter bafflement that fans of the series so willfully find themselves in, week after week. The water-cooler talk that abounds after any given episode, to these eyes and ears, is just a way to forever parade viewers in front of further and further commercials, and amounts, in a series with unlimited questions and few proper answers, to a kind of narrative masochism.

Last night’s premiere, the first episode of the final season, promised to be the beginning of the end – where all our questions will be answered, claimed the trailers in paraphrase – and yet it seems as though even the show’s supporters have found themselves more confused than ever, or, put less delicately, outright pissed. I’ll confess that I too am among these ranks, and while it has become common practice at times like these (cf. “The Sopranos,” “Battlestar Galactica,” “Life on Mars”) to declaim against the writers and producers of a program for “betraying the viewer’s trust,” my irritation, my sheer wonderment, doesn’t fall along these lines. Where the producers – and I’ll continue to use the general form for now, for if nothing else the show is a produced, collaborative effort, regardless of who wields creative celebrity – have lost me is not in their violation of my trust, of their characters, or even their own story, but in their betrayal of fundamental principles of narrative sanity.

It wasn’t always this way – or actually, it kind of was. The first several seasons of the show, while lauded for their originality, demonstrated an unusual capacity to irk people by refusing to directly answer any of the show’s escalating mysteries.

Why is there a polar bear on the island?
Because that kid summoned it from a comic book.

How can he do that?
Don’t pay any attention to that – look, those lottery numbers are also on a buried hatch!

(The Wizard of Oz-like effect here should be obvious.)

Yet in announcing, midway through the third season, that the series had a definitive end – the sixth season, 2010, now – and in changing its very organizing principle from a series of flashbacks to a series of flashforwards, the producers demonstrated a commitment to closure. Anyone, given enough characters, can suture together people’s backstories in unlimited, compelling ways, but the mere fact that encounters have taken place outside of the frame narrative offers little commentary on why these particular events are uniquely compelling, why these characters are now on an island (with a capital “I”) in the middle of the Pacific. One can see why so many were tempted by the now-debunked theory that the Island was an allegory for the Afterlife. Transitioning to flashforwards is a different matter altogether, however, because it forces the narrative of the present towards some specified point. The events in the future now need to be a direct consequence of what is occurring in the present, rather than just an additional lacquer of explanation: basically, from season three till the very end, the producers need to have not only some clue as to what the fuck they’re doing, but a fine-tuned sense of the narrative’s shape.

Thus the best episodes are those that flaunt this sense of precise consequentiality, as in the episode where Desmond, like Vonnegut’s Billy Pilgrim, becomes unstuck in time, or when the simultaneous rendition of Sun’s delivery and Jin’s scramble to buy gifts for the baby is revealed to be at once a flashforward and a flashback, a trick of clever editing to exaggerate the reveal that Jin really is dead, or at least thought dead, in the not-too-distant future (which, come to think of it, yet further becomes a kind of metacommentary on just how inconsequential the flashback sequences really are, since it’s Jin’s backstory that has no real purchase on the events of the story itself).

So where did last night’s premiere fall flat? In offering partial solutions to some questions – yes, Juliet is dead, and not in the “Sayid is dead” or the “Jin is dead” sense, but in the “Miles is able to commune with her from beyond the grave” sense – while yet further obfuscating others. Rather than establishing some preliminary traction to the Jacob-Adversary feud, or at least refining how this pair affects the continuing (well, now aborted) struggle between the Dharma Initiative and the Others, this episode of “Lost” presents us with new Others that we’ve never seen before, Others who, despite their inclusion of the stewardess from the original flight, and the zoo-like witness to Jack’s captivity, seem so fundamentally divorced from the figures we already know – Richard, Ben, or even Jacob – that they might as well be a new species entirely. Despite its too-obvious betrayal of the reporter’s politics, the WSJ’s live-blog of the event actually settles on perhaps the easiest label for this new batch of cronies: one part “hippy,” one part Golden Triangle “drug dealer.” My bet is that by dressing this newest batch of Others as though they’d just left Burning Man (and having them, like Jacob, walk around the most hazardous place on Earth barefoot), the producers hope to remystify what has already been domesticated: no sooner had Tom, the mouthpiece of the first batch of Others who kidnapped Walt, removed his fake beard, than we were whisked away, literally, to the suburbs (a place that Lindelof and Cuse – dammit, there, I did it – uncutely refer to as “New Otherton”).

Formally, I’m actually okay with the introduction of competing factions or subfactions, so long as the endgame is achieved intelligibly; yet by expanding the overall cast of characters in the very moment they’ve “promised” to contract the storyline, the “Lost” team plays right into the hands of the show’s critics while denying fans any semblance of what they actually want. As some of today’s day-after response seems to indicate, many people, even people who do this kind of thing for a living, continue to think about the show in character-based platitudes, and that’s fine, but when a show reaches a point where major characters can be killed off twice in the same episode, and yet its audience barely has time to register their indifference because they’re too wrapped up in why the hell John Lennon is living with someone from Mortal Kombat, it’s fair to say that the show’s ability to create compelling fictions has been effectively anesthetized. I don’t feel betrayed: Lindelof and Cuse owe me nothing. But it damages the integrity of my cultural memory to have three seasons of twenty-four episodes, three seasons of sixteen episodes, one hundred and twenty hours of my life rendered laughable thanks to the excesses of over-elaborate puppeteering. Soon I’ll begin to feel like all those poor people who watched “7th Heaven.”

My broader misgiving has to do with the introduction of this season’s new narrative mechanism, neither flashback nor flashforward but “flashsideways.” Now the whoosh of time shifts past is gone, and we’re led to believe that detonating a hydrogen bomb atop an electromagnetic anomaly leads to the establishment of two parallel universes. From an “internal logic” standpoint, this is fine: it’s no weirder than smoke monsters, or communion with the dead. But at a narrative level, it’s troubling, if only because it allows for no subordination, no privileging, of events in one timeline against the other. In fact, the two narrative threads are no longer even looking at one another. Whereas the universe where Flight 815 lands safely at LAX must now, perforce, look obsessively at the show’s past, to deal with the million and one things that might now be different – no Shannon on the plane! and there’s Desmond! – the more conventional “on Island” universe must now blaze on ever forward, solving the mysteries of Jacob and hopefully, hopefully, hopefully finding some way to bring both timelines into communion. But precisely because the “safe in LA” timeline is just reexamining discrepancies in things I already know (“Hey, Bernard didn’t make it back from the bathroom before”), rather than things I don’t, it amounts largely to intellectual masturbation, and it’s difficult to imagine why I should even care.

ABC, along with the producers of “Lost,” has been quick to tell us what they’ve told us at every season premiere: that new viewers can tune in at Episode 601 and never miss a beat, and to that end, the “safe in LA” universe makes perfect commercial sense. But in order for anyone, especially stranded newcomers, to feel compelled by this part of the storyline, each segment of action needs to remain individually compelling, and for the life of me I don’t know how seeing a guy in a wheelchair console a guy in an airport accomplishes that. What the producers seem to have lost sight of, or never known to begin with, is that by pairing one contiguous chunk of plot (“people on island run amok”) with episodes each packaged around one narrow conceit (“this episode focuses on the memories of this character”), “Lost” always had this kind of “newcomer appeal” to begin with. “Lost,” Seasons One through Five, belongs to the grand tradition of exquisitely overarching master narratives subdivided into more readily digestible chunks, stretching back through “Buffy’s” “monster of the week” motif and into the serialization of Victorian novels, with their uncanny ability, in final, collected form, to explicitly remind us of who characters are at unnecessary moments – moments that would have been necessary to newer subscribers. What we’ve seen of the new season seems to take that format too much to heart, opening itself up to even the greenest gaze, yet it neglects one other, narrower aspect of a serialized plot: the scene transitions between commercials. And if I reach a point where I’m better off muting the screen every time I don’t see sand and palm trees, it’s difficult to imagine why I should bother turning it on at all.

Post-Mortem Post

Is literary fiction dying thanks to the preponderance of MFA programs, as Ted Genoways, editor of the Virginia Quarterly Review, holds? Only if you believe in deaths of the Black-Knight-in-Monty-Python variety, writes Roxane Gay.

Considering that Genoways’ original point seems to be that literary fiction is too successful, I’m going to have to hand the point to Gay.
