Poor Players Taking Flight: The Sad, Strange Artistry of Iñárritu’s Birdman

Birdman or (The Unexpected Virtue of Ignorance)
Alejandro González Iñárritu
New Regency

Meditations on failure–especially failure as it relates to the pursuit of greatness–call to me like the Bat-Signal to Bruce Wayne. JCVD, The Wrestler, Crazy Heart–heck, even the recent Netflix series, BoJack Horseman–all have at their core a gratifying honesty about our capacity to grieve and regret personal failings without ever really surmounting them.

Birdman, though, is all this and more. Certainly, promotional material just gives us the story of Riggan (Michael Keaton), a washed-up actor trying to emerge from the long shadow of his blockbuster career as the superhero “Birdman” by writing, directing, and starring in an adaptation of Raymond Carver’s What We Talk About When We Talk About Love. However, in practice Birdman reflects on the failings of a number of characters–including us, the audience not just of this film but of film and theatre in general.

This indictment of the audience is deftly achieved with camerawork that wends us through the labyrinthine underbelly of Riggan’s Broadway theatre–bystanders in one character arc; seemingly active participants in the opening of the next–but it remains a risky thematic move on director Alejandro González Iñárritu’s part. Then again, a film about the craft and consequences of performance is necessarily risky, and by taking more such risks, Birdman avoids becoming too self-conscious on this account, escaping instead into the company of experimental films like Synecdoche, New York (2008), Holy Motors (2012), and even Brazil (1985).

Moreover, that Terry-Gilliam-esque feel is not limited to a dizzying claustrophobia or the shadowy, peripheral emergence of short people and reindeer (the latter bizarrely part of the Carver stage adaptation); rather, from the outset we’re also given to believe that Riggan might genuinely have superpowers. To this end, the voice of Birdman dogs him relentlessly, undercutting any joy and artistic reconstitution he had hoped to find in this theatre project, and goading him into actions both heavily foreshadowed and pointedly ambiguous. Indeed, a good question to ask of this film, as with Brazil, is where on earth the realism ends. (I count at least three places in Birdman where surrealism might have conquered all.)

Again, though, the film as a whole never rests on just one target. For the first half, this means heaving and lurching between Riggan’s inner turmoil and the opposing disingenuousness of Broadway legend Mike Shiner (Edward Norton), but as the film progresses, it also takes swipes at the movie industry and the cowardice of critics, on whose word careers are made or broken. At times the film also seems to recapitulate some of Carver’s short stories, playing out truly jarring side-material between Mike and girlfriend/co-cast-member Lesley (Naomi Watts), then Lesley and co-cast-member Clara (Natalie Gold), but these aesthetics soon give way to the usual tricks of big blockbusters and magical realism (Borges’ Labyrinths even gets a cameo in a sun-tanning bed). The soundtrack similarly swings from the kind of pit band one might expect with live theatre to more conventional orchestral scores to arias marking a weakening grasp on reality in Riggan’s inner world.

As the film’s own performances go, one character consciously exemplifies bad acting, and in so doing establishes a theme of “going to extremes” that the rest of the film rigorously builds upon. Against this contrast, Keaton and Norton have an easy time illustrating good acting, while Zach Galifianakis, in a marginal role as producer/lawyer/inexplicably-sidelined-best-friend, shows a capacity for something other than deadpan and slapstick comedy. But the standout performance is clearly Emma Stone as Riggan’s daughter, Sam: Somehow her precariously large, watery eyes in no way soften the wallop of a scathing attack on her father’s sense of worth–a tirade she makes up for just a couple beats later, with one perfectly understated word.

Not surprisingly, Shakespeare also finds a way into this production about productions, with someone on the streets of Broadway belting Macbeth’s “sound and fury” number with all the agony of Ginsberg’s Howl; and when we hear that former text meted out piecemeal at the height of our protagonist’s despair, the implication that all this has happened before and will happen again becomes inescapable. Rarely have I so longed for, and so dreaded, a return to a film I’ve just viewed, but Iñárritu’s Birdman is a wild, slippery, and deeply unsettling lament–not just for the folly of individual aspirations to greatness, but also, more fundamentally, for our failure to know what it is we ever really talk about, when we talk about greatness at all.

“A Progress Accompanied By Constant Violence”: Gibson’s Grey New World

The Peripheral
William Gibson
Putnam

I was seventeen when I read Pattern Recognition, a solid run of classic scifi already behind me. Cayce Pollard, her world, showed me what science fiction could really do. What it was meant for, maybe. William Gibson, father of cyberpunk, didn’t need to create much in the way of new technology for this post-9/11 effort; he just showed us how the present world, with all its current conceits and power dynamics and technologies, could already be exploited in other, deeply significant ways. Science fiction of the present, in other words–but then, all science fiction is really about the present. It’s just the rare writer who can present this fact so blatantly, and so well.

I felt that same, breathless excitement when I started The Peripheral–Gibson’s prose here every bit as precise as that of Pattern Recognition. Since the story is also dense with detail and narrated through the limited omniscience of two protagonists operating in different temporal regions, this precision could easily have been the book’s downfall. (I, certainly, was afraid that if I put it down I’d have trouble picking it up again.) But Gibson seems wise to this danger: Granted, his sentences run short, and information is parcelled out in equally clipped bites, but he keeps his chapters brief as well–most written in a punchy style that holds our two protagonists close in the reader’s mind, even if their worlds are not.

Stylistically, then, The Peripheral is pure mystery-thriller, and much of the content leans this way, too: Flynne Fisher is stuck in a near-future American wasteland of passed-over war veterans, impossible healthcare costs, illicit drug manufacture as the only stable job source, and big businesses so big that their financial attributes seem to bleed into other dimensions. Meanwhile, Wilf Netherton, in a far-flung, post-global-fallout London, is a carelessly amorous publicist in a world rich in scientific advances (if low on people), where messing with alternate timelines is a luxury hobby with huge benefits for offshoring crime. When Flynne beta-tests what she thinks is a video game and sees something she’s not supposed to, she becomes the target of a future/alternate-continuum hit–and a critical part of Wilf’s own attempts, in conjunction with business associates, to track down a murderer.

However, all the mechanics of a mystery-thriller are subordinated in this text to richly speculative slice-of-life details: details that become especially uncanny because of their resonance with much in our present-day world. Although this book does offer the excitement of murder, attempted murder, kidnapping, and a final showdown, the long lead-up and surrounding implications of modern technology preoccupy far too much of the text for The Peripheral to be read as murder-mystery alone.

To this end, the first third of the book follows the dawning realization that these timeframes we’re alternating between, chapter-by-chapter, are aligned with different realities. This realization is interwoven with observations (teased out of everyday conversations around Flynne’s home) about all manner of current affairs, including drone warfare, combat-oriented video games, prosthetics technology, mass-produced lab-grown meat, weaponry upgrades, the various uses and abuses of 3D-printing, shifting global economies, the questionable priorities of national and local security, and emerging medical conditions.

But the real crux of the novel, and the source of its title, comes from a technology in that far-flung, “post-jackpot” future: “the peripheral”, an anthropomorphic body that runs off its own, limited AI until inhabited and operated by a human being through direct mental link-up. Yet even this far-future technology is not so great a leap from current technologies–like lab testing that has allowed one human being to control the movements of another through thought alone, or the ever-expanding possibilities of gaming in general. In classic Gibson style, then, while action and intrigue unfold throughout the text, the peripheral, as a concept, is explored at equally great length and detail, especially as Wilf prepares for Flynne to become present in his timeline and moment through this very technology.

Even the ending of Gibson’s text is far more a comment on the current state of our financial system, with all its alarming technological self-sufficiencies, than any neat-and-tidy wrap-up of a murder mystery. What matters is less that the case is solved (almost a given for the crime genre) than the means by which lives are rebuilt in its aftermath. Is any of it sustainable? Are the beds these characters lie in worth a damn in the long run?

Earlier this year, I read Thomas Pynchon’s Bleeding Edge, which–despite also addressing the fundamental uncertainty of technology and what technology brings out in us (as positioned around the bottoming out of cultural meaning wrought by 9/11)–left me underwhelmed. Likewise, David Cronenberg’s Consumed negotiates our physical relationships with modern technology in a number of haunting ways, but I was not satisfied with its attempt to tie this strangeness to broader, world-order narratives.

I couldn’t quite put my finger on the reasons for my disappointment with both these volumes, though, until finishing Gibson’s latest. And yes, this ending is deflating, too: The seamless integration of a few characters into systems we know full well are destructive offers little hope of more meaningful change. But everything about the way Gibson’s shown us this world–the class divide presented from the outset; the meticulous object fetishism engaged with throughout; the high priority placed on sparing, but ultimately human dialogue–invites the reader to reflect more fully on the characters for whom change is still possible.

Without ever saying so openly, then, or even providing a coherent alternative to certain collision courses our technology might place us on, The Peripheral is without a doubt a call to arms. There is an old adage: Wherever you go, there you are. In Gibson’s world, this is no less true, it seems, no matter how extraordinary and far-off our destinations. So who are we, anyway, and what might we become when we arrive?

Classy as All Get-Out

John Wick
David Leitch & Chad Stahelski
87Eleven (and others)

Calling a film “slick” has terrible connotations, suggesting not just an exceptional level of cinematic polish but also a structural gloss that offers the viewer no emotional way into the story. I’ll say instead, then, that the action-thriller John Wick is smooth–smooth like those two fingers of Johnnie Walker Black you’re taking neat because you’re a decent human being; smooth like the feel of a high-thread-count silk shirt on bare skin; smooth like the leisurely stride of a body with nothing to prove across a nightclub floor. Even Keanu Reeves’ weakness with dialogue in no way detracts from the sleek, stylish underworld his character (our protagonist) must re-enter in order to seek revenge; the whole movie thrums with class.

Amid all this aesthetic appeal, of course, is a plot: John Wick is grieving for his wife when her parting gift arrives, but he only gets a day with this new puppy and lifeline before the punk-ass son of a Russian mobster decides to steal John’s car and murder John’s last living connection to his wife in the process. However, Iosef (Alfie Allen) is soon set straight by his father, Viggo (Michael Nyqvist), about just how big a mistake that carjacking was: John, we discover, was called the Boogeyman when he worked for Viggo because he was the guy you sent to kill the Boogeyman. Yikes. Viggo thus tries to pre-empt the vengeance plot he knows John will be planning, and the race is quickly on between one man with nothing to lose and… well, the entire Russian mob, it seems, in an effort to protect one loser son.

Indeed, Iosef is visually out-of-step in this film–a diminutive character who talks a big talk from the safety of his entourage, does what he wants, and can only scamper, rodent-like, when a crisis emerges. It is likely no surprise that the only sexploitation we see in this film (scantily-clad waitresses working pool-level at the nightclub where he and his homies are idly drinking while John is on the way) relates to his utter disregard for the consequences of his actions; he and his ilk are clearly cast as the entitled me-generation ruining an otherwise great thing the real menfolk had going.

All around him, to that end, are the endearing constructs of an “old guard”: immaculate, elaborate homes built on the wealth of mob business; a private currency used to access secret services and clubs; a hotel with strict rules about mob business being off-limits therein; and a host of characters who respond to all manner of extremely violent episodes with the sort of neighbourly conversation one might expect when taking out the trash. In conjunction with the playful, comic-book construction of subtitles for all Russian dialogue, these wry, wink-and-a-nudge asides give the whole universe of this film a touch of nostalgia for the good ol’ days.

And the people who operate in this old world are, of course, professionals: John Wick makes every shot, despite the training of his would-be assassins, but the film is also replete with dynamic action sequences that necessitate reload after reload while fighting at close quarters, and it negotiates the problems that reality incurs with similar aesthetic grace. John is equally skilled in gunfighting, knife-fighting, jiu-jitsu, and (shall we say?) vehicular manslaughter with extreme prejudice–all of which, in conjunction with a tight soundtrack, keep the pace lively and the fight scenes diverse.

This is not to say there aren’t moments of dubious deus ex machina, especially when John gets into scrapes needed to further the plot or set up a change in location, but on the whole, the film stays on point: no silly side-plots with new flings (no female assassins used as sex objects at all, even!); conversation at a strict minimum; and the fight scenes acrobatic, intense, and brutal without ever dipping too far into absurdity. As action thrillers go, John Wick is thus a powerful reminder that a) masculinity need not be defined in relentless contrast to the female form; b) attention to detail supplants the need for a lot in the way of special effects; and c) goddamn, when are theatres going to allow me to drink scotch openly while watching movies that could’ve been lifted from the pages of GQ or Forbes magazine?

There is simply no justice in the world.

Restorative Justice in a World of Competing Truths

There’s a photo meme that pops up on social media every now and again, claiming the existence of “a tribe in Africa” with a different approach to crime. When one among their number does something wrong, the meme claims, he’s taken to the centre of the village, where the entire tribe surrounds him. For the next two days, they tell him all the good things he’s ever done, because they see all human beings as innately good, and any bad act as a cry for help. Their actions, the meme claims, help the perpetrator reconnect with his fundamental goodness. It’s a fuzzy, feel-good concept slapped atop photos from any number of African tribes; however, not only does the tribe with such a practice seem not to exist, and not only is the whole meme blatantly fetishistic of “primitive” cultures in general, but the approach to restorative justice itself falls apart upon close inspection.

A group of young Maasai men in a “jumping dance”; this picture from Wikipedia shows up a heck of a lot alongside the feel-good story of “a tribe in Africa”, as if they’re all interchangeable, and as if justice is meted out so simply throughout the continent.

For example, what if one individual did violence to another member of this community? Sexual violence, perhaps? The meme says that the “entire tribe” comes together to restore the individual to his sense of natural goodness–but what if the victim does not wish to take part? Are they obligated anyway? If the community insists, then the perpetrator’s needs are valued higher than the victim’s; if the community does not, then the individual will not be fully restored to a state of goodness, and their past trespass will continue to mark interactions with at least one member of the community.

Justice, in other words, is never simple. Life–that great, relentless aggregation of experience, positive and negative–never permits it to be so. We trespass against others and others trespass against us, and more often than I suspect any of us should like, these trespasses are never corrected: not really. We cut ties when doing more (pressing charges, for instance) seems too dangerous; we part ways when all parties seem to have different, irreconcilable understandings of the trespass itself; we forget; we die.

Yet somewhere amid such individual failings, we as a society still have to aspire to something more–at least, if we want to live with any reasonable expectation of collective safety. But by no means is this easy: not when we essentially have two judicial systems (the legal courts and the court of public opinion) running in parallel, and often with utterly competing motives. A mother, for instance, might well decry the state of juvenile detention as too cruel, and too likely to limit an already vulnerable individual’s options for reintegration into society; the same mother might also be furious to learn that a juvenile offender with a history of sexual violence was released into the community after serving his age-reduced sentence.

It’s complicated, in other words. Is justice meant to be a punitive, preventative, or restorative endeavour? And if all three, in what order, and where do we draw the line? When do we say that an individual has lost their right a) to return to general society in any capacity, ever, and b) to return to general society after serving their court-determined sentence without being further punished by the court of public opinion? More to the point, what is the appropriate role for public opinion (majoritarian, knee-jerk, and easily influenced by the latest breaking news) in all this talk of justice anyway?

These are never easy questions to answer, as evidenced every single time a potentially criminal situation plays out primarily in the public sphere. Right now, for instance, Canada is dealing with a high-profile case of he-said/multiple-shes-said, one which happens to exemplify a great many irreconcilable gaps in the current means by which we seek justice, restorative or otherwise, and evaluate individual truth claims therein.

The trouble this time started soon after former CBC radio host Jian Ghomeshi was stripped of his popular show, Q, and all further association with the CBC after refusing to resign. Ghomeshi hired a PR firm to issue a statement on his behalf, claiming that he was fired for his interest in non-vanilla sexual practices, a situation he says was exacerbated by an ex-lover’s plot to destroy his reputation in collusion with an investigative reporter. The statement was also made in anticipation of a lawsuit on his part, seeking $50 million in damages from the CBC.

While the CBC has not issued a detailed statement on the reasons for its decision to fire Ghomeshi, The Toronto Star filled in some of the blanks last night by publishing an article on the claims of non-consensual sexual violence (and assault in general) that one of its investigative reporters gathered from three women who were initially interested in intimate encounters with Ghomeshi. The article also describes a report of workplace sexual harassment that another woman took action on through her union, only to be asked what she could do to make the workplace less toxic.

The short-term aftermath, from a social media perspective, has been predictable, but no less fascinating. Many people, having read Ghomeshi’s statement first, were horrified by what seemed to be CBC’s prudishness and invasive attitude towards what happens between two consenting adults in the bedroom. Then the assault claims emerged, and some of those same proponents were mightily displeased with the possibility that the rhetoric of BDSM (a community with very rigorous discourse around consent) was being used as a shield for rape. Noted sex columnist Dan Savage summed up this perspective in the following tweet:

[Embedded tweet from Dan Savage]

(Andrea Zanin also offers a more nuanced breakdown of the differences between Ghomeshi’s claims and those of the anonymous women, as both pertain to BDSM culture.)

For myself, I should also note that when I saw Ghomeshi’s response to this situation, I couldn’t help thinking how differently I would react if I were innocent of the acts he’s been accused of. I can’t help thinking that if I were in a prominent position and needed to release a pre-emptive statement, I would write about how seriously I take the forthcoming charges of sexual assault and harassment, and how devastating it is to be on the receiving end of these accusations, both for myself and for people who have trusted and believed in me for so long. I would ask for patience as the facts around this situation emerge, and tell people who wished to support me in this difficult time to do so by supporting agencies in their community that help to stop the cycle of violence. I’d then close by listing shelters and other aid organizations committed to such ends. But I’m not Ghomeshi, and there is a clear limit to how much I have any right to judge his culpability based on how far his actions deviate from what mine, if so accused but innocent, would be.

Meanwhile, many who have enjoyed Ghomeshi’s work with the CBC over the years, or perhaps adored him in his Canadian-indie-band incarnation, Moxy Früvous, have been openly struggling with the hope that these brutal allegations are nothing more than a smear campaign. I have seen people openly ask questions to the effect of “How can someone so well-liked by so many, so well-read, and so seemingly intelligent have committed such acts of sexual violence?” Indeed, as one friend noted, even The Toronto Star brings class into the equation when assessing the validity of these allegations. The article makes a point of saying that the women claiming Ghomeshi struck, choked, or otherwise abused them are all “educated and employed”, which suggests that the reporter understands the significance of a victim’s “image” when going public with such claims against a high-profile figure.

This significance is underscored by another set of critical responses to the situation as it unfolds: people for whom the lack of criminal charges is a critical point in Ghomeshi’s favour. These are the folks who asked first: Why didn’t the women go to the police, if their attacks really happened? Putting aside all the cultural realities such a question avoids (e.g. the low reporting rate for sexual violence in general, the equally low conviction rate for all one’s trouble if a person does report, and the particularly severe threat of reprisal for “going after” a celebrity), this line of reasoning also implies that the legal court is the only court in which criminal matters should be resolved, such that a verdict therein should be accepted as the final word either way. Has Ghomeshi ever been tried and convicted of sex crimes? No? Well, the argument goes, case closed; everything else is “just” hearsay.

Obviously, this worldview does not accord with reality: Not only do we have criminal courts, which are as fallible as their human components, but also civil courts, where cases that might have failed the criminal test can still be tried in pursuit of monetary reparations; and of course, even if both fail, the court of public opinion, where careers are lost regardless. However, it’s easy to see why so many people wish to deny the complex interplay between these three: Justice would be a much simpler affair if this reality weren’t so.

Moreover, there is clear danger (within our current system) in abandoning the criminal courts as our primary means of seeking justice for criminal activity, for if we do, we are left pitting individual truth against truth in the no-holds-barred court of public opinion. And to what result? With criminal courts, at least, comes the possibility of conclusion: The trial ends, the sentencing ends, and eventually the sentence ends, too. But where is our corollary in the public sphere? When, if ever, do its sentences come to a close?

I have no idea how the Ghomeshi situation will play out; the CBC lawsuit will surely continue apace, and the allegations against him will likely develop as time goes on, but for all I know, they might do so in a way that allows Ghomeshi (who remains quite popular with many Canadians) to land on his feet, career-wise. In the meantime, though, this situation just illuminates the extent to which we’re living with a system of widespread non-confidence in institutional justice, and a public sphere in which criminality is negotiated to even more distorted and irreconcilable ends.

Is there a way out of this quagmire? If we want to move past this state to something more productive, we might do well to ask ourselves what punitive, preventative, and restorative justice could look like in the public, non-legal sphere. Granted, this is difficult to achieve because of our culture’s predilection for lawsuits, but if there were a better set of standards by which the court of public opinion worked to process, penalize, then ultimately seek to reintegrate those who have done harm, is it not possible that the current (and very legitimate) fear of lifelong character assassination would cease to play such an integral role in how we weigh competing claims to truth?

This is not so abstract a question, either: Unlike the feel-good meme I shared at the outset, we do have real-world examples (many of them from real places in Africa, to boot) of how oratory in the public sphere can work to restore social order on both an individual and communal level after even the most devastating acts of human violence. Truth and reconciliation commissions are broadly tasked with helping countries acknowledge and begin the arduous work of moving past heinous crimes by allowing individuals to tell their stories, while restorative-justice-styled mediation (modelled after the workings of certain indigenous communities in Canada) allows all involved parties to negotiate their own, situation-specific resolutions in a less combative environment.

So–there are glimmers of hope, even if it’s not immediately clear how to apply them in situations like Ghomeshi’s case above. Is the hive-mind of the internet even capable of similar feats of restorative justice? Can we use the court of public opinion to fill in the gaps left by our existing judicial system–One, without further exacerbating the vulnerability of potential victims? And two, without delivering life sentences to every potential perp?

The alternative is a sort of half-life for those caught up in an injustice, like something out of Samuel Beckett’s Play–each of us inside the situation narrating our personal truths in a deluge of competing voices; and all of us outside, looking in, grown so cynical that we either cleave to our first instincts or see no way of differentiating between competing truths at all. In light of such visions, I simply have to hope we haven’t reached the end of what modern technology might enable us to do, in pursuit of a more just world.

Disingenuous Critique: John Gray’s Review of Richard Dawkins

I almost wonder why I’m bothering to write a response to John Gray’s New Republic review of Richard Dawkins’ An Appetite for Wonder: The Makings of a Scientist. I am a strong opponent of hero-worship, I have not read Dawkins’ autobiography myself, and even without reading it, I can already anticipate many facets of Dawkins’ fairly affluent, colonial upbringing that probably go less examined and deconstructed than would satisfy a modern, more culturally self-reflective audience.

However, for a philosopher, Gray so sorely abuses the rhetoric of his discipline to perpetuate a long-standing attack on Dawkins–and moreover, does so on the back of some truly distorted representations of 19th-century thought–that I feel I would be remiss, as one who studies the non-empirical uses of rhetoric in 19th-century science writing, to keep mum on the review as a whole. This is absolutely not to say that any critique of Dawkins is inherently wrong; I simply wish to see prominent public thinkers use the best possible argumentation when they set about dismissing other claims.

To this end, my issues with Gray’s wholly inappropriate approach (and again, he is a seasoned political philosopher; there is no excuse of ignorance to be found here) begin almost at the outset, when he cites a portion of Dawkins’ The Selfish Gene (1976) and then applies a form of psychoanalysis more in keeping with humanities scholarship of the same academic era:

Intelligent life on a planet comes of age when it first works out the reason for its own existence. If superior creatures from space ever visit earth, the first question they will ask, in order to assess the level of our civilisation, is: “Have they discovered evolution yet?” Living organisms had existed on earth, without ever knowing why, for over three thousand million years before the truth finally dawned on one of them. His name was Charles Darwin.

Several of the traits that Dawkins displays in his campaign against religion are on show here. There is his equation of superiority with cleverness: the visiting aliens are more advanced creatures than humans because they are smarter and know more than humans do. The theory of evolution by natural selection is treated not as a fallible theory—the best account we have so far of how life emerged and developed—but as an unalterable truth, which has been revealed to a single individual of transcendent genius. There cannot be much doubt that Dawkins sees himself as a Darwin-like figure, propagating the revelation that came to the Victorian naturalist.

First, Gray pathologizes as distinctly Dawkinsian an exceedingly popular argumentative trope (“If aliens came to Earth…”) that always presupposes the observing aliens to be a more intelligent form of life. From here, Gray calls out the myth of individual genius–which is absolutely a dangerous and inaccurate concept, but also one that permeates the great bulk of Western ideology, whereby individuals–of industry, of political office, of activism, of science and technology, of scholarship–are routinely celebrated to the exclusion of the communities and cultures that fostered their growth and contributed to their greatest achievements.

But the reason for treating these conventions as exceptional and bizarre in Dawkins is made clear in that last line: “There cannot be much doubt that Dawkins sees himself as a Darwin-like figure…” Really? Plainly asserting a paucity of doubt does not make it so. The only argument being made here is by the most tenuous of extrapolations: If Dawkins does indeed see Darwin as a “single individual of transcendent genius”, how does it intrinsically follow that by “propagating [Darwin’s] revelation” (that is, not creating any new revelation himself) he fashions himself as another Darwin?

To put it another way, billions of people regard Jesus Christ, Muhammad, Buddha, the Dalai Lama, or heck, Ayn Rand or Deepak Chopra as singularly transcendent individuals. Without denying the likelihood that some people who buy into this myth of individual genius might also consider themselves transcendent individuals on par with their idols, there is absolutely nothing intrinsic about this connection. The great majority–as should be expected in a species so dangerously prone to groupthink and obedience to perceived authority–idolize from afar.

Gray goes on to talk about the way Dawkins seems to treat his upbringing in British-occupied Malawi; the comfortable colonialism is, again, not surprising, but the insinuations Gray draws from moments of self-reflection are. To this end, Gray writes on a boarding school incident as follows:

Today, Dawkins is baffled by the fact that he didn’t feel sympathy for the boy. “I don’t recall feeling even secret pity for the victim of the bullying,” he writes. Dawkins’s bafflement at his lack of empathy suggests a deficiency in self-knowledge. As anyone who reads his sermons against religion can attest, his attitude towards believers is one of bullying and contempt reminiscent of the attitude of some of the more obtuse colonial missionaries towards those they aimed to convert.

I am trying to avoid the language of incredulity, but what exactly does Gray, as an honest reviewer of an autobiographical text, expect here–that Dawkins interrupt his chronological narrative to reflect on the possible similarities between his boyhood indifference to the physical abuse inflicted on a small child in front of him, and his articulated opposition to religion, as espoused in spheres of adult debate?

But no–it’s not even that simple, because Gray moves the goalposts even within this thought: starting with the abused English boy and ending by likening the adult Dawkins, for engaging in public debate, to “some of the more obtuse colonial missionaries”. Remember that Gray’s indictment is a “deficiency in self-knowledge”, but the standard for “self-knowledge” he sets is nothing less than Dawkins moving from the case of this abused little boy to a recanting of his public debates against religion as a kind of bullying emblematic of the entire British colonial system. For someone who extols the importance of nuance and gradation when assessing human behaviours, Gray is highly selective about the behaviours given a pass.

Of course, Gray’s particular biases come to the fore soon enough; an atheist himself, he makes his case for religious value from very selective constructions of faith. In a paragraph acknowledging that Dawkins’ atheism emerged amid a groundswell of the same, Gray writes:

If there is anything remarkable in his adolescent rebellion, it is that he has remained stuck in it. At no point has Dawkins thrown off his Christian inheritance. Instead, emptying the faith he was taught of its transcendental content, he became a neo-Christian evangelist. A more inquiring mind would have noticed at some point that religion comes in a great many varieties, with belief in a creator god figuring in only a few of the world’s faiths and most having no interest in proselytizing. It is only against the background of a certain kind of monotheism that Dawkins’s evangelical atheism makes any sense.

Did you catch that reconstruction of the world’s religious backdrop? “[A] creator god figuring in only a few of the world’s faiths” is a curious way to present the overwhelming religious dominance of the Abrahamic faiths (Judaism, Christianity, Islam) and their pressing, ongoing influence on sociopolitical affairs in the Western world. Moreover, Gray is blatantly dishonest when he writes “A more inquiring mind would have noticed at some point that religion comes in a great many varieties”: the previous paragraph quotes Dawkins as learning from his mother “that Christianity was one of many religions and they contradicted each other. They couldn’t all be right, so why believe the one in which, by sheer accident of birth, I happened to be brought up?” This was the conversation, according to Dawkins, that started him on the road to atheism.

Gray then dissents from Dawkins on the Pauline notion of original sin, and further makes the oft-used, inaccurate claim that Biblical literalism is more the stuff of “[c]oarse and tendentious atheists of the Dawkins variety” than Christian history, citing Augustine’s interrogations of what the words in Genesis might mean to imply that he didn’t, say, believe in a very young Earth. I went into the flaws of this rhetoric in my criticism of the Slate.com article by Michael Robbins, but I will reassert here that allegorical interpretations were absolutely not made exclusive of literal interpretations by early Christian figures–and why would they be? It is no great mark against early Christian figures to say that they operated as well as they could with the knowledge they had on hand. Until the concepts of deep time and deep space gained public purchase in the 19th century, it was only natural that historical accommodation would be made for the events spelled out in Genesis.

On this account, then, Gray would be well-served in reading contemporaneous reviews of, and related cultural responses to, both On the Origin of Species (1859) and The Descent of Man (1871) (the latter of which explicitly tethers humankind to evolutionary theory). Gray writes: “When he maintains that Darwin’s account of evolution displaced the biblical story, Dawkins is assuming that both are explanatory theories—one primitive and erroneous, the other more advanced and literally true.” Well, yes: Given the extreme hostility of most initial reviews, particularly from Christian sources, the fact that 11,000 Anglican clergymen signed a declaration in 1860 that the Bible must be taken literally, and the banning of On the Origin of Species at Trinity College, Cambridge, if Dawkins is “assuming that both are explanatory theories”, then he keeps good company with the contemporaneous readers of Darwin who believed the same.

Another point of carelessness regarding 19th-century thought emerges after Gray criticizes Dawkins for not demonstrating a nuanced understanding and evaluation of different philosophies of science: “empiricism”, “irrealism”, and “pragmatism”. In the very next paragraph, Gray introduces “positivism” in the 19th century, as if it were a singular school of thought, and not in actuality a range of philosophical responses to (among other things) Hegelian negativism. When he returns to positivism in a later paragraph, Gray shows just how poorly he understands this discourse when he writes:

More intelligent than their latter-day disciple, the positivists tried to found a new religion of humanity—especially Auguste Comte (1798–1857), who established a secular church in Paris that for a time found converts in many other parts of the world. The new religion was an absurdity, with rituals being practiced that were based on the pseudo-science of phrenology—but at least the positivists understood that atheism cannot banish human needs that only faith can meet.

No. Comte tried to found a new, humanist religion in the latter half of his intellectual career, a fact which drew criticism from other thinkers developing philosophies of science in the positivist vein. While public thinkers like William Whewell, John Stuart Mill, and G. H. Lewes all wrote works negotiating positivist points of view, their treatment of Comte’s religiosity was secondary to their engagement with his earlier work, and even then, the ensuing schools of thought differed widely. In the introduction to his Illustrations of Universal Progress (1864), Herbert Spencer articulates this division plainly when he writes:

But it is not true that the holders of this doctrine and followers of this method are disciples of M. Comte. Neither their methods of inquiry nor their views concerning human knowledge in its nature and limits are appreciably different from what they were before. If they are Positivists it is in the sense that all men of science have been more or less consistently Positivists; and the applicability of M. Comte’s title to them no more makes them his disciples than does its applicability to the men who lived and died before M. Comte wrote, make them his disciples.

My own attitude toward M. Comte and his partial adherents has been all along that of antagonism. … I deny his Hierarchy of the Sciences. I regard his division of intellectual progress into three phases, theological, metaphysical, and positive, as superficial. I reject utterly his Religion of Humanity. And his ideal of society I hold in detestation. … The only influence on my own course of thought which I can trace to M. Comte’s writings, is the influence that results from meeting with antagonistic opinions definitely expressed.

I also call attention to this inaccuracy of Gray’s because it exists so blatantly in service of ad hominem argument. Can any other purpose be divined from the construction, “More intelligent than their latter-day disciple, the positivists”, than to suggest that Dawkins is a fool who would do well to be more like his amicable, unified, humbler positivist forefathers?

This idealized version of Victorian discourse emerges elsewhere in Gray’s review, and I for one am tired of public thinkers deciding that the appropriate “tone” for intellectual debate today should be based on such glosses of our past. For someone who wants Dawkins to be more self-reflective in relation to the influence of British colonialism on modern rhetorical practices, Gray also ignores how discussions about science in the 19th century were routinely used to reinforce cultural notions of ethnic, geographical, and sex-based superiority: By no means are these the “glory days” of such debate. And then there are more basic issues with his history:

Unlike most of those who debated then, Dawkins knows practically nothing of the philosophy of science, still less about theology or the history of religion. From his point of view, he has no need to know. He can deduce everything he wants to say from first principles. Religion is a type of supernatural belief, which is irrational, and we will all be better off without it: for all its paraphernalia of evolution and memes, this is the sum total of Dawkins’s argument for atheism. His attack on religion has a crudity that would make a militant Victorian unbeliever such as T.H. Huxley—described by his contemporaries as “Darwin’s bulldog” because he was so fierce in his defense of evolution—blush scarlet with embarrassment.

As someone who has read the major attempts to describe a philosophy of science in the 19th century, let me just go right ahead and say that 19th-century thinkers had just as much difficulty articulating and understanding these ideas; a rational system for scientific inquiry was a concept under construction throughout the era, not something present in anything resembling a finished state. The closest construction might be John Stuart Mill’s A System of Logic (1843), which was well-received, but the latter half of the century still had scientists conflating the limits and nature of deductive and inductive thought.

Also, it is patently dishonest to describe T. H. Huxley as a “militant Victorian unbeliever”; Gray’s sentence structure relies on a modern reader directly aligning believer-in-evolution with atheist, but Huxley was adamantly agnostic, counting himself “not among atheists, for the problem of the ultimate cause of existence is one which seems to me to be hopelessly out of reach of my poor powers” and going on to write that, “[o]f all the senseless babble I have ever had occasion to read, the demonstrations of these philosophers who undertake to tell us about the nature of God would be the worst, if they were not surpassed by the still greater absurdities of the philosophers who try to prove that there is no God.”

Surely, though, these quotes mean that, even if Huxley isn’t a “militant … unbeliever”, he would still “blush scarlet with embarrassment”–but if Gray wants to condemn Dawkins for adopting what Gray terms a “missionary” speaking style, he could choose no worse counterpoint than Huxley, a self-admitted sermonizer (often ribbed by Spencer for his “clerical affinities”) who recounts a fond anecdote in his “Autobiography” of “preaching to my mother’s maids in the kitchen as nearly as possible in Sir Herbert’s manner one Sunday morning when the rest of [his] family were at church.”

Ultimately, Gray returns to his opening gambit–drawing the most tenuous of connections between Dawkins’ text and Gray’s own store of opinions on the man. Responding to Dawkins’ lament that Darwin never attained the rank of “Sir”, and what this implies about the British honours system, Gray writes, “It is hard to resist the thought that the public recognition that in Britain is conferred by a knighthood is Dawkins’s secret dream.”

Is it hard to resist, though? Or do sentences like these, with all they advance in the way of poor argumentation by a public thinker who makes a point of upbraiding others for their lack of philosophical rigour, say far more about John Gray than about Richard Dawkins–and even, more critically, about the work of autobiography ostensibly up for review?

Cronenberg’s Big Mouthful


Consumed
David Cronenberg
Scribner

While reading the first novel by David Cronenberg, acclaimed director of over a dozen unsettling films, I asked myself all the obvious questions: Why did Cronenberg think print the right form for this story? Could it have been a film? These might seem like biased inquiries, but since Cronenberg brings his long career in cinema to bear on the promotion of this work, it seems fair to reflect on why he jumped mediums, and whether that jump worked.

Ultimately, I had to concede that a book was the right vehicle for this story: The text makes arguments I doubt would translate well to the big screen, and offers a level of technology-worship that works best when written out in loving detail. However, as with many first novels, the very concepts that make this a difficult piece to film bloat the final third with relentless exposition. Put simply, Consumed is not just a novel; for better and for worse, this thriller aspires to philosophical statement as well.

As with so many of Cronenberg’s films, Consumed is also a visceral and brutal text, making extensive use of sex and the grotesque in conjunction with other ideas of consumption–not least of which being media-related. To this end, our protagonists are two intimate photojournalists on ostensibly different freelance paths: Naomi is investigating a shocking French scandal–one member of a famed philosophical couple found dead and partly eaten; her partner, Aristide Arosteguy, missing and presumed guilty. Meanwhile, Nathan first follows a medical practitioner performing under-the-table operations on the dying and dysphoric, then stumbles upon an even more striking subject after tracking down the originator of a contracted STI.

These plot-lines eventually dovetail (as does, unfortunately, the STI), but while cannibalism remains a core concept throughout, Cronenberg is clearly thinking about this subject more figuratively, with fellatio and self-mutilation (though very much present) giving way to deeper questions about the loss of self. For instance, is there an even more grotesque way in which we can be (and are) consumed? And if we knew what it was, could we ever escape that form of consumption? The book reveals a way in which two characters certainly try, but ends on an ambivalent note.

The journey, on the other hand, offers its own imaginative pleasures. Granted, you either have to love camera technology or appreciate Cronenberg’s love of technology to get through huge, descriptive swaths of this otherwise-lean book, but this fixation extends to the unsettling, near-future potential of other gadgets, like the bio-3D-printer and neuroprosthetics, so there is variety, on the whole.

Consumed is also filled with the sort of strong, definitive sentences one expects in a thriller. For instance, in describing the cannibalized philosopher (as she looked prior to death), Cronenberg writes: “A sixty-two-year-old woman, Célestine, but the European version of sixty-two, not the Midwestern American mall version.” And though the term “bipolar” emerges later (indirectly) in a paragraph about another character, Cronenberg has already made the point plain enough in writing: “She hated her own volatility, the cycling so easily between manic confidence and crushed, hopeless insecurity.”

If Cronenberg’s description ever flags, it tends to do so when negotiating female sexual response–a tedious but predictable lapse, even in a book with so much sex. My favourite example arises when the book’s running parallel between the mysterious French philosophers and the sexually-intimate, jet-setting photojournalists is drawn explicitly, and Cronenberg writes, “The thought made her giddy, and some juices began to flow.” Cue (in my head at least) a thought-bubble blooming over dreamy Naomi, ’90s-TV-commercial-style, and in it a carton of Tropicana opened and poured into a glass.

Consumed is, in other words, not a perfect work, and it’s especially weighed down by the trap of third-act exposition, as well as the desire to make a much larger philosophical point in an otherwise lean, mean thriller. Nonetheless, this story definitely belongs in book form, and Cronenberg, master of body horror that he is, has still produced a(n unsettlingly) meaty tale.

An Irish Black Comedy on Strange, Everyday Faith

Calvary
John Michael McDonagh
Reprisal Films

It shouldn’t surprise people that a film about religious belief might fascinate an atheist, but this fact often does. Billions of people espouse belief in a god, and so for them a god is real, which in turn makes a kind of god–the god in the minds of human beings–an unequivocal reality that informs our cultures and communities. When a film comes along that openly deals with such beliefs and their social implications, why would an atheist inherently opt out?

Calvary, written and directed by John Michael McDonagh, is an especially strong contender in this category: a film that negotiates a great many difficult human crises without offering easy answers, and which pairs the “mysteries” of a god with more pragmatic mysteries here on Earth. In particular, this film follows Father James (Brendan Gleeson), who at the outset is given notice in a confessional that he will be murdered in a week’s time for the sins of another priest. He then goes about his rounds as usual, visiting a range of conflicted parishioners in his small Irish town.

The ultimate “game” of the film, then, is trying to figure out who the would-be killer is, and waiting to see if Father James will actually die. Father James claims to know who his would-be killer is from the outset, but all viewers have is a gender and an accent, which could easily describe over half the people Father James meets. In the meantime, there are a host of other puzzles awaiting Father James on his rounds: the woman who may or may not be happily battered by either husband or lover; the prisoner torn between feeling godlike when killing and looking forward to a heaven where he no longer wants to hurt women; the atheist doctor; the smug publican; the man with everything and nothing at all; the old writer longing for a quick death.

Calvary wisely mocks the more stereotypical of these characters and scenarios, while the sense of danger for Father James nonetheless intensifies as the week progresses. But the most significant relationship in the film is perhaps that between Father James and his daughter, introduced to us after her failed suicide attempt. In one of their conversations, the relationship between earthly father and child becomes a good metaphor for heavenly father and earthly child–up to and including the child’s fear of the father going away. Since we know that Father James might literally die, this conversation invites reflection on the possible death of a heavenly father, too, and what that might mean to the human beings who remain attached to him.

This interpretation certainly resonates with other conversations throughout the film, in which Father James’ church is relentlessly identified as a dying institution by people glad to see it go, and also with the fact that his specific parish church is razed to the ground. These signs of destruction in turn make the final stand-off between Father James and his would-be killer all the more significant: If the beach where they meet is supposed to be Father James’ “Calvary”, then Father James–spiritual leader, earthly father, and vessel for metaphors about a heavenly father–has been acting in a pointedly Christ-like role throughout, with all the complicated implications therein.

Without spoiling the ending, I will say that this framework presents serious questions about the nature and relevance of sacrifice in relation to personal beliefs, whatever those might be. As Father James explains earlier on, he does not hate a man he yelled at in a moment of weakness; he simply thinks this man has no integrity, and that this is the worst thing he can say about a person. Calvary negotiates the importance of many moral lessons, but none more so than this: That our beliefs–be they rooted in trauma or greed or indifference or faith–are not static ideals but active principles, hurtling us against one another in all manner of not-so-easily anticipated ways.

So tread carefully, Calvary rather deftly seems to say.

Lewis Wolpert’s Gender Trouble


I own two books by developmental biologist Lewis Wolpert: How We Live & Why We Die: The Secret Lives of Cells and Six Impossible Things Before Breakfast: The Evolutionary Origins of Belief. I enjoyed the former for its simplicity, but I found the latter simplistic, presenting too narrow a claim and too selective a data set in exploring an otherwise intriguing topic.

Consequently, when I encountered an article in The Telegraph promoting Wolpert’s latest book, I wondered on which side of that precarious line (between simple and simplistic) this work would fall. The headline was not promising—“Yes, it’s official, men are from Mars and women from Venus, and here’s the science to prove it”—and despite the playfulness of its literary heritage, the title of the book itself, Why Can’t A Woman Be More Like a Man?, also invokes a terrible (and terribly long) history of women being regarded as the inferior, half-formed, child-like sex. 

Obviously, however, there are biological differences between male-sexed and female-sexed persons, and I am more than happy to entertain new information therein, so I read on. I just also know, as a doctoral candidate studying the rhetoric of science writing, that the veil of empiricism has long been used to forward poorly evidenced claims that also conveniently affirm pre-existing (and often oppressive) world views. 

In the era I study, Francis Galton, Herbert Spencer, and Charles Darwin are all guilty of this charge to varying degrees, but the nineteenth century by no means has a monopoly on the scientifically-shielded rhetoric of sexual and racial superiority. Just last month, for instance, there was outcry over A Troublesome Inheritance: Genes, Race, and Human History, wherein science journalist Nicholas Wade argued for a genetic basis behind the (stereotyped and historically skewed) behaviours and societal outcomes of the “three major races” (Caucasian, African, Asian). In a New York Times letter-to-the-editor responding to David Dobbs’ book review, 140 human population geneticists expressed disagreement with the conclusions Wade drew from their work. Wade, in turn, claims such disagreement is “driven by politics, not science”—though, again, it is the lack of concrete scientific background in Wade’s work to which these researchers overwhelmingly object.

The Case at Hand

As it turns out, similar missteps emerge in The Telegraph article, authored by Wolpert himself. Though he makes it clear he understands the controversial nature of his topic, he does little to demonstrate that his book is capable of rising above such cultural bias. Certainly, he asserts an intention to focus on the science alone, writing:

In recent years the politically correct argument has emphasised social causes to such an extent that it has sometimes virtually ignored our genetic inheritance and the role of genes. I have set out to look at the important biological evidence we may have been ignoring.

The trouble is, if this article is any indication, Wolpert has difficulty identifying and ruling out potential confounding factors in behavioural studies. Let’s take a look at the rhetoric in these two paragraphs, for instance:

Children have sexual feelings at a young age. Small boys often get erections after the age of about seven, and by puberty more than half of all males will have tried to masturbate. It is only when girls reach puberty that they may begin to do so. There are strong biological and also some social influences determining homosexuality. A surprising finding is that the odds of a boy being gay increase by one-third for each elder brother he has.

About half of men think about sex every day or several times a day, which fits with my own experience, while only 20 per cent of women think about sex equally often. Men are far more likely to be sexually promiscuous, a throwback to evolution where procreation was all-important. The need for a more emotional attachment found in women must also have an evolutionary basis.

I had to laugh at the first paragraph; parents of young female children will absolutely arch their brows at the claim that girls do not explore their genitals until puberty. But I am not going to scrounge around for studies to “prove” this because the data in question is not biological: It amounts to self-reporting, a form of research heavily influenced by social factors. 

(In my first year at university, for instance, I discovered that many women from more religious/conservative backgrounds had coded the term “masturbate” as a male activity, and thus something they intrinsically could not do. Once it was explained that masturbation meant touching one’s erogenous zones—any erogenous zones—for pleasure, a conversation could finally emerge about the sexual exploration these women had, in fact, been doing. Our choice of language goes a long way towards informing our research results.)

Moreover, Wolpert himself notes the higher incidence of gay male persons in families with older brothers (a social influence), and then makes further claims based wholly on self-reporting. Unless Wolpert’s book shows some neurological basis for the claim that men think about sex more than women, he is simply responding to sociological research skewed by the cultural factors that frame male and female sexual self-disclosure.

Indeed, a good example of this disconnect between self-reporting and actual biological response even emerges in a study he later alludes to—a study which itself attests to the wide reach of female arousal. When he writes, “In contrast, both male and female erotica cause sexual arousal in women, whether heterosexual or lesbian,” I recalled the research reviewed in the New York Times Magazine article, “What Do Women Want?” This study involved both self-reporting and biological monitoring of male and female persons (straight, gay, and bisexual alike) in response to a range of visual stimuli. The results were remarkable: Women showed patterns of arousal so indiscriminate that even pictures of bonobo coitus got their “engines” running—but do you imagine for one second that these women self-reported the same, full range of response? (It’s a good article—well worth the read.)

More trouble emerges in the assertion that “[m]en are far more likely to be sexually promiscuous, a throwback to evolution where procreation was all-important.” Putting aside the same cultural issue with self-reporting, there are two problems with this claim: 

1) We live in an age of genetic testing, which has offered a striking fact of non-paternity: Averaged across different populations, roughly 1 in 10 children are not the biological children of their presumed fathers. This figure is as low as 1% in some populations and as high as 30% in others, which itself attests to a strong cultural influence on patterns of non-monogamy. Keep in mind, too, that not every act of infidelity will culminate in a child; these figures offer a level of environmental complexity that this article does not even entertain as a possible source for divergent sex-based outcomes.

2) We also live in an age of re-testing and retraction, which has lately included a resounding challenge to the classic fruit fly study used to argue that males benefit more, evolutionarily, from promiscuity. Bateman’s 1948 research is a perfect example of a study whose findings went unquestioned because they reinforced pre-existing cultural beliefs. When recent researchers attempted to replicate those findings, however, they found serious problems with the construction of his data set, including an underlying bias towards using progeny to identify fathers alone (as if each offspring did not require both a mother and a father to exist). Suffice it to say, then, such research is “showing its age”, and at present the jury is still out on whether males disproportionately benefit from non-monogamy.

Finally, as rhetoric goes, the last sentence in the above excerpt is the sloppiest: “The need for a more emotional attachment found in women must also have an evolutionary basis.” The first issue I have with this sentence is the vagueness of its conclusion: Wolpert initially claims “biological evidence” to be the focus of his book, but “evolutionary basis” can mean a couple of things. We might be looking at this issue genetically, or we might be looking at it in relation to cultural memes. Wolpert is not clear in this regard, which allows him to shuffle this comment about female emotional attachment in with the rest of his “biological” claims as if they were one and the same: As it stands in relation to the “evidence” he forwards here, they are not.

My second issue is with the word “must”. Really? Well, that was a fiendishly quick, non-scientific way of dispensing with alternatives. It may well be that there are concrete, genetically-moored differences underlying these relationship behaviours, but in the absence of clear genetic evidence to this end, one must at the very least eliminate cultural factors first. 

For instance, in the Mosuo culture of Southern China, where women hold the bulk of social power and are entirely in control of their sexual encounters, an entirely different set of relationships exists between the sexes, both when it comes to sexual solicitation and to childrearing. This would be a perfect example of different cultural memes informing different sexual behaviours, but such communities are thus also a thorn in the side of anyone attempting to mark disproportionate female “emotional attachment” as having a definitively biological origin.

Why does any of this matter? 

Simply put, it is disingenuous and counterproductive to present a binary between a “politically correct” version of scientific research and a “just-the-facts” version that appeals to objectivity, but ultimately hides behind the “common sense” argument of dominant cultural beliefs. If we assume that the pursuit of scientific knowledge stands to benefit human beings by providing us with the most accurate understanding of our world and our interactions within it, every effort should be made to ensure that accuracy, not personal affirmation, is the ultimate aim of all research.

What makes this aim difficult, granted, is that we are all human, and as such, liable to jump to conclusions that confirm pre-existing biases. This has to be forgiven, to some extent, but we commit an additional, wholly avoidable mistake in pretending that any data we gather can exist outside this cultural lens. Mistakes like this only serve to uphold that initial bias.

To use an example within Wolpert’s sphere of inquiry, consider the well-known fact that female persons (on average) perform more poorly on spatial reasoning tests than male persons (on average). The jury is still out on whether this, too, is cultural (consider, for instance, the study comparing puzzle completion times in matrilineal and patrilineal societies), but let us imagine for a moment that it is entirely biological. In the context of our culture, with its long history of regarding female persons as intellectually inferior, such data is often read in lockstep with the conclusion that female persons simply do not have the aptitude for, say, careers in STEM subjects.

Now, putting aside that there are many detail-oriented skills for which female persons (on average) outperform male persons (on average), imagine if the same determinist conclusions were drawn from the equally well-known fact that male persons (on average) lag significantly behind female persons (on average) in reading comprehension. Even if we similarly discovered a fundamentally biological origin for this gap, could you imagine anyone then seriously concluding that male persons do not have the aptitude for, say, careers in law, literature, policy-making, or academia?

Of course not. But precisely because such sloppy conclusions are routinely drawn around facts that seem to fit a pre-existing cultural narrative, science writers have a responsibility to ensure that, if they are going to arrive at conclusions that will reinforce already-oppressive cultural narratives, their evidence- and argument-based paths to these conclusions are impeccable.

Both gaps, by the way, can be bridged to some extent, as evidenced by a range of education strategies implemented in response to the observation of such gaps in the first place. In social transformations like these, we thus see the tremendous, real-world benefit of knowing as much as we can about actual human beings and their environments.

The trouble simply arises when science writing touted as empirically rigorous proves to be neither empirical nor rigorous, as in the case of most of the examples Wolpert presents in this Telegraph article. I am not advocating for a “politically correct” science by any measure; I simply expect that, in the absence of definitive biological evidence for specific gender stereotypes, a seasoned science writer will a) recognize his cultural context, up to and including personal biases, and b) take better care in addressing and excluding other possible explanations for the sex-based divergence of specific human behaviours before claiming a fundamentally biological causation.

Wolpert might do all this in the book itself, granted. He might painstakingly review recent challenges to old, status quo research on male/female aptitudes and sexual proclivities. He might likewise acknowledge the dangers of a scientific over-reliance on self-reporting, and more openly concede that there is an important difference between the evolution of “memes” and genes. And if he does all this in Why Can’t A Woman Be More Like A Man?, his argument might very well hold together on a strictly empirical accord.

I’ll never know, though, because this promotional piece of his does little to inspire reading on—and the literary world is filled with so much more.

On Authority: Why Paying Attention to How We Pay Attention Matters Most

The world did not sit idly by while I studied for my final doctoral exam this summer. While I read nineteenth-century science textbooks, philosophical treatises, works of natural theology, university lectures, experiment write-ups, and a range of fictional accounts involving the natural world, violence swelled about me—about us all—in its many awful forms.

Outside mainstream news, I knew Syria, Myanmar, and the Central African Republic were still sites of strife and potential genocide. Meanwhile the urgency of other brutalities, like news of the kidnapped young women in Nigeria (who for the most part remain prisoners today, along with newer victims of Boko Haram), had begun to fade in the Western press—still horrific, but nowhere near as immediate as word of the tortured and murdered teens initially said to have instigated fresh hostilities along the Gaza Strip. Against the photo essays that emerged from Israel’s ensuing airstrikes—the pain and the loss and the fear writ large on so many faces—events elsewhere took an inevitably muted role in Western coverage.

Even macabre word of a new organization sweeping across Iraq, openly murdering religious minorities or otherwise cutting off access to resources and regional escape, did not gain as much media traction for most of the summer, despite these events prompting forms of American aid-based intervention. Rather, it would take the beheadings of American and British citizens for ISIS to take centre stage in Western media—a position it currently occupies alongside a sweeping NFL scandal in which celebrity sports stand indicted for their role in a broader culture of domestic abuse.

Granted, the US had other issues in the interim between Israel and Iraq, with widely resonant protests emerging from Ferguson, Missouri, after police officer Darren Wilson shot and killed Michael Brown on August 9. (For some sense of the event’s global reach, consider that Palestinians were sending online tips to American protestors in the wake of aggressive anti-demonstration tactics.) And while the racialized outcry underlying this situation should have come as no surprise to most, the “worst Ebola outbreak in history”—which reached Global Health Emergency status just days before Ferguson, and has continued to spread since—revealed a systemic battleground all its own: A vaccine exists. Canada has “donated” 800–1,000 doses. The WHO will “decide” how these are dispersed.

(And speaking of Canada, lest I be labelled one of those northerners who calls attention to US crises without looking inward: This summer’s news also brought into focus our own, systemic issues with the indigenous community — peoples at higher risk of lifelong poverty, incarceration, all manner of abuse and disease, going missing, and turning up dead.)

The above are by no means overtly malevolent facts—the WHO, for instance, is attempting “ring vaccination” while efforts to accelerate drug production proceed—but the systems of power these terms invoke do exemplify the very status quo of global disparity (in overall affluence, levels of health education, and resource mobility) that foments such virulent outbreaks in the first place. There is violence, in other words, even in systems that seem orderly and objective, and we cannot ever discount the role that language plays in reinforcing a deadly world.

With this in mind, I was struck this summer by both how much and how little attention the rhetoric of authority claims received amid this coverage. In the “much” column we have, for instance, the work of Gilad Lotan, who crunched an immense amount of social media data to identify the information silos in which followers of Israeli-Gazan conflict tended to position themselves—each “side” receiving and reinforcing different news items through like-minded media outlets. We also have reflections like that of John Macpherson, who explored the professional tension between objectivity and emotion in photojournalism, and just recently, a poll of Ferguson’s citizens, which indicates an extreme racial divide among individuals tasked with interpreting the events of August 9.

But underlying these pieces is also a paucity of self-reflection: Lotan’s data sets would not be as impressive if the vast bulk of readers were not so starkly divided in their consumption of news media. Nor would the recent Ferguson poll pack quite the wallop without so many participants deciding definitively, incompatibly, and above all else culturally what happened between Darren Wilson and Michael Brown.

I should not be surprised, granted: The study of “how we know what we know” is a difficult one even when we raise up children with the vocabulary to consider such questions (whatever that vocabulary might be: a point to be revisited below)—and an almost Herculean task once we, as adults, have settled into a way of thinking that seems to serve us well.

Indeed, though I spent a great deal of time thinking abstractly about knowledge this summer, I also often found myself confronted by articles so provocative, I had to share them immediately, and to rail vehemently about some key point therein. Each time I indulged, though, one of two things happened: I either did research soon after that undermined the legitimacy of the original piece, or found myself too deeply affected on an emotional level to engage with earnest responses in any other register.

Knee-jerk reactions are, of course, common, and understandably so. In a practical, everyday sense, we almost by necessity draw conclusions that gesture towards deductive reasoning, while actually better resembling glorified gut instinct: The sun will come up because it always comes up. You just missed the bus because you always just miss the bus. That danged table leg caught your toe because it always catches your toe. And so the fuzzy thinking builds up.

Above all else, we acclimate to what is familiar. We grow comfortable in whatever strange, personal assumptions have long gone uncontested. Our propensity for confirmation bias then fills in the gaps: We know that an article retraction never fully dislodges belief in the original article. We know that narratives aligning with pre-existing beliefs will continue to be referenced even when shown to be untrue. We know that following an event in progress, acquiring facts as they unfold, invariably leaves us with a great deal of inaccurate knowledge that is difficult to dislodge.

So it is with human beings, who have jobs to go to, children to care for, relatives and friends to attend to, and the self to care for after hours. How might we even begin to overcome such shortcomings when the cadence of our lives often seems antithetical to deep reflection?

The dangerous answer has often been the unquestioned affectation of an orderly and objective system of thought. This differs from an actually objective system of thought—an unattainable ideal—in that, even if we can provide a full and logical progression from propositions A to B to C, the validity of these propositions will still inevitably be tied to their cultural context, whence more of that messy, glorified gut instinct emerges.

As a doctoral student, I study science writing in the nineteenth century, so I have seen this flawed affectation play out time and again. In particular, I can demonstrate how the rhetorical strategies of inference and analogy allow respected authors to leap from specific sets of empirical knowledge to broader, un-evidenced claims that just happen to match up with the authors’ pre-existing views on, say, “savage” cultures or female intellect. This is by no means a mark against empirical evidence; human beings are simply very good at making unjustified connections between data sets, especially when such connections can reaffirm what one already believes.

Similarly, in the Ferguson shooting and Gazan conflict this summer, “facts” quickly flew into the realm of inference, with terms like “witness” and “casualty” and “discrepancy” taking on markedly different characters depending on their source. Just as Michael Brown underwent three autopsies, so too has all manner of “hard” data in both cases been sifted through to exaggerated effect, and always with that human inclination towards finding a story that fits by any means, however loosely deductive.

In short, the danger of affected objectivity is that it cannot exist apart from the irrationality of those human beings applying it. Nonetheless, the “fair and balanced” approach to a given situation is often positioned as the only “reasonable” path to change or justice—a claim wholly disregarding that, for millions of human beings, the groundswell of personal experience, community anecdote, and emotional outpouring is the only “truth” that ever matters in the wake of violent world events.

When dealing with the rhetoric of authority claims, and their role in how we respond to violent events, our crisis thus cannot get much simpler than this: We need to recognize ourselves as emotional beings, with emotional prejudices derived from everyday experience as much as personal trauma, when engaging with narratives of violence—narratives, that is, which by their very nature tend to be emotionally charged.

This is not about ceding the role of logic in response to serious and dramatic world and local events. Nor is this about forsaking all available evidence and refusing ever to make safe assumptions about a given situation or issue. This is simply about recognizing ourselves as predisposed filters of wide swaths of competing information, and making an effort not to act as though we have a monopoly on truth in situations involving other human beings.

This may seem straightforward, but our patterns of news consumption this summer, as well as the activist strategies that emerged in response to a variety of issues, suggest we have a long way to go. While protests tend to arise from an urgent and legitimate place of outrage, an effective response to systemic abuses must not be based solely on popular outcry, or else it risks establishing a “rule of mob” in place of “rule of law”. Conversely, though, the rhetoric of “rule of law” and affectations of objectivity go hand-in-hand: If the former is not sufficient to address a given crisis, we have to take even greater care with the sort of authority claims we accept, lest they obscure any truly drastic changes needed to better our world.

Back in Business

It’s been a long slog of a summer, and my return from vacation left me with a surprise that will be no surprise to anyone in academia: The slog never ends!

Nevertheless, I’ve been holding off on commentary while trying to focus on my doctoral studies, and much has suffered for it. A writer needs to breathe in many directions, and this myopic attention to academic detail has caused more anxiety than necessary. So! It’s time to get back to posting here, to writing and submitting fiction, and to returning generally to the fray of real-world debate. I look forward to it all.

One quick note for the moment: I celebrated a wee publication in my relative absence from this blog. “The Last Lawsuit” was published in Bastion Magazine, a newer online venue of rising acclaim. This publication of mine is the last coming down the pipe at the moment, due to my incredibly poor submissions schedule these last few months, but it was a pleasure to work with the editor-in-chief, R. Leigh Hennig (and the art and other content was pretty nice, too!). Now it’s time to knuckle down, write, and submit anew.

I hope everyone reading this has had far better luck maintaining “balance” between all their interests this last while. Best wishes to you all!