Revolution Express: One Big Train with a Whole Mess of Semi-Allegorical Parts

Joon-Ho Bong
MoHo Films / Opus Pictures

One of the most amusing reviews I’ve read of Joon-Ho Bong’s first English-language film forwards the complaint that, for a movie set within a relentlessly speeding train, there sure are a lot of still camera shots in Snowpiercer. Since this train is the entire universe for the last survivors of humanity, after a failed attempt to halt global warming turned the world into a popsicle 17 years ago, such a grievance is equivalent to complaining about any film set on Earth that doesn’t constantly remind us that we’re hurtling at 108,000 km/h around the sun, and 792,000 km/h around the centre of the galaxy.

Though there are few poor reviews of Snowpiercer, those that do exist likewise call attention, in their fascinating nitpicking, to the tenuous tightrope Bong walks between realism and allegory, especially within the Western film tradition. Abrupt tonal and mythological shifts in this narrative–which would have been completely at home in South Korean, Chinese, or Japanese cinemas–here serve as a reminder of how literalist North American scifi/action films tend to be (Elysium, for instance, and The Purge, and pretty much any apocalyptic or post-apocalyptic film in recent years, with perhaps the exception of The Book of Eli).

Snowpiercer is in fact two stories: the story of one lower-class revolt on a stringently class-based train, and a metacommentary on the closed-system nature of stories about revolution in general. I’m not speaking figuratively, either: Bong makes that second story damned clear when he repeatedly emphasizes the existence of a sketch artist among the lower-class “tail section” denizens, whose work raises to hero and martyr status those involved in resistance efforts. In short, the story of the revolt is being created alongside the revolt itself–a fact a later character will also, more blatantly, observe when All Is Revealed.

In consequence, if you simply focus on all the details in this film they will surely drive you batty: how little sense various population numbers make in their contexts; why an opening title card lies to you; why anyone with backstory X should be repulsed by the truth of the tail-section’s food source; how this speedy train takes a whole year to make one trip over its entire route; and why a machine part that performs a fairly straightforward, automatic motion could not be replaced when so many other, luxury items seem to appear from nowhere.

There are two ways to respond to these inconsistencies. The first is the route taken by many a negative reviewer–to hold these and the initial presentation of stock archetypes against the director, as signs that this film is too ridiculous and/or two-dimensional to merit serious consideration. The second is to remember that the entire premise of this film is completely bonkers–a massive train filled with the last of humanity, which makes a circuit of the whole world at breakneck speed during apocalyptic fallout–and then to assume that the director is aware of this absurdity, and to start paying attention to other weird elements therein. These include, but are not limited to: the mystic sensibilities of Yona (Ah-sung Ko); the hyperbolic absurdity in certain cars (e.g. the New Year’s scene, and the classroom car); a synchronicity between the mood in a given first-class car and the concurrent mindset of our presumed hero, Curtis (Chris Evans); the fight scenes that immediately become legend; the seeming impossibility of killing off certain bad guys; and the staging of various human beings in thematic tableau (right down to colour contrasts) for the final stand-off.

Read together, Snowpiercer is very clearly meant as an allegory–and not just for class struggle in our world but also for how class struggle can itself be a tool of oppression. As an indictment of audience expectations, Bong’s latest also has something to say about how the conventional yearning for new leadership is in many ways no revolution at all. He does this, too, while having characters resist both Western and Eastern conventions: While Yona’s father (Kang-ho Song) tries to preserve her in the role of passive participant (following a men-act/women-dream motif that emerges in quite a bit of South Korean cinema), Yona breaks from this role in the third act, while Octavia Spencer, as Tanya, gets to be part of the revolt all throughout (instead of wringing her hands while strapping menfolk try to recover her child). There’s definitely something to be admired, too, about not treating a wildly improbable scifi/action premise as anything more than mythopoetic backdrop.

However, much as I disagree with the nitpicking of negative reviews for this film, and much as I see myself watching this film a second time down the line, I did find the quest template so mundane at times that the film never quite stabilizes on this higher, allegorical plateau. Yes, Snowpiercer shatters all its stock hero archetypes in the end, but for most of the film Curtis is still the young, reluctant leader of a revolution against Wilford (Ed Harris), the man who built and runs the train, while John Hurt plays a fairly standard Wise Old Mentor in Gilliam, and Jamie Bell, your standard young recruit to the cause (Edgar).

Moreover, one first-class enemy, Mason (Tilda Swinton), does not acquire full coherence as a character until a gesture during a provocative early speech is repeated near the film’s end; and even then, if you accept what’s implied about her character at that juncture, it means accepting yet another inconsistency–this one related to Curtis’s supposed uniqueness among all on board the train. (But if we don’t accept it, we’re left with just another two-dimensional villain, so it’s a tough call.) When we do get to the end, too, the relentless exposition dumps (another standard feature of quest narratives) are themselves played straight–which means that after almost two hours spent undermining the typical quest narrative’s structure, we’re left with a pretty boilerplate reversion to form for the close.

Maybe this is just the nature of the beast, though; maybe there really is no escaping the world–its most familiar narrative traditions–that train. What would it mean for us to step utterly outside all three? And what alternatives might be waiting for us if we did? For all that it might wobble on its tracks, Snowpiercer moves with distinct thematic purpose–even if its final destination seems an eternity from sight.

Excerpts from the Day’s Readings: Auguste Comte and Slippery Mental Frameworks

I’ve been sitting on a more reflective post, but the long slog of daily readings persists, along with a few other inanities of doctoral student life, so that essay will have to wait a while. For now I’m just under a month from my last (I hope) major doctoral exam, and today’s readings are two books by Victorian thinkers (John Stuart Mill and G. H. Lewes) on the early philosophy of continental writer Auguste Comte, best known as the father of positivism, but also as a critical figure in the rise of humanism.

This last is an especially intriguing point for modern reflection. I’m not a fan of atheist “churches” or related assemblies, as forwarded by some non-religious people today, but I’m even less a fan of pretending that these ideas are anything new. Put simply: in his later life, after struggling with mental health concerns both within and outside institutions, divorcing in 1842 (a rare and serious act in the 19th century), and losing a close platonic friend in 1846, Comte drastically changed his views of positivism and its applications… and he created his own, secular church.

Indeed, from the ideal of this dead friend, Clotilde de Vaux, as a moral paragon in her feminine virtue, came the “Religion of Humanity”–a ritualistic faith patterned after the Catholic Church, with Comte as the “high priest” (read: pope), and women and the working class primarily targeted for conversion therein. As Richard G. Olson notes in Science and Scientism in Nineteenth-Century Europe, this whole movement also prompted quite a few philosophical reversals for Comte. For instance: “Whereas in the Positive Philosophy Comte had complained that sociology had heretofore been crippled by the failure to subordinate imagination to reason, in the Positive Polity he touted the importance of imagination and claimed that ‘Positivism is eminently calculated to call the imaginative faculties into exercise.'”

This is the sort of stark mental shift one must expect as plausible in all human beings–even (or perhaps especially) those who significantly contribute to a number of fields. And sure enough, Comte made a significant impact on both the philosophy of science and the development of the social sciences. To this end, he outlined three stages of societal development towards the apprehension of truths: the “theological”, which has three phases of interpreting active, conscious wills at work in the world; the “metaphysical”, which involves abstract, idealized concepts that nature moves toward without the need for conscious wills; and finally the “positive”, in which one recognizes the critical role of observation and positive verification of hypotheses in the physical sciences before the benefits of empiricism can be turned to the study of human society. From this framework emerges a coherent rationale for valuing experimental findings when seeking to describe the world: in short, a 19th-century advancement of the scientific method.

It bears noting, too, that Comte’s negotiation of both the physical sciences and the social sciences held serious philosophical weight at the time. This was an era when philosophical doctrines like utilitarianism, which advocates as morally correct whatever action maximizes the overall good, were crudely applied to sociopolitical theory on the back of ill-gotten and ill-used “facts” about the world. As I mentioned in my post about John Kidd, for instance, there were certainly men of prominence with skewed notions of what the overall “good” looked like: Leaning on natural theology, Kidd especially argued that Britain’s social welfare system took away the opportunity for the poor to practise (Christian) humility, and the rich to practise (Christian) charity, with members of all classes resenting each other in consequence.

Nor did such manipulations of worldly “knowledge” escape public notice: Dickens railed against a caricature of utilitarianism in Hard Times (1854), arguing that actions taken from a place of pure reason could produce nothing but social and individual misery. While his caricature lacked philosophical finesse, it was not far off the mark from how major leaders of industry and government officials were actively distorting such ideas to their own economic advantage. Though originally from the continent, Comte’s work–first translated into English by Harriet Martineau in 1853, but widely known in England before then–thus offered a more coherent and widely accessible method of making inquiries into the state and needs of the world. As Martineau writes in her preface to The Positive Philosophy of Auguste Comte (1853):

“My strongest inducement to this enterprise [of translation] was my deep conviction of our need of this book in my own country, in a form which renders it accessible to the largest number of intelligent readers. We are living in a remarkable time, when the conflict of opinions renders a firm foundation of knowledge indispensable, not only to our intellectual, moral, and social progress, but to our holding such ground as we have gained from former ages. While our science is split up into arbitrary divisions; while abstract and concrete science are confounded together, and even mixed up with their application to the arts, and with natural history; and while the researches of the scientific world are presented as mere accretions to a heterogeneous mass of facts, there can be no hope of a scientific progress which shall satisfy and benefit those large classes of students whose business it is, not to explore, but to receive. The growth of a scientific taste among the working classes of this country is one of the most striking of the signs of the times. I believe no one can inquire into the mode of life of young men of the middle and operative classes without being struck with the desire that is shown, and the sacrifices that are made, to obtain the means of scientific study. That such a disposition should be baffled … by the desultory character of scientific exposition in England, while such a work as Comte’s was in existence, was not to be borne, if a year or two of humble toil could help, more or less, to supply the need.”

In short: Martineau’s translation of Comte’s work offered a philosophical foundation for empirical inquiry that would allow a wider range of persons to evaluate any “facts” put before them about how the world should be, and why, on the basis of how the natural world currently is and the natural laws it summarily follows.

In his later evaluation of Comte’s work, Mill takes particular care to negotiate the metaphoric landscapes that don’t translate well (a word in French, for instance, having a different cultural history than even its closest approximation in English), but he also takes care to note that Comte’s work addresses how huge paradigm shifts change an entire culture’s consciousness–and how readers in any climate would do well to take similar care not to repeat their predecessors’ ideological errors. In relation to Comte’s second stage, for instance, Mill writes:

In repudiating metaphysics, M. Comte did not interdict himself from analyzing or criticising any of the abstract conceptions of the mind. … What he condemned was the habit of conceiving these mental abstractions as real entities, which could exert power, produce phaenomena, and the enunciation of which could be regarded as a theory or explanation of facts. Men of the present day with difficulty believe that so absurd a notion was ever really entertained, so repugnant is it to the mental habits formed by long and assiduous cultivation of the positive sciences. But those sciences, however widely cultivated, have never formed the basis of intellectual education in any society. It is with philosophy as with religion: men marvel at the absurdity of other people’s tenets, while exactly parallel absurdities remain in their own, and the same man is unaffectedly astonished that words can be mistaken for things, who is treating other words as if they were things every time he opens his mouth to discuss. No one, unless entirely ignorant of the history of thought, will deny that the mistaking of abstractions for realities pervaded speculation all through antiquity and the middle ages. The mistake was generalized and systematized in the famous Ideas of Plato. The Aristotelians carried it on. Essences, quiddities, virtues residing in things, were accepted as a bona fide explanation of phaenomena. Not only abstract qualities, but the concrete names of genera and species, were mistaken for objective existences. … To modern philosophers these fictions are merely the abstract names of the classes of phaenomena which correspond to them; and it is one of the puzzles of philosophy, how mankind, after inventing a set of mere names to keep together certain combinations of ideas or images, could have so far forgotten their own act as to invest these creations of their will with objective reality, and mistake the name of a phaenomenon for its efficient cause.

Mill goes on to point out that this is precisely the point of Comte’s three stages–this metaphysical fixation on abstracts-as-absolutes being an intermediate phase in humanity’s approach to understanding the world, somewhere between using words to invoke notions of a divine will and “the gradual disembodiment of [such] a Fetish”, whereafter words are simply used to identify phenomena with consistent and natural causes that can be understood through empirical inquiry.

In all the works I’m reading today–referencing Martineau’s 1853 translation and Olson’s modern literary criticism while delving into Mill’s and Lewes’ 19th-century British revisions of Comte’s doctrine of positivism–the overwhelming theme is thus one of mental frameworks in flux. On an individual level, we see this in both Comte’s personal life and argumentative double-standards that persist in all eras. Likewise, on a societal level, massive paradigm shifts mark the whole of our written record, while the impact of a given philosophy even within a specific period is by no means culturally stable.

To my mind, a doctoral programme in the humanities tasks its students to live in a similar state of flux: capable of holding a wide range of competing histories and ideas in unrelenting tension. The trick is that, both throughout and at the end of this process, I also need to be able to synthesize these tensions concretely for a wide range of audiences. I haven’t yet mastered this last part, but… I’m getting there, I hope. One bloody book at a time!

Cheers and best wishes to you all.

Conversation Enders: The Problem with Hero-Worship

Working part-time at a local bookstore is a great reprieve from the isolation of my studies. Just as I get to know many customers’ personal lives, so too have many of them learned that I’m a doctoral student working towards her (hopefully) last major proficiency exam. When they ask me what I’m reading that day, I therefore have an opportunity to frame my studies as something useful for a general audience–and sometimes this effort goes well, but at other times the real learning experience is my own.

Two weeks ago, the book of the day was Charles Darwin’s The Descent of Man (1871), a work I’d only read excerpts from in the past. When a customer asked about its relevance, I explained that this was the work in which Darwin–ever tentative about rocking the boat with his research–made explicit that human beings were subject to his theory of evolution by natural selection, too. This book caused tremendous controversy for precisely that reason, though Darwin had gone to great lengths to delay his comments on human evolutionary behaviours until after extensive (and I mean extensive) review of the physiognomy, general behaviour, and mating pressures among various species of molluscs, fish, insects, birds, quadrupeds, and other primates.

Darwin received considerable criticism and ridicule for The Descent of Man (1871), which solidified the ideological “threat” first intimated in On the Origin of Species (1859), by openly integrating human development into the theory of evolution by natural selection.

But The Descent of Man has cultural significance in another capacity, too, so my synopsis for the customer included that this was also the text in which Darwin, every bit a person of his time, corrals his extensive field research on other species to make sweeping comments about the mental inferiority of women, to say nothing of the general inferiority of non-white persons. For instance:

“The chief distinction in the intellectual powers of the two sexes is shewn by man’s attaining to a higher eminence, in whatever he takes up, than can woman—whether requiring deep thought, reason, or imagination, or merely the use of the senses and hands. If two lists were made of the most eminent men and women in poetry, painting, sculpture, music (inclusive both of composition and performance), history, science, and philosophy, with half-a-dozen names under each subject, the two lists would not bear comparison. We may also infer, from the law of the deviation from averages, so well illustrated by Mr. Galton, in his work on ‘Hereditary Genius,’ that if men are capable of a decided pre-eminence over women in many subjects, the average of mental power in man must be above that of woman.”

“It seems at first sight a monstrous supposition that the jet-blackness of the negro should have been gained through sexual selection; but this view is supported by various analogies, and we know that negroes admire their own colour. With mammals, when the sexes differ in colour, the male is often black or much darker than the female; and it depends merely on the form of inheritance whether this or any other tint is transmitted to both sexes or to one alone. The resemblance to a negro in miniature of Pithecia satanas with his jet black skin, white rolling eyeballs, and hair parted on the top of the head, is almost ludicrous.”

I wouldn’t call it “enjoyable” to read such assertions–to encounter work after work (especially ones written from a position of authority, be it scientific, religious, or political) making such petty, ignorant comments at the expense of other human beings–but as a student of literary history, I find neither of these to be shocking or exceptional prejudices. They hurt, granted, but they hurt in large part because they attest to much broader histories of exclusion and oppression. I do tend to forget, however, that many others have a different relationship with persons of note: a relationship that tends to cushion the individual from their context whenever we like something that individual did. And indeed, the customer who’d first asked about my reading was deeply troubled by my summary. “Darwin said that?” he said. “Darwin believed that?”

I tried to emphasize that Darwin’s comments did not erase his many positive contributions, but the damage was done. To try to offset these uglier aspects of Darwin’s biography, I then blundered further, by pointing out that even prominent early-20th-century suffragists, women who made great strides towards gender equality under the law, still advocated (as a great many did at the time) for eugenics policies–but this only saddened the customer further.

Now, by no means do I consider this customer’s reaction unique, but it was affecting, and I am more familiar with the other side of this flawed argument: people, that is, who will dismiss any significant contribution by a prominent individual because of some perceived failing elsewhere in their biography.

Last year, for instance, while studying for my first major exam, I made the mistake of marvelling at an historical echo: comparing, that is, John Stuart Mill’s succinct moral argument against Christianity (as found in his 1873 Autobiography, describing his childhood move from religion) with the equally succinct moral argument against Christianity used by Christopher Hitchens in more recent debate. Both regarded the notion of vicarious redemption through Christ as morally bankrupt, so the only real difference was that Hitchens could add, through a conservative estimate of the age of our species provided by modern anthropology, the absurdity of believing that a loving god watched “with folded arms” for some 95,000 years before acting to redeem the species, and even then only through barbaric sacrificial rites.

My fundamental point entailed how little had changed in these arguments–how vicarious redemption was an affront to young Mill in the early 19th century just as it was to seasoned Hitchens in the early 21st century–but my colleague interjected by shifting the conversation. This person was incredulous that I would invoke Hitchens at all, with his foreign policy views being what they were–and didn’t I know what kind of uncomfortably antiquated views he once shared about working women and motherhood?

My customer’s implicit tethering of historical significance to modern moral character, as well as my colleague’s dismissal of an argument on the basis of the speaker’s other beliefs, both rely on a fallacious connection between a person’s assertions in a given field, and that person’s actions in another. This isn’t to say that there is never transference between spheres (for instance, a researcher does not lose their knack for researching just by changing the topic of their research) but the existence of such transference still needs to be demonstrated unto itself. (So to carry forward the analogy, if a researcher who’s demonstrated excellence in one field comes out with a book involving another field, but that work lacks proper citation for all major claims therein, we would be safe in assuming that an adequate transfer of pre-existing research skills to new topics had not been demonstrated.)

These troubles of course resonate with that well-known philosophical fallacy, argumentum ad hominem (argument [in reference] to the man [doing the arguing]). But to invoke this fallacy on its own is, I think, to overlook the bigger picture: the powerfully human frustration many of us share with the acts of hero-worship we as individuals and as communities reinforce every day.

One of my favourite examples of this tension lies with Paracelsus, the 16th-century physician who railed against the practice of accepting the truth of a given medical claim based on the prestige of its original author. Instead, he argued that the human body had its own store of healing power, that diseases could be identified by predictable sets of symptoms, and that personal experimentation was thus to be preferred to taking the word of someone, say, in fancy dress, boasting cures made of exotic ingredients, who had simply studied the words of ancient healers in selective institutions of learning.

But as Paracelsus became popular for his resistance to classist medical practices (since the mystification and centralizing of medical “knowledge” only really served the interests of gentleman practitioners), his own ego, in conjunction with an eagerness among many others to defer to perceived authority, meant that, even as he championed self-knowledge, Paracelsus was also quick to declare himself a monarch of medical practice, and so to gain followers in turn.

While Paracelsus’ birth name, P. A. T. Bombast von Hohenheim, is not actually the source of the term “bombastic”, “Paracelsus” itself means “beyond Celsus” (the Roman medical writer). Despite Paracelsus’ motto (alterius non sit qui suus esse potest: let no man be another’s who can be his [own instead]), such self-aggrandizement gained Paracelsus many devotees well after his death.

In essence: Whenever it garners popularity, even resistance to groupthink can generate a sort of groupthink of its own.

The 19th century played its role in glorifying this human tendency, too. Thomas Carlyle’s “Great Man” theory of history–a way of constructing cultural mythology that fixates on narratives of individual virtue and genius–still pervades our thinking so thoroughly that we tend to pluck our “heroes” from their historical and cultural contexts, or otherwise strip them of the fullness of their humanity, in order to exalt specific contributions they might have made. The potential for error here is twofold: 1) in treating any human being as perfect, or approaching perfection, due to the significance of their words and actions; and 2) in condemning entirely the work of any person who, once exalted, is thereafter found to be (shockingly) an imperfect human being.

But therein lies the difficult catch: What if someone else–or a whole community of someone-elses–has already committed the first error? What if you’re born into a culture that already exalts certain human beings as essentially without fault, either by claiming them to be virtuous directly or by downplaying all the problematic aspects of their life stories?

How can we counteract the effect of this first error, save by risking the second?

This is no idle, ivory-tower conundrum, either: Whenever we uphold the merit of an argument through the presumed impeccability of its speaker’s character, we leave ourselves open to losing that argument the first time its speaker’s character ceases to be impeccable. And yet, we cannot allow people whose “imperfections” perpetuate serious social harm, whether through word or through act, to remain in positions of authority, either. So what option remains?

More history seems to me the only answer: The more we understand and accept the fallibility of all our most notable figures, the more we can dismantle routines of hero-worship before they ever get so extreme as to require the fallacious distraction of character assassination in the first place.

Now, obviously this kind of work runs at odds with many spiritual beliefs: beliefs in living representatives of a god on earth; beliefs in a human being who is also a god; and beliefs in human beings who claim to have transcended to another plane of existence, be it through yoga, meditation, or drugs. But even most people who would consider themselves spiritual can appreciate the danger of charismatic leader-figures–the present-day godhead of Kim Jong-Un; the Stalins and Pol Pots and Hitlers of history; the Mansons and the Joneses of smaller, still devastating cults. So there is some common ground from which to begin this conversation-shifting work.

What we now need to put on offer, as a culture, is a way of valuing significant social contributions unto themselves. When we separate those contributions from the maintenance of individual reputations, we only further benefit society by making the process of refining those contributions easier down the line. Likewise, we need to acknowledge figures of note in the most dignified way possible: by not erasing their personhood in the process. When we allow even those who contribute significantly to their communities to continue to be seen as human beings, and therefore ever-in-process, we make the path to positive social contribution seem less unattainable (and hazardous) for others.

Granted, hero-worship is an understandable cultural norm. Many of us want to be inspired by the work of human beings who’ve come before us, and want to imagine ourselves as the potential site of inspiration for others in turn. But whether our hero-worship is fixed on a record-breaking athlete, or a soldier decorated for valour, or a scientist who made a significant breakthrough that will save thousands of lives, or an activist who stood up to oppression in a way that rallied others to their cause, or a community organizer or family member who, in their own, lesser-known way made a terrific impact on our quality of life… hero-worship still sets an untenably high standard for us all.

When that athlete emerges as a perpetrator of rape, or that soldier is found to have tortured prisoners during their tour of duty, or that scientist to have plagiarized prior work, or that activist to have resorted to brutal acts against civilians in their resistance efforts, or that community organizer or family member to have molested children, we are all rightfully devastated. And yet, even then, we tend to get defensive, and our knee-jerk response is often to make excuses for the individual–as if histories of significant action can ever be reduced to stark lists of pros and cons. No, X hours of community service do not excuse the predation of Y children; and no, X impressive rescue missions do not entitle anyone to Y assaults on inmates.

But if we really want to nip such heinous rationalizations in the bud, what we need is a better social narrative for human contributions in general. Here, then, are a few suggestions as to actions we can all take to deflate the culture of hero-worship that muddies the waters of so many critical conversations. If you have others, I welcome their addition in the comments:

1) Practise making biographical assertions without using the rhetoric of relativism, even (or especially) when those biographical notes are ugly. For instance: (a) David Hume held deeply racist views about non-white persons. (b) David Hume’s racist views, and his expression of them in his writings, were commonly accepted in his culture. (c) David Hume’s writings include significant contributions to the history of philosophy. Not “BUT these views were commonly accepted” and not “BUT David Hume’s writings include”. Ask yourself, too, why such rationalizations seemed relevant in the first place.

2) Do not deny your revulsion at the destructive words and actions of your fellow human beings–not even those who have long since passed on. Do ask yourself what destructive behaviours future humans might be equally repulsed by among people of our day and age. How much do our words and actions really differ from those of past figures of note? What is the most effective way to forward a given conversation without recapitulating their errors?

3) If spiritual, put aside notions of divine inspiration when assessing the conduct and argumentation of religious leaders and historical icons. Is their conduct and argumentation impeccable (that is, free of the flaws we see in other human beings)? If not, ask yourself what benefit is derived from shielding these flaws under notions of divine sanction. And what are the risks?

4) If not spiritual, consider a prominent figure you find yourself defending the most in conversation. Are you defending the validity of the person’s arguments, or the person’s character (with the implication that by defending the person’s character you’re still defending the legitimacy of their arguments)? If the latter, why, and to what end? How does this forward meaningful discourse?

Hero-worship starts early, and our media culture is exceptionally good at building people past and present up to untenable standards of excellence. Once there, we often defend the reputations of these “Great People” so zealously that we limit our ability to build upon their greatest contributions, or else bind their characters and their contributions so tightly together that when the former falls, so too, in the public eye, does the relevance of the latter.

If any single, pithy adage could thus sum up the quality of discourse possible in such a culture, it might read: “Great minds discuss ideas; average minds discuss events; small minds discuss people.” Eleanor Roosevelt’s name is most often associated with this assertion, but it wouldn’t matter one whit to the quality of this statement if someone else had said it first.

…Which is a relief, because the saying has a far older, most likely anonymous provenance. So without denying the many difficult and outright ugly histories that surround our achievements, I have to ask: How many of our best works might be easier to build upon or amend if we could just get past the celebrity-status (for better or worse) of any human beings therein involved?

On Being Human: Johansson in a Tale of Ash and Mist

Under the Skin
Jonathan Glazer
Mongrel Media / Film4 & BFI

Our fear of the unknown forms the emotional basis for many films, and our fear of the unknown in ourselves, even more. Rare is the film, though, that explores both without offering facile resolutions, and that makes constrained and intelligent use of all filmic elements to uphold its central themes.

Under the Skin lives up to its title–getting under its viewers’ skins in many ways–by being just such a rarity. The concept is fairly straightforward: Scarlett Johansson plays an alien in a human skin, whose central task seems to be the seduction of adult men, who are thereafter ensnared in an ultimately fatal process of absorption. But nothing about this situation fits tidily with notions of what it means to be, and to not be, a human being.

For one, our protagonist is no 1950s femme fatale, delighting in the ruination of men, but rather a methodical worker (under the watch of mysterious non-human men on motorcycles) whose human mannerisms are at first only present so long as a potential target is in sight. And even the choice of target proves unsettling in its reversal of cultural stereotypes: from within her creepy white van we’re made to view isolated men of all ages as potential victims–even though, at first, we’re led to believe that our alien never takes anyone by force, and though she certainly never shows superhuman strength.

Moreover, while our alien hones her skill at this task, her indifference to everything else around her is surely meant to provoke the audience, to prompt viewers to plead with her: Be more human. Please, please, please, be more human. But even the most wrenching acts of indifference are turned, in the end, against us and our supposed humanity. In the second act, for instance, our alien picks up an isolated man and turns her now well-honed charm on him–only this time the man (played by Adam Pearson) has a severe facial deformity (neurofibromatosis), which changes entirely the significance of being treated as an object of desire by a more normatively beautiful human being.

Any viewer at this point should rightly feel uncomfortable about the tension at work on screen: If our alien treated him with the same revulsion over his external appearance that other human beings do, he wouldn’t be at risk of the same fate at her hands. But in treating him as though his external appearance should have no bearing on his fundamental worth, our alien performs a level of human equality actual human beings routinely struggle to approach.

The film revisits this tension between the tender and the brutal in our “humanity” at later, even more critical junctures, but this tension would not be possible if two things were not true throughout: For one, alien though our protagonist is, she can and does experience her environment–its sights and sounds and multitude of idiosyncrasies. Without this capacity to be impacted by her time among us, the plot could not significantly advance. And yet (for another), even as our alien shares with us this one certain trait, we have to be confronted time and again with opportunities for her to act more human in consequence (and for us to hope she acts more human in consequence) and to have each and every opportunity snatched from us in the end.

To achieve and maintain this relentless tension, Under the Skin needed to be both aesthetically striking and subtle–and it was. At one juncture, when our alien is driving through the streets of Scotland, the soundtrack takes on the breathy character of someone in a spacesuit–because, in a way, what’s mundane to us is space exploration for her. Intimations of alien physiology are likewise dealt with in delicate asides–through passing comments about heat and cold, and our alien’s attention to the smallest of insects around her–while a sequence of ash and mist becoming indistinguishable before the camera is a classically understated visual affirmation of the film’s central theme.

The sequence for male ensnarement, riffed upon three times throughout the movie, is a similarly minimalist affair that only makes the horror of the whole situation all the more startling, and grotesque. “I’m dreaming,” says the last potential victim, and when our alien agrees I realized that this whole visual metaphor was just that: the closest the human mind could come to describing an entirely alien process of predation and destruction.

The vocabulary of cinematic art being what it is, I can thus understand why certain elements (especially one dance scene) might be considered “Lynchian”, or the film compared in other ways to classics like Altered States and 2001: A Space Odyssey. But such comparisons amount to little more than irrelevant shorthand; the film more than stands on aesthetic merits all its own. In the thematic tradition of all great science fiction, Under the Skin is a slow-building exploration of our fidelity to certain notions of what “being human” means. You’ll find no easy answers here: just one hell of a nerve-wracking chance to sit with, and perhaps confront, the alien within.

A Quick Excerpt from the Morning’s Readings

Nothing like a bit of 19th-century libertarianism to start the day!

This is from The Bridgewater Treatise of chemist and geologist John Kidd, published in 1833. At this time, as I mentioned in my last essay, social prescription is very much bound up with notions of “natural” law, which of course carries the implication of divine sanction. Kidd in particular is writing about why the “poor laws” go against the law of nature: they compel the rich to pay for certain institutions of “relief”. Sound familiar?

(Warning: 19th-century writing is often dense and the syntactic rules sometimes differ, especially when it comes to the liberal use of commas.)

In the mind of the pauper, with all his challenging and all his boisterousness, there is still the latent impression, that, after all, there is a certain want of firmness about his plea. He is not altogether sure of the ground upon which he is standing; and, in spite of all that law has done to pervert his imagination, the possessory right of those against whom he prefers his demand, stares him in the face, and disturbs him not a little of that confidence wherewith a man represents and urges the demands of unquestionable justice. In spite of himself, he cannot avoid having somewhat the look and the consciousness of a poacher. And so the effect of England’s most unfortunate blunder, has been, to alienate on the one hand her rich from her poor; and on the other to debase into the very spirit and sordidness of beggary, a large and ever-increasing mass of her population. There is but one way, we can never cease to affirm, by which this grievous distemper of the body politic can be removed. And that is, by causing the law of property to harmonize with the strong and universal instincts of nature in regard to it; by making the possessory right to be at least as inviolable as the common sense of mankind would make it; and as to the poor, by utterly recalling the blunder that England made, when she turned into a matter of legal constraint, that which should ever be a matter of love and liberty, and when she aggravated tenfold the dependence and misery of the lower classes, by divorcing the cause of humanity from the willing generosities, the spontaneous and unforced sympathies of our nature.

See? Nothing changes: We have to deal with the same rhetoric today–the naive notion that if the government would just butt out, instead of compelling people of means to support a social safety net for all through taxation, charity of a personal nature would easily reassert itself to fill the gap, and the world would be in better balance.

The very next year, on the back of such grievances, the “New Poor Laws” would come into effect. Especially in their earliest conception, these would be harsh measures, essentially further penalizing the poor for being poor. You might be familiar with the outcome of these laws from such works as Dickens’ Oliver Twist and A Christmas Carol; the workhouses he describes in each were first established under this 1834 legislation.

Back to a long day of reading for me!

Greetings to the Swarm!

Hi folks!

I’ve seen a huge spike in readership on this little blog in the last few hours, thanks to the very kind and unexpected promotion by Dr. Jerry Coyne of an essay I wrote in response to a review by Michael Robbins. Huge surges come with consequences, though, so I’d just like to make a few quick points.

1) Welcome! Thanks so much for adding my blog, and I hope some of my future posts will prove as interesting to you as this essay clearly did. I’m certainly looking forward to writing more posts about my readings now that I’ve had some success translating the relevance of 19th-century literature for a 21st-century audience–so thanks for that boost in confidence!

2) As I mentioned in my original comment, feedback is very much welcome. I’m a second-year doctoral student of English literature at Wilfrid Laurier University, studying styles of science writing in the nineteenth century. This means that I am a scholar in process, and with any luck I’ll remain a scholar in process throughout my life. I am therefore not presenting myself as a definitive authority; just as a life-long learner interested in promoting a conversation that involves the fruits of my research to date. I will make mistakes, and I will hopefully be in a position to own up to those mistakes so as not to derail the conversation.

3) More to the point, I will always strive to maintain a courteous tone in conversation, and ask that commenters here do the same. As a student of English literature, language clearly matters to me, but that doesn’t mean I’m not going to make mistakes. To this end, I welcome all comments about rhetoric and vernacular (as well as content, of course) that are not forwarded in bad faith. In turn, I’ll try to signal any amendments I might make with clear “EDIT” markers in the original text.

4) That said, I’m in the middle of an intense reading list going into my final proficiency exam, so I cannot engage in much online discussion for the next while. I will try to post my readings and reflections here more often in the coming weeks, but I’ll have to beg patience if my participation in comment threads is inconsistent until after the end of August.

5) Again, welcome! A huge surge in reading numbers can be a little terrifying, but I look forward to many fruitful exchanges in the months to come.

Cheers and best wishes to you all.

Enough Already: The Anti-Atheist Article Shows Its Age

Michael Robbins, writing for Slate Magazine, recently contributed to that most robust literary genre, the anti-atheist op-ed, with a review of Nick Spencer’s Atheists: The Origin of the Species. “Review” might even be too strong a term for this piece; though the book is touched upon, its formal introduction is buried, and assessments of the text itself are routinely subordinated to Robbins’ own views on science and religion.

To this end, Robbins draws from what we’re sometimes left to assume are Spencer’s arguments, as well as standoffs with atheists (including those from that great bastion of critical discourse, the online comment thread), to make a broader set of claims: that today’s popular atheists are out of touch with the history of atheism; that these atheists just don’t “get” the point of religion, which is clearly all about metaphor, not explanatory power; and that if they truly “got” the history of atheism, modern atheists would understand that any secular morality is only a fragmented descendant of religious morality. For Robbins, then, Nietzsche is the ideal atheist–an atheist who felt that a world without a god is horrifying, and deserves to be mourned.

Such articles always have their easy cannon fodder, with the likes of Dawkins or Hitchens thrown in as de facto examples of what Robbins terms “evangelical atheism” and others have termed “militant atheism”. These terms almost never appear with any sort of textual evidence (for instance, in what way “evangelical”–knocking on doors to spread the good word of atheism? and in what way “militant”–agitating for the persecution of believers?), and so serve as little more than caricatures in an already highly-caricatured debate.

Other terms in Robbins’ article are likewise predictably heated, with “Dawkins and his ilk” identified as the “intellectually lazy” successors to Spencer’s history. This generic flogging of Dawkins should be a warning for anyone seeking insightful commentary about science and religion; it only signals for the reader that this piece is not going to concern itself so much with ideas as with the people who forward them. Robbins even ends his article with a quote lamenting the lack of such ideas-based discourse–“Everyone is talking past each other and no one seems to be elevating the conversation to where it could and should be”–without expressing any self-awareness as to how rhetoric like his keeps this conversation off-point.

And yet, such rhetoric is by no means novel: I haven’t read Spencer’s book, but Robbins references people being termed “atheist” who would more likely be regarded as theists today. He doesn’t go into great detail on this point, but if his source text is indeed a good summary of the history of atheism, it should address instances throughout that history when the term was used as an insult or threat, either to demarcate people whose beliefs differed from the status quo, or to identify those who held their disbelief too strongly.

In an 1874 essay, for instance, Thomas H. Huxley notes that Descartes–who in the 17th century worked extensively to rationalize the existence of a god–had been considered an atheist by the Jesuit community. Meanwhile, Huxley himself, though best known today as “Darwin’s bulldog” and a strong advocate against religious interference in scientific progress, did not identify as an atheist; rather, he sneered at those who took up that term as being too sure of themselves–just like folks who rail against the tenor of “new” atheism today.

By introducing Huxley to this discussion, I should surprise no one by adding that I write my concerns about these relentless anti-atheist pieces as a doctoral student of nineteenth-century science writing. It is from this same critical focus, as well as from my position as a human being with eyes and ears in the world, that I take issue with many of the arguments treated as self-evident truths in articles like Robbins’. Even putting aside the obvious strawman tactics, Robbins’ central arguments, drawn in part from Spencer’s text, just don’t hold historical water. I cannot comment on Spencer’s original framing, since I’m receiving his text through a powerful filter, but Robbins’ arguments are slippery enough as is. He writes near the outset:

Spencer’s point, of course, is that this received wisdom is naive nonsense—it gets the history of science and the nature of religious belief wrong, setting up an opposition between reason and faith that the church fathers would have found rather puzzling. … Few historians take this myth seriously, but it retains its hold on the vulgar atheist imagination. To believe it requires the misconception that religion exists primarily to provide explanations of natural phenomena.

Yes, the early church fathers believed that “reason” was that which brought you closer to god; that there could be no reason without god, such that the idea of dividing the two was incoherent. We have to remember that Aristotelian logic in particular dominated the Western world for almost 2000 years before other modes of evaluation gained a significant foothold; in this system an argument could be structurally valid, but its soundness still relied on the accuracy of its premises–and there were a heck of a lot of “common-sense” premises in that era that we know today are not phenomenologically accurate.

(Robbins even cites one such common-sense premise later, when he presents the idea of a “universe from nothing” as a concept intrinsically meriting contemplation. Though he himself argues that “since the very beginnings of Christianity, Basil, John Chrysostom, Gregory of Nyssa, Augustine … all assumed that God’s creation was eternal, not something that unfolded in six days or any other temporal frame”, he does not consider the possibility that a “universe from nothing” might thus be nonsensical. Certainly, recent research attests to even the “vacuum” of space being occupied, such that there’s no evidence we ever sprang up from true philosophical “nothingness” in the first place, but it’s just striking to note how pervasive such “common-sense” incoherencies remain today.)

Suffice it to say, then: Yes, “reason” today is a much more secular term, involving fewer a priori assumptions than its precursor. But what Robbins really overlooks is that this shift in meaning was a difficult transition, precisely because the Judeo-Christian god was expected to have explanatory power for natural phenomena. Medieval texts in particular are rife with this thinking–our very world a direct, macrocosmic extension of the human microcosm, with the stars overhead placed there so that we might read our destinies in them.

But at the turn of the 19th century, the earnest pursuit of natural theology–that is, the practice of evidencing the Judeo-Christian god through such self-centred studies of nature–started to lose its footing. Though late-18th- and early-19th-century geologists and astronomers were careful for decades not to present their findings in such a way as to stir up public controversy about the overwhelming divergence of empirical data from Biblical record, the accumulation of so many dissenting data points could not be ignored forever.

Natural theology didn’t go down without a fight, though: At its 19th-century height, a series of (originally) eight treatises–The Bridgewater Treatises, written by gentlemen of considerable standing in science communities–were issued in the 1830s to assert the persisting explanatory power of the Bible in relation to the natural world. Nor was this whole push happening on the margins of socio-religious discourse; though Robert Chambers’ Vestiges of the Natural History of Creation (1844) offended many clergymen for its deviance from Biblical record to account for new geological data (and early scientists for its sloppy, credulous, lay-person reporting on findings from other fields), it remained a popular work throughout the century, undergoing a dozen editions and even attaining a royal audience. The book makes appeals for the existence of a god despite all apparent evidence of absence in the natural world, but Chambers’ is a much-diminished godhead, a Deistic omniscience “behind the screen of nature”, who exists and acts in pointed contrast to the personally-involved creator believed in by so many of the day.

I should emphasize that this was all going on prior to Darwin’s On the Origin of Species (1859) and The Descent of Man (1871), the latter of which caused trouble by spelling out that, yes, evolutionary theory really did apply to human beings, too! For a while, the old age of the earth in relation to Biblical narrative could be accounted for by there being multiple periods of flood and upheaval, with the Biblical flood being just the last of this series. However, even the universality of that flood was falling apart under empirical scrutiny, and this had serious theological implications for the story of Adam and Eve–the critical touchstone on which all notions of redemption through Christ were based. If all the floods were regional, did the sin of Adam and Eve only touch a particular lineage? And later, when the theory of evolution came into the picture, when did Adam and Eve sin in this gradual progression of species?

Robbins goes on to assert that a definition of religion “must surely involve reference to a particular way of life, practices oriented toward a conception of how one should live” but then disdains the claim that religion “is a scientific theory,” “a competing explanation for facts about the universe and life” (quoting Dawkins). Indeed, Robbins sums up his opinion on that view as follows: “This is—if you’ll forgive my theological jargon—bullshit.”

It’s not simply that Robbins (and others like him; his article is merely representative of many more) is inaccurate when he presents Christianity as he then does, as an allegorical exercise throughout the ages, absent real-world interaction with empirical input. Rather, in doing so he also erases a powerful history of human struggle–among theists and atheists alike. Reading 19th-century texts as an atheist myself, I’ve always been struck by how difficult understanding one’s moral duty becomes when known facts about the natural world change so dramatically, and when the question of what your god wants of you becomes so convoluted in consequence.

For instance: 19th-century England being a hotbed of poverty and disease, works of fiction, religious pamphlets, and opinion pieces were at odds over whether state reforms to improve the lot of the most vulnerable went with or against the Christian god’s plans. Since the natural world was so full of suffering, but nothing happened that the Christian god did not will into being, maybe suffering and income disparity were meant to exist–to give some humans a chance to practise humility, and others to offer charity?

Alternately, was this god showing his condemnation of industrialized England? Was this god awaiting human action to repair what humanity had wrought? Was that the reason for the rampant spread of disease, and the difficulties disposing of so much waste? But if so, why were so many innocent children suffered to be born into this system, when their isolated circumstances rarely gave them a chance to accept Christ before their untimely deaths?

And what about all those new animals being shipped to England from all over the world–more species than could ever have fit on the Ark? What did we owe our fellow species if we were all actually part of one long chain of being? Was vivisection unjust, or were we still the divinely-chosen stewards of the Earth, as Genesis suggested, entitled to do with the rest of the world as we wished?

While Robbins insists on this separation between religion as “a conception of how one should live” and any conception of the world drawn from empirical evidence, he also argues that, since religious moral traditions have historically preceded secular moral traditions, movements like humanism are simply degenerate versions of religious belief (drawing here, in part, from political theorist and atheist John Gray). This puts Robbins in murky territory around the notion of moral education–but thankfully, this murk is easily cleared up when we move past the false dichotomy of science and religion being the only possible answers as to morality’s origins.

Simply put, even just looking at the range of Christian cultures in existence today, we see moral divergence: Some support the death penalty; some do not. Some have marriage equality; some tacitly or even overtly sanction the incarceration, torture, and murder of gay persons. Some believe in equality between the sexes; some believe in strict gender roles. Some believe in genital mutilation for children of both sexes; others for children of one; others for children of none. Some believe in using the state to provide a basic social safety net to help everyone in times of distress; some believe that charity should start–and end–in independent efforts through the church and at home.

But I don’t doubt that I could ask Christians in each and every one of these cultures where they get their sense of how one should live, and receive the same answer: “From the Bible.” Or possibly: “From Jesus.”

Similarly, evidence about the natural world is only as good as the culture that receives it. For some, the idea that certain brutal acts are present in nature is enough to suggest that we should sanction those acts in human societies; for others the brutality of nature is as good a self-serving incentive as any to build a better, safer community for all. For others still (perhaps those more attuned to suffering around them), the way the world is should simply never preclude us from trying to shape it otherwise.

Which brings us to the shape of our culture–this digital, Anglocentric, North American community in which we see time and again the popularity of articles like Robbins': anti-atheist rhetoric by an author who nevertheless claims to want a more thoughtful discussion, a discussion in which atheists and theists are speaking directly to one another instead of over each other’s heads. But in a review that centrally castigates a caricature of modern atheism on a poorly-evidenced charge of historical ignorance, Robbins instead evades important histories of his own: histories of thoughtful theists, learned and layman alike, who over the last two millennia looked to the natural world assuming it carried literal Biblical histories both within and upon it.

Robbins and similar religious writers try to write off such theists as mere fundamentalists, and accuse atheists of targeting the “low-hanging fruit” of Biblical incoherence and Creationist nonsense instead of tackling “sophisticated” arguments like David Bentley Hart’s, which involves a “ground-of-all-being” god-concept: ineffable, Deistic, (still male), yet somehow of personal relevance when contemplating how best to live. But for all these attempts to place the god debate outside the world we all live in, the great bulk of Judeo-Christian history still lies with those theists who believed in a personal, present, and active creator as described in the Bible, even as both the natural world and the weight of social history revealed less and less synchronicity with Biblical descriptions and prescriptions over time.

Diminishing the reality and diversity of such Biblical adherents–and thus dismissing consequent atheist concerns about how to build a better society when people still believe in this sort of god when making political and personal decisions–isn’t even “talking past each other”; it’s denying the full and profoundly human range of voices at the table. Surely we’re capable of more.

Reading Note: Robert Chambers, Vestiges of the Natural History of Creation

I usually post my doctoral reading notes on Facebook–a bizarre move on my part, since status updates really aren’t made for longer observations–but I’m going to try to publish future notes here instead. I’m just under two months from my last doctoral exam (which will be followed by an oral defense the week after), so synthesis of my studies is of the utmost importance these days.

To this end, the essence of today’s reading note is simple: Sometimes I’m struck by how quickly certain ideas have been adopted into the broader cultural consciousness in relation to others. Reading works from periods in tremendous paradigmatic flux just drives home how fragile and uneven our progress towards a better understanding of the world always is.

In 1844, well before Darwin’s tentative accumulation of facts bearing on the emergence of diverse species, a book called Vestiges of the Natural History of Creation sparked a heated cultural debate. Not for its blatant, Eurocentric racism, which was depressingly standard for the period, but because it tried to align the weight of geological, cosmological, anthropological, and geo-linguistic discovery with Christian beliefs… an exercise that really never works out in religion’s favour. To account for the overwhelming evidence of forms emerging from other forms in increasing diversity over deep time, the book’s anonymous author (Robert Chambers, as it later turned out) conceded that it would be inane (if not also insulting to the concept of omniscience) to posit a creator who personally acted in the creation of each life-form, instead of one who created laws for the universe–generative “expressions” of his will–and let those laws play out in accordance with a greater, unseen design.

The author’s eventual arguments around the existence of “evil” are predictably clumsy in consequence: It’s not the Judeo-Christian god who makes/allows for suffering… it’s just a collision of his various divine laws, set in motion at the beginning of time and untouched since, that necessitates the creation and destruction of so many “inferior” forms, and which has children and other innocents enduring sorrow and strife through no personal fault (rather, in consequence of others neglecting natural/divine laws–even laws that might be unknown at a given time, like the appropriate hygienic measures to take in response to plague). Oh, and then there’s something about how war is an exceptive case stemming from behaviours that are on the whole good for human advancement–as if it would be impossible for an omnipotent being to construct such laws in a way that prevents such heinous extremes?

To be sure, this sort of deism always yields muddled rationalizations, and in consequence, though many Christians enjoyed the text, Vestiges received heaps of scorn from many other Christians who disdained the whole argument as almost atheistic in its excuses for the natural world being so very much at odds with revealed scripture. You can see for yourself why that accusation might be forwarded, after passages like the following:

It will occur to every one, that the system here unfolded does not imply the most perfect conceivable love or regard on the part of the Deity towards his creatures. Constituted as we are, feeling how vain our efforts often are to attain happiness or avoid calamity, and knowing that much evil does unavoidably befall us from no fault of ours, we are apt to feel that this is a dreary view of the Divine economy; and before we have looked farther, we might be tempted to say, Far rather let us cling to the idea, so long received, that the Deity acts continually for special occasions, and gives such directions to the fate of each individual as he thinks meet; so that, when sorrow comes to us, we shall have at least the consolation of believing that it is imposed by a Father who loves us, and who seeks by these means to accomplish our ultimate good. Now, in the first place, if this be an untrue notion of the Deity and his ways, it can be of no real benefit to us; and, in the second, it is proper to inquire if there be necessarily in the doctrine of natural law any peculiarity calculated materially to affect our hitherto supposed relation to the Deity. It may be that while we are committed to take our chance in a natural system of undeviating operation, and are left with apparent ruthlessness to endure the consequences of every collision into which we knowingly or unknowingly come with each law of the system, there is a system of Mercy and Grace behind the screen of nature, which is to make up for all casualties endured here, and the very largeness of which is what makes these casualties a matter of indifference to God. For the existence of such a system, the actual constitution of nature is itself an argument. The reasoning may proceed thus: The system of nature assures us that benevolence is a leading principle in the divine mind. But that system is at the same time deficient in a means of making this benevolence of invariable operation. 
To reconcile this to the recognised character of the Deity, it is necessary to suppose that the present system is but a part of a whole, a stage in a Great Progress, and that the Redress is in reserve. (281-2)

In short: Everything we’ve discovered doesn’t really match up with Christian doctrine, but since denying the natural world would mean denying a creator anyway, the only rational move of the devoted Christian is to have faith that there is a grander design at work than meets the eye (or than is revealed in nature). Such a concession that natural theology isn’t going to offer definitive proofs of a god unto itself would have been (and indeed was) a tremendous affront to many prominent Christian thinkers of the time.

Now all this was fairly old-hat to me–the difficult, 19th-century push-pull between religion and empirical discovery–but after some 100 pages of fairly coherent documentation around recent geological discoveries, I kept stumbling over other aspects of this immensely popular text–ones that illustrated a far more nebulous understanding of the world. These included the author’s credulous assertion that one amateur scientist had made insects appear from electric current alone (a purported instance of spontaneous creation), and the claim that skin colour and skeletal structure could be changed just by moving to a different world region, and the blatant struggle to explain how macro-life-forms sometimes seemed to appear from nothing, or else adopted a different taxonomical make-up from season to season.

Put simply, the reverend believed, as did many in his time, that there was a latent capacity for life to move from “inferior” to “superior” forms under the right circumstances (and again, obviously in this analysis, “human” was the superior form in the natural world, while “Caucasian” was the superior form within the species)… and that’s when I remembered that Pasteur’s formalization of the germ theory of disease, as well as Mendel’s pea plant experiments on trait inheritance, had not yet come to pass. Though Vestiges clearly attests to a culture primed for the theory of evolution–a theory that would not be articulated in full for another 15 years, and which still isn’t properly understood by multitudes today–it also marks the uneasy end of a time period before other, just as powerful scientific discoveries: discoveries that would take far less time to be incorporated into the cultural consciousness.

A hell of a lot has happened in the last 170 years, with our knowledge of the natural world growing by leaps and bounds, but when I read works like Robert Chambers’ Vestiges, I’m reminded how much such progress is always contingent upon popular context. As Chambers himself notes:

The reception of novelties in science must ever be regulated very much by the amount of kindred or relative phenomena which the public mind already possesses and acknowledges, to which the new can be assimilated. A novelty, however true, if there be no received truths with which it can be shown in harmonious relation, has little chance of a favourable hearing. (142)

In 1844, the geological record and immense biodiversity evidenced by global exploration demanded a new understanding of the world–and the groundswell towards that monumental paradigm shift in the sciences was already clearly underway. But over a century and a half later, many ideas not even properly anticipated in popular science texts like Chambers’ (like the existence of microbial organisms, as well as trait inheritance [Chambers holds that environment matters more, and shows no inkling of the existence of even a precursor concept to genes])–have taken firm and unquestioned root even in parts of the modern world where the theory of evolution, for all its cultural priming in years prior and since, still has not.

To me, this reads as resounding testament to scientific discovery not always being enough to improve human knowledge, especially when pitted against prevailing cultural mythologies like Judeo-Christianity. This in turn leaves me a bit haunted: What other scientific advances have we already made, or stand on the brink of making, that, for want of a more palatable social narrative, still won’t gain public support for decades yet to come?

A Complicated Convergence: Gender Fluidity and Trans Advocacy

It should be a truth universally known that only an asshole would advocate for policies that limit the quality of life for their fellow human beings. Ignorance might also be an excuse for such behaviour–but only to a point. As with all things in life, we can only try to make the best possible choices with the information we have on hand–but if that information ever changes, we have a responsibility to augment our views in turn.

With this in mind, I’d like to explore an important implication of recent cultural shifts toward supporting the transition of children to the gender they self-identify as having. But I want to do this with the full, emphatic understanding that, whatever causes a child to self-identify in a way that might necessitate state and medical intervention to accommodate their transition, the ability of this child to feel safe remains paramount. In no way whatsoever would I ever advocate for a child to be made to feel further distress because of a perceptual schism in their gender/sex identity.

What bears noting, however, is that in a culture where marketing for children’s products is heavily driven by gendered messaging (and perhaps even more so today than in other decades), prominent narratives of trans children in mainstream media uphold a rigid gender binary. In the stories of parents coming to understand their child’s gender-identity, we often see a linearity drawn between the child having preferences culturally-aligned with a differently-sexed person, and their parents using these cues to take the child’s gender-claim seriously. A child born female who identifies as gender-male might, for instance, have short-cropped hair in these articles, boast Batman paraphernalia, and be described as loving rough-and-tumble sports. Similarly, a child born male who identifies as gender-female might wear dresses, grow her hair out, and be described as showing an interest in dance.

There are many reasons why this might be the dominant approach for such stories. It might be that trans narratives like these fit best with the mainstream understanding of gender (that is, adhering to a gender binary), and so rank higher than a story, say, of a child born sex-female, who loves dresses and the colour pink and wearing long hair, yet self-identifies as gender-male. It might also be that this narrative is more comprehensible to the parents of a child with seemingly atypical preferences, who might find it easier to accept and accommodate these differences by simply regarding and reinforcing their child as belonging to the “other” gender.

But the more frequent reason in these articles is that a child just “knows” which gender they belong to–even at a very young age. And this is a messy argument, one that pointedly ignores a) how powerfully children are influenced by the media around them, b) how very much young children prefer easy and rigid categorizations (e.g. good person, bad person) and so need to be trained into a more nuanced view of the world, c) how much role play is a critical part of childhood development, and d) how ignorant children are about their own biology, let alone their performances of self within the world.

(And if anyone doubts that d) is true, try to remember the nonsensical views you or your peers held at young ages about how babies were made. As for b), noted queer sex columnist Dan Savage has a great anecdote in The Commitment about how his child, adopted at birth by Dan and Dan’s male partner, disapproved of gay marriage because a kid at kindergarten told him marriage was between a man and a woman. But really, anyone who’s been around small children should recognize this rigidity–whether it manifests in a sudden fussiness about new foods or an incessant need for simplistic reassurances about things being safe or not, good or bad. I have plenty of anecdotes just involving my eldest nephew, but I’m not going to get into them here.)

Absolutely, if a child comes to see a disconnect between their inner sense of gender and how they are identified by others, helping that child feel more comfortable with him- or herself, and advocating for a world in which that child will be safe performing whatever gender-identity they hold, is vital to the construction of a better society for all.

But when a child (or their parents) leans on a rather traditional set of gender stereotypes in order to assert this personal identity, we need to reflect on what this says about the persistence of a rigid, if also relentlessly culturally reinforced, gender binary. When we take biological components out of the picture (which we do when we talk about gender today, in contrast with “sex”), what does it even mean to be “male” or “female”? Does it mean to prefer UFC over knitting? Steak over salad? It can’t have to do with sexual preference, because very “male” persons love other very “male” persons, and very “female” persons love other very “female” persons. Does it mean conveying your love through monosyllabic grunts and indirect acts of kindness versus long conversations about feelings, hopes, and dreams? Does it mean preferring the works of Hemingway to Woolf? Tarantino to Nancy Meyers?

We know that there are some clear behavioural trends along sex-based lines, but the jury is ever-out on how much these trends would exist without cultural reinforcement. We know that communication styles and conflict-resolution styles differ widely in relation to culture, and that there are cultures in the world where male persons are regarded as masculine for performances that would seem feminine in North America–especially in relation to emotional displays and an interest in fashion. There is similarly nothing innately feminine about the colour pink; in other eras and cultures, we know the colour has just as intuitively been coded “masculine”, and it might in fact be making a comeback among men’s fashions in our own day and age.

More to the point, though, we cannot make definitive assertions about a person’s identity based on peripheral preferences and activities. A sex-male child who loves pink dresses at the age of five, for instance, might go on to claim any number of personal identities: transgender, gay, non-trans* heterosexual with gender-normative hobbies, or maybe even non-trans heterosexual with a passion for cross-dressing. All we can safely say about the child of five is that they currently love pink dresses. And really, what is up with the adult urge to imply more?

Again, I write none of this to deny the existence of children who live with a powerful disconnect between their understanding of gender in the world, and how they perceive themselves. But we do need to acknowledge that these children are nevertheless reacting to a heavily gendered world in the first place–a binary that leaves male, female, and inter-sexed persons negotiating personal performances of identity against overwhelming gender stereotypes their whole lives through. In consequence, whenever articles about trans children reinforce narratives of linear progression between a particular hobby or clothing preference and an inner gender identity, they can simultaneously forward the cause of trans advocacy while undercutting the fluidity of human experience, and limiting the potential for performative exploration to be a perfectly normal, non-deterministic facet of our lives.

For the record, I don’t identify as having an inner gender identity–just my natal sex and the gender others read off me. I am neither trans nor cis, though I know when most people see me on an average day, they will identify me as a culturally-normative woman, on account of my female-typical anatomy, which I make no attempt to suppress through reactive forms of dress. I know many of my personal experiences as a human being are directly tethered to being perceived in this gendered way by others–from experiences of sexist language all the way up to sex-based acts of violence–but I have no deep-down conviction that I Am Woman. Nor do I believe that I Am Man. I have a body–I am a body–but to say that my inner self is female would be, to me, as incoherent as saying that my inner self is brown-haired and brown-eyed.

This said, the Don’t Be An Asshole rule inevitably advocates for helping any person who feels a painful disconnect between their inner identity and their external performance to achieve a better harmony between the two. But in the process, I can still wish for a cultural landscape that doesn’t promote such schisms in the first place: a world where children can just have their colour, activity, and fashion preferences, whatever these might be, without worrying about whether they have the right gender identity to match. On this accord, for all our happy progress towards improving the quality of life for fellow, struggling human beings, even the most excellent of current trans child success stories still suggests that we have a hell of a lot of work to do.

*I use “non-trans” in lieu of “cis” because “cis” also signals a difficult binary that doesn’t encompass the full range of gender/sex identities.

The Scientist and the Writer

Between doctoral readings, I’ve been trying to squeeze in research for a personal project. Often these secondary readings make me second-guess myself as much as my primary readings (relentlessly, that is, which I consider an important, if excruciating part of the process), but The Bonobo and the Atheist was the rare text that made me feel right on track.

Frans de Waal is a primatologist whose attention to the linguistic skewing of primate research led to a significant (if gradual) transformation in how we view the social dynamics of our biological “next-of-kin”. For decades the primatology literature was impeded by two critical biases: The first involved the casual grouping of bonobos as a subspecies of chimp, despite significant social and physical differences between the two (socially, bonobos are matriarchal primates who primarily use social grooming and fluid sexual behaviours to maintain order, while chimpanzees are patriarchal primates with a strong predisposition to violence as a social mediator). The second involved an all-too-human tendency to read every social behaviour between members of other primate species as inherently selfish, amoral, and manipulative–language that therefore saves notions of altruism, morality, and nuanced politicking for the common good for us.

De Waal’s research, especially with bonobos, challenged this tacit division between our species and other primates. His books to date, including Chimpanzee Politics and Our Inner Ape, do exceptional work in outlining human-centric biases in many of our test parameters for “human” traits in other animals, as well as demonstrating the social fabric that other primate species–on both an individual and a group level–consciously and carefully maintain.

His latest book builds on this research by exploring the emotional roots of a range of socially-beneficial behaviours. These include the seemingly instinctive sympathy that manifests in incredible levels of care among other primate populations, as well as acts of immediate reconciliation (or reconciliation after long absences) in the wake of conflicts that caused strife or injury. They also include a means of stabilizing group behaviours that allows juveniles to get away with almost anything, but leaves adults anxious even in the absence of an alpha when they do something that subverts the usual societal rules. Also mentioned (of course) are the displays of grief that we’ve only recently allowed ourselves to accept as marking other primates’ behaviour when confronted with the spectre of death.

De Waal’s anecdotes from field research throughout his career are a delight to read, and his portraits of a few human beings involved in this sort of research offer similar insights into human narrowness. Early on, for instance, he describes the very sad life of one George Price:

George Price was an eccentric American chemist, who … became a population geneticist trying to solve the mystery of altruism with brilliant mathematical formulas. He had trouble solving his own problems, though. He had shown little sensitivity to others in his previous life (he abandoned his wife and daughters and was a lousy son to his aging mother), and the pendulum now swung to the other extreme. From a staunch skeptic and atheist, he turned into a devout Christian who dedicated his life to the city’s vagabonds. He gave up all of his possessions while neglecting himself. By the age of fifty, he was sinewy and gaunt like an old man, with rotting teeth and a raspy voice. In 1975, Price ended his life with a pair of scissors.

De Waal goes on to note that Price made the very common mistake of assuming that altruism and selfishness were polar opposites, when in fact (as de Waal argues for much of this book), plenty in nature seems to attest to a middle-ground: a kind of altruism that helps us as individuals by supporting the community at large, and which arises quite naturally in any species with mechanisms for parent-child bonding.

I call attention to this mediation of extremes because de Waal unfortunately does not extend this same principle to another major theme of his book. Originally I thought the very title, The Bonobo and the Atheist, referred primarily to de Waal (an atheist) in relation to his major topic of study, but when de Waal is not forwarding professional anecdotes about primate species or reflections on the religious paintings of Hieronymus Bosch (used in this volume as a kind of binding narrative thread, with varying success), he spends an inordinate and quite honestly baffling amount of time airing personal grievances about “Neo-Atheists”. I could not tell if this was simply an attempt to create a sense of tension in the work–pitting a strawman of American atheism against a defence of human spirituality, while simultaneously arguing that the preconditions for empathy and morality are already amply in evidence in other primate species without the need for supernatural awe and wonder–but regardless, suffice it to say, on this accord the book becomes a confusing mess.

From relaxed Catholic beginnings in Holland, de Waal writes of attending a god debate in America (a completely different socio-religious context) and seems to imply that questions on related matters, like whether or not morality is universal, aren’t also debated on different occasions. This, despite citing Sam Harris’s The Moral Landscape, which is reduced in summary to equivocation: Why pick on the genital mutilation of women in predominantly Islamic states when boys are circumcised right here in North America? A fair question–but also one that self-evidently acknowledges that the reach of “neo-atheist” discourse is much longer than simple existential queries.

De Waal then goes further: After cursory acknowledgement of the low status atheists have in American politics, he implies that these “neo-atheists” “sleep furiously” because of inner demons, and even asserts at one culminating juncture that he has a hunch the most outspoken are just reacting to childhood traumatization by recapitulating the same dogmatism in another form. You’d think, if this were an actual belief on his part, he would therefore have some compassion for even the most strident “neo-atheist” discourse, but he spends much of the rest of the book mocking modern atheist writers for failing to see any good in religion–even painting Alain de Botton, one of the most prominent atheist/religious accommodationists today, as only “grudgingly” giving religion a place at the table.

(And again: If I were to suggest that some group was only publicly protesting a pervasive social practice from a place of childhood trauma, I would… not take the approach that the traumatized individual just needs to embrace how hunky-dory the source of their trauma really is.)

Ultimately, de Waal compares “neo-atheists” to a group of agitators outside a screening of Titanic, loudly insisting to everyone leaving the theatre that the movie was a work of fiction, and the characters played by paid actors–as if everyone leaving the theatre didn’t already know this! As if they weren’t knowingly enjoying the movie on entirely different grounds! I have seen some damned good critiques of modern atheist movements in America, but this… this was an incredibly tone-deaf, wilful misreading of a religious context to which de Waal is a self-admitted non-native: a context in which, even if nearly half of Americans do actually understand the theory of evolution and accept the deep-time of our species and our planet, they sure as heck aren’t willing to acknowledge one of the most essential concepts in modern biology when answering survey questions about the origin of human beings.

(And don’t even get me started on the prevalent belief in demons and demon possession, concepts that yield horrible, horrible treatments of persons either simply exercising independence or suffering from mental illness.)

As a Victorian scholar, however, I also noted times when de Waal demonstrated that, scientist though he may be, interpreting the writings of historical scientific figures is not his strength. In his negotiation of T. H. Huxley on religion, I realized that the rhetorical strategies Huxley used in de Waal’s excerpting–rhetorical strategies that leap off the page to someone who routinely studies literature of that period–were taken literally, allowing de Waal to assert as definitive a kind of spirituality among even 19th-century agnostics, the better to deride modern North American atheists. De Waal also uses incredibly cagey wording to imply the exact opposite of Huxley’s views on animals-as-automatons: Whereas Descartes believed that humans and animals were strictly divided by the presence of free will / spirit in the former, Huxley takes readers through a series of experiments that show humans belong on the same animal spectrum–just as easily reduced to automatic action, with zero indication in any animal species that the spirit, if it exists, is anything more than a sound produced when the “bell” of the body is struck. (That is: We have no evidence of the spirit motivating action, and plenty to show that all animal action presents as environmental reactions through the material self.) De Waal, however, abbreviates all this to suggest that the existence of feelings is negated by automaton status, and that Huxley treated other animals as distinct from humans–neither a fair representation of Huxley’s argument.

Though I cannot comment on the quality of de Waal’s art criticism, it was when instances like the above started to pile up in the book that I went from shaking my head through any section not directly about de Waal’s primate research, to sitting a little taller, and reflecting on my own sense of audacity in working on a personal project attempting to bridge the science/pop-culture gap from the position of a literary historian. Certainly, many very good writers have already bridged this gap–from Stephen Jay Gould, scientist with a passion for history, to David Quammen, journalist-turned-scientific-popularizer, to Richard Holmes, Romantic-era-biographer with a flair for capturing the wonder and delight of early scientific explorations. But as of late, mired in heaps of personal and professional research, I’ve been left doubting whether I have what it takes to venture into this field, much as writing histories of science for popular audiences would be as much a joy as a privilege for me.

Reading The Bonobo and the Atheist this weekend, though, affirmed for me what I hold to be a fairly basic tenet of authority claims: Though academic training should provide you with invaluable and diversely-applicable skills for research, expertise in one area does not make you an expert in another. De Waal writes best when he sticks to his strengths, and worst not when he ventures into new terrain (new terrain should always be the goal!), but when he does so without making every effort to tread lightly and mindfully therein.

I could say the same about quite a few other prominent writers, from all manner of religious, scientific, and related cultural discourses, but my interest lies simply in learning from their literary weaknesses, and making it a priority not to make the same mistakes. New mistakes, perhaps–but that’s just part of the process, too. All the best in yours!