The world did not sit idly by while I studied for my final doctoral exam this summer. While I read nineteenth-century science textbooks, philosophical treatises, works of natural theology, university lectures, experiment write-ups, and a range of fictional accounts involving the natural world, violence swelled about me—about us all—in its many awful forms.
Outside mainstream news, I knew Syria, Myanmar, and the Central African Republic were still sites of strife and potential genocide. Meanwhile the urgency of other brutalities, like news of the kidnapped young women in Nigeria (who for the most part remain prisoners today, along with newer victims of Boko Haram), had begun to fade in the Western press—still horrific, but nowhere near as immediate as word of the tortured and murdered teens initially said to have instigated fresh hostilities along the Gaza Strip. Against the photo essays that emerged from Israel’s ensuing airstrikes—the pain and the loss and the fear writ large on so many faces—events elsewhere took an inevitably muted role in Western coverage.
Even macabre word of a new organization sweeping across Iraq, openly murdering religious minorities or otherwise cutting off access to resources and regional escape, did not gain as much media traction for most of the summer, despite these events prompting forms of American aid-based intervention. Rather, it would take the beheadings of American and British citizens for ISIS to take centre stage in Western media—a position it currently occupies alongside a sweeping NFL scandal in which celebrity sports stand indicted for their role in a broader culture of domestic abuse.
Granted, the US had other issues in the interim between Israel and Iraq, with widely resonant protests emerging from Ferguson, Missouri, after police officer Darren Wilson shot and killed Michael Brown on August 9. (For some sense of the event’s global reach, consider that Palestinians were sending online tips to American protestors in the wake of aggressive anti-demonstration tactics.) And while the racialized outcry underlying this situation should have come as no surprise to most, the “worst Ebola outbreak in history”—which reached Global Health Emergency status just days before Ferguson, and has continued to spread since—revealed a systemic battleground all its own: A vaccine exists. Canada has “donated” 800–1,000 doses. The WHO will “decide” how these are dispersed.
(And speaking of Canada, lest I be labelled one of those northerners who calls attention to US crises without looking inward: This summer’s news also brought into focus our own systemic issues with indigenous communities—peoples at higher risk of lifelong poverty, incarceration, all manner of abuse and disease, going missing, and turning up dead.)
The above are by no means overtly malevolent facts—the WHO, for instance, is attempting “ring vaccination” while efforts to accelerate drug production proceed—but the systems of power these terms invoke do exemplify the very status quo of global disparity (in overall affluence, levels of health education, and resource mobility) that foments such virulent outbreaks in the first place. There is violence, in other words, even in systems that seem orderly and objective, and we cannot ever discount the role that language plays in reinforcing a deadly world.
With this in mind, I was struck this summer by both how much and how little attention the rhetoric of authority claims received amid this coverage. In the “much” column we have, for instance, the work of Gilad Lotan, who crunched an immense amount of social media data to identify the information silos in which followers of the Israeli-Gazan conflict tended to position themselves—each “side” receiving and reinforcing different news items through like-minded media outlets. We also have reflections like that of John Macpherson, who explored the professional tension between objectivity and emotion in photojournalism, and just recently, a poll of Ferguson’s citizens, which indicates an extreme racial divide among those asked to interpret the events of August 9.
But underlying these pieces is also a paucity of self-reflection: Lotan’s data sets would not be as impressive if the vast bulk of readers were not so starkly divided in their consumption of news media. Nor would the recent Ferguson poll pack quite the wallop without so many participants deciding definitively, incompatibly, and above all else culturally what happened between Darren Wilson and Michael Brown.
I should not be surprised, granted: The study of “how we know what we know” is a difficult one even when we raise children with the vocabulary to consider such questions (whatever that vocabulary might be: a point to be revisited below)—and an almost Herculean task once we, as adults, have settled into a way of thinking that seems to serve us well.
Indeed, though I spent a great deal of time thinking abstractly about knowledge this summer, I also often found myself confronted by articles so provocative, I had to share them immediately, and to rail vehemently against some key point therein. Each time I indulged, though, one of two things happened: I either did research soon after that undermined the legitimacy of the original piece, or found myself too deeply affected on an emotional level to engage with earnest responses in any other register.
Knee-jerk reactions are, of course, common, and understandably so. In a practical, everyday sense, we almost by necessity draw conclusions that gesture towards deductive reasoning, while actually better resembling glorified gut instinct: The sun will come up because it always comes up. You just missed the bus because you always just miss the bus. That danged table leg caught your toe because it always catches your toe. And so the fuzzy thinking builds up.
Above all else, we acclimate to what is familiar. We grow comfortable in whatever strange, personal assumptions have long gone uncontested. Our propensity for confirmation bias then fills in the gaps: We know that an article retraction never fully dislodges belief in the original article. We know that narratives aligning with pre-existing beliefs will continue to be referenced even when shown to be untrue. We know that following an event in progress, acquiring facts as they unfold, invariably leaves us with a great deal of inaccurate knowledge that is difficult to dislodge.
So it is with human beings, who have jobs to go to, children to care for, relatives and friends to attend to, and their own selves to tend to after hours. How might we even begin to overcome such shortcomings when the cadence of our lives often seems antithetical to deep reflection?
The dangerous answer has often been the unquestioned affectation of an orderly and objective system of thought. This differs from an actually objective system of thought—an unattainable ideal—in that, even if we can provide a full and logical progression from propositions A to B to C, the validity of these propositions will still inevitably be tied to their cultural context, whence more of that messy, glorified gut instinct emerges.
As a doctoral student, I study science writing in the nineteenth century, so I have seen this flawed affectation play out across the historical record. In particular, I can demonstrate how the rhetorical strategies of inference and analogy allow respected authors to leap from specific sets of empirical knowledge to broader, un-evidenced claims that just happen to match up with the authors’ pre-existing views on, say, “savage” cultures or female intellect. This is by no means a mark against empirical evidence; human beings are simply very good at making unjustified connections between data sets, especially when such connections can reaffirm what one already believes.
Similarly, in the Ferguson shooting and Gazan conflict this summer, “facts” quickly flew into the realm of inference, with terms like “witness” and “casualty” and “discrepancy” taking on markedly different characters depending on their source. Just as Michael Brown underwent three autopsies, so too has all manner of “hard” data in both cases been sifted through to exaggerated effect, and always with that human inclination towards finding a story that fits by any means, however loosely deductive.
In short, the danger of affected objectivity is that it cannot exist apart from the irrationality of those human beings applying it. Nonetheless, the “fair and balanced” approach to a given situation is often positioned as the only “reasonable” path to change or justice—a claim that wholly disregards how, for millions of human beings, the groundswell of personal experience, community anecdote, and emotional outpouring is the only “truth” that ever matters in the wake of violent world events.
When dealing with the rhetoric of authority claims, and their role in how we respond to violent events, our crisis thus cannot get much simpler than this: We need to recognize ourselves as emotional beings, with emotional prejudices derived from everyday experience as much as personal trauma, when engaging with narratives of violence—narratives, that is, which by their very nature tend to be emotionally charged.
This is not about ceding the role of logic in response to serious and dramatic world and local events. Nor is this about forsaking all available evidence and refusing ever to make safe assumptions about a given situation or issue. This is simply about recognizing ourselves as predisposed filters of wide swaths of competing information, and making an effort not to act as though we have a monopoly on truth in situations involving other human beings.
This may seem straightforward, but our patterns of news consumption this summer, as well as the activist strategies that emerged in response to a variety of issues, suggest we have a long way to go. While protests tend to arise from an urgent and legitimate place of outrage, an effective response to systemic abuses must not be based solely on popular outcry, or else it risks establishing a “rule of mob” in place of “rule of law”. Conversely, though, the rhetoric of “rule of law” and affectations of objectivity go hand-in-hand: If the former is not sufficient to address a given crisis, we have to take even greater care with the sort of authority claims we accept, lest they obscure any truly drastic changes needed to better our world.