MAGGIE CLARK

Writer. Student. Victorian Lit Scholar. Baker of a Mean Pie.

Talk Like a Conservative: Conversation-Starters in an Era of Right-Wing Extremes


In the wake of the Republican National Convention — part of an American presidential season that has Canadians as riveted as we are unnerved — I want to talk a little about information silos, and how we can work to talk past them even as the rhetoric on social and mainstream media alike grows tense with word of failing democracies and mass shootings the world over. I come to this conversation in part as a humanities scholar: specifically, as someone who studies how nineteenth-century British society narrated an ideological clash between new astronomical data, general philosophical methods, and long-standing spiritual beliefs. I also come to this conversation as someone raised to think critically, and independently, under a banner of conservative thought. It is in this second capacity especially that I’d like to discuss a possible path through what feels some days like a No-Man’s-Land of entrenched political views and blatant disregard for our shared humanity.

I grew up in a Progressive-Conservative household — “PC” referring, when I was a child, both to a national and to a provincial conservative party in Canada. My first “real” books were Ayn Rand’s Anthem and George Orwell’s Animal Farm, with their not-so-subtle warnings against the tyranny of collectivism. My father was an active member of the PC parties, so election seasons meant envelope-stuffing, door-to-door pamphleteering, and eating burgers and hotdogs with a great many charismatic, friendly people who cared deeply for their communities. While religion was a matter for me to decide for myself, as a freethinker encouraged to read broadly, my father’s politics remained a major component of household life. During elections, we of course had conservative signs in our front yard. We also subscribed year-round to The Toronto Sun, from which I always read the political op-eds (then the Max Haines crime column, then the “Sunshine Girl” page, then the comics). And since the PC party was a stark minority in our Toronto riding of York South-Weston, I grew up with a body of political information at odds with whatever I’d hear from others in the neighbourhood or at school.

This last factor was especially important to my critical development. When Ontario PC Premier Mike Harris enacted his Common Sense Revolution, I was the only child in my class who had read the accompanying “Blue Book” and heard conservative explanations for Education Minister John Snobelen’s new funding model for schools. This meant that, when teachers told students that layoffs were occurring because the conservative government had underfunded classrooms, I had a ready counterpoint: the government wasn’t directly laying off anyone; it was the school boards, placed in a funding model meant to equalize dollars-per-student across rural and urban areas and reduce the deficit, who chose to lay off on-the-ground workers rather than trim their own bureaucratic fat. This was the PC position, and when I held it, I could understand why others held it, too: It described a system of managerial excess that was hurting public education, and tricking everyday workers into thinking higher-level government was to blame instead. As stressful as it was to occupy a minority political view at the time, surrounded by students and teachers who regarded the PC Party as cold-blooded destroyers of their community, that schism proved an important epistemological stepping stone.

In particular, I grappled a great deal, in this schism’s wake, with the idea of objective truth. If I had my views because I grew up in a PC household, and others had their views because they grew up in NDP or Liberal households, what body of facts — if any — did we share as a society? What objectivities could any of us concede? Was there any way to make everyone happy? Did everyone deserve to be? Today, as a scholar, I regard the ability to hold ideas in tension as the crux of a humanities education, but this education should begin long before post-secondary. For me, it began in my conservative home.

In fact, despite offering conservative counterpoints at school, I was under 10 when I realized that I didn’t agree with everything my father believed. (My father would later teasingly note that this was the great failing of raising a thinker — you could never be sure that they’d think alike.) One conversation was particularly transformative for me, because it was the first time I heard myself parrot a position with which I did not agree. He had been trying to explain a conservative principle regarding the unjust burden of taxation, and had used the example of a swimming pool: What if some people wanted a facility built, but others felt this was a waste of resources? Who should have to pay for the pool? I remember giving my answer automatically — the answer I knew he was after — but even as my father praised my choice (“the people who want the swimming pool”), I felt a strong sense of disconnect from that point of view. The answer didn’t feel right. But why?

For one, I knew I was being set up with the choice of facility — a swimming pool and not, say, a firehouse. But people aren’t always rational agents, so what might seem frivolous to some might seem self-evidently essential to others. What if only some people wanted a firehouse, for instance, and other people believed in waiting-and-seeing? Should everyone suffer from an ensuing fire because there was opposition to the resource? I knew taxpayers shared an investment in basic infrastructure like roads, sewage, and the electrical grid, but where did that sense of essential public services end? Was there an objectively correct standard, or was it arbitrary — and if arbitrary, how did ethical communities decide what to invest in collectively?

Also, I was and am the eldest of four siblings, so when my father posed the idea that the majority should not have to go along with the whims of the minority, I understood how catastrophically that could play out in the real world. On road-trips in the family station wagon — four of us crammed into the back with an extra seatbelt — there were six bladders and six stomachs to contend with, plus the possibility of nausea from reading on the highway, and any number of muscle cramps or family spats that might make the trips insufferable. In the micro-societal bubble of our brown ’84 Dodge Aries, we therefore had two people in positions of authority, four in the minority, and a relentless need to make concessions for the good of the whole.

My gut feeling, then, was that, although society could tell swimming-pool-lovers to build their own facility, this would never be a neutral choice. It would always be a choice that also fostered social division, such that ethical societies had to consider the unity cost of turning down one group’s sincere desire for specific resource allocation. Certainly, there are many things we all have to do on our own in life, but was there an objective line in the sand dictating where communal efforts should stop, and “by your own bootstraps” begin? Or was this, too, an arbitrary construct? And if arbitrary, by what mechanisms did an ethical society decide on its collaborative limits?

What I realized in the ensuing days and weeks was that, if we regarded the scope of essential public services as the end-result of a conversation — a negotiated give-and-take between a multiplicity of special interests — then at the end of the day, we wouldn’t simply have a swimming pool; we’d also have an increased sense of societal good will, and perhaps other public resources besides — resources that might not have emerged as possibilities until we started deliberating together over the pool.

What I couldn’t figure out at the time, though, was how this notion of communal well-being would fit into a conservative party’s mandate. What, to a traditional conservative, did being part of a “community” entail?

The answer is not simple, because for a time there were two major conservative parties in Canada, and in my late adolescence I grieved, as my father grieved, when the national PC Party and the Reform Party merged into their current form, the Conservative Party of Canada. The Reform Party, as a body of moral conservatives, caused a great many “Red Tories” (fiscally conservative, socially liberal people) to jump ship in that first year. I too had taken to calling myself a Red Tory, and at an Ottawa convention my father and I attended after the merger, I was stunned to discover a conservative party preoccupied not with economic issues, but social ones — specifically, the homosexual scare. Gay people wanting to marry? Gay people wanting to be around children (as if they weren’t already)? My heart sank when I found myself embroiled in argument with a sitting MP over whether homosexuality was “natural” and whether what happens between other animals in nature is a fair basis for deciding questions of human dignity. It was an argument that could not be won; gay people, to him, were simply an abomination. I was horrified by my physical proximity to proof that people in Canadian government could deny the humanity of their constituents — myself included, as a queer person at that convention.

I must add, though, that I found comfort soon after this confrontation in a washroom, from older PC members who saw me in tears, and who — after hearing my story — told me they were just as disturbed by the high-handed encroachment of Christian intolerance into party discussions that had previously been about jobs, foreign policy, and related resource-management. It was an upsetting time for a great many people, who saw the merger’s reduction of the Canadian political spectrum as a step towards American-style politics — but it was upsetting precisely because of the immense diversity of conservative views initially at stake. Many moved for moral reasons, for instance, to the rising Green Party — which, at its outset, boasted fiscal conservatism: taxing waste, not people; corporations, not income — but which later shifted from the language of economic pragmatism to rhetoric that better appealed to traditional liberals.

Prior to this merger, too, the national and provincial PC parties I had known were already a complicated intersection of high- and low-income families. Many ridings celebrated the conservative tendencies of first-generation immigrants (too liberal in their original countries, but firmly business- and family-values-oriented here), and happily drew persons from a wide range of backgrounds into the fold and onto the campaign trail. Other ridings retained a rhetoric of suspicion towards the influx of Other People — whose work ethics, displacement potential, and subsequent economic value remained suspect, even as anyone, of any colour or background, could become a leech in PC eyes by virtue of relying on social assistance or ending up in prison.

The sense of community cast by this uneven range of social demographics gave me the impression of a conservative party taking “all comers” who prioritized fiscal responsibility and the value of hard, independent work. And yet, the actual policies forwarded by the PC Party suggested stark biases towards certain demographics over others. For instance, well-to-do members talked about corporate and income taxes as an undue social burden, and lower-class members, who were living through more extreme financial pressures day-to-day, resonated deeply with the rhetoric of paying less — even when the party’s subsequent platform addressed the former’s tax brackets more than their own. The latter demographic didn’t resent the former for its success; rather, the latter looked forward to being that successful, too, and were already worried about who would try to take their money once they had any. In the meantime, they often advocated against anyone they perceived as getting an unfair advantage, when “everyone” was struggling to support their families on precarious budgets. Unsurprisingly, then, party platforms often stigmatized prison education and welfare programs as forms of social theft from those who worked hard without breaking the law — a rhetoric that established moral outrage as a fundamental facet even of ostensibly by-the-numbers conservative affairs.

Now, I had been taught from a young age the dignity of work. I framed the first five-dollar bill I ever earned, and as a young teen I babysat during the school year and nannied in summers, volunteering my day’s wages on principle when a scooter went missing on my watch. My first formal job was janitorial; I cleaned toilets at the Canadian National Exhibition, and I was proud to be earning my own money, irrespective of the nature of the work. And yet, this rhetoric of moral outrage from older conservatives confused me; if work was its own reward, instilling us with purpose and self-discipline and self-respect, along with financial independence, then why were so many people fixated on what other people were doing with their time? Why was the desire to shame or punish unemployed people more of a driving force around welfare policy than the desire to see others given the best chance to reclaim work’s inherent benefits by returning to their industries of best fit?

Sure, an out-of-work accountant could do manual labour in exchange for welfare — but at a cost to their long-term economic potential, eating up time better spent keeping their skills up-to-date; or interning as a step back into the industry where they’d have the greatest impact; or as needed, retraining for similar industries with better job prospects. When we treated work as something Other People — straw-people — were always trying to get out of, weren’t we diminishing the value of labour itself? How could we build a thriving economy based on punitive, not purpose-driven work?

As the above suggests, my conservative childhood taught me how to frame public policies in terms of optimized socioeconomic benefit. Prison education and arts programs were vital, I learned to argue, because studies showed that they reduced recidivism, which would in turn drive down future taxpayer costs, and return inmates to the workforce as taxpayers themselves. Social welfare systems similarly benefitted society as a whole, because even if a few people gamed the system, studies showed that children of parents on welfare had a better chance of breaking the cycle of social dependence, thus lessening our future tax burden. In this way, I wasn’t a “bleeding-heart liberal” when I advocated for seemingly non-essential social resources; I was demonstrating rational, conservative self-interest.

These days, I don’t hold any party allegiance. I vote my conscience, as best as I’m able in a first-past-the-post system that requires choosing either the best prime minister / premier or the best local MP / MPP. Sometimes I vote to keep a given candidate out of power. Sometimes I vote for the person whose party platform cleaves closest to my views. Sometimes I vote for the person who seems most likely to listen to competing perspectives on a local level, even if their party generally holds views that differ from my own. From plainly partisan beginnings, I quickly learned that politics, even in its North-American-celebrity-cult form, always works best when treated as a conversation, ever-contingent upon individuals willing to give and take.

However, if we ever excelled at this notion of give-and-take, I see fewer signs of it this year than most in the US and Europe. Maybe this has to do with the kind of panic that only 24/7 global media can engender, by highlighting a relentless stream of crisis events the world over. Or maybe this is a social media issue, in which ideological insularity — curated both by our conscious in-group choices and online algorithms beyond our kenning — makes it easy to dehumanize our opponents, on both sides of the fence, until we’re left with a sense of futility about ever seeing eye-to-eye again.

Whatever the cause, I know I’m seeing a lot of familiar conservative rhetoric around Donald Trump and Brexit (yes, even in little old Canada), but not much about how to build meaningful conversations with their advocates — fellow human beings, that is: family members, PTA colleagues, coworkers, people we interact with in our daily lives, who favour such right-wing extremes. Instead, I see the rhetoric I remember from my childhood, in classrooms filled with frustrated Liberal and NDP students and teachers who couldn’t fathom how the PC Party could be so dedicated to destroying our social contract. Today the charge feels the same: How can these Republicans and the older British generation be so dedicated to self-destruction?

Meanwhile, the lower- and working-class people behind both Trump and Brexit remain familiar to me. Like many members of Canada’s conservative parties, these are people who understand financial stress in ways that make them insular, rather than expansive, in their views about our social obligations to one another. They therefore resonate (even if it’s not to their actual benefit) with the claims of richer people that we’re all being asked to take on too much — to trust too many people with different backgrounds when our own communities seem unstable, to support too many people who break the law (e.g. undocumented workers, or inmates), and to give others a leg up (e.g. those on welfare, or historically marginalized peoples) when “everyone” has it rough. It rarely occurs to these folks that the “winners” in this system are another symptom of how broken that system is. Rather, the conviction that society is serving everyone’s interests except their own seems to reduce the question to How do I become one of the winners? How do I rise to the top in the current system? Because if they did it, surely they’ll help me do it, too.

And yet, for all that this tribalism has come to manifest in the most painful racism, xenophobia, and general fear-mongering, there lies at its core a thread of shared self-interest I am confident we can still build upon. Moreover, it’s a core thread we’re going to have to build upon — irrespective of the outcome of the upcoming U.S. presidential election (and its consequent impact on North American and Trans-Pacific trade), or the continuing fallout of Brexit for the rest of the European Union. As our culture continues to automate industry, we’re going to find ourselves with fewer low-skill jobs to go around, and this requires rethinking how we constitute employment in the first place.

What we need to decide in the next few years is how we will value work, but thankfully, this is one political conversation we can still have across the spectrum, at a remove from specific candidate platforms or government decisions. To do so, though, we need to turn our initial reflections inward. We need to decide first and foremost what work feels like and means to us. Is it something we do under extreme duress — working three jobs, say, because we have too many dependents, or are alone with no support network, or because without work we would have to confront mental illnesses that might be our undoing? Or is work something we do in part out of necessity, but mostly because we enjoy the work itself? Could we stop working entirely, but choose not to, because work brings us a sense of community and purpose?

Once we have a better sense of why we, as individuals, do what we do, the next question becomes: Is this ideal? Do we want better for ourselves and for others — or because we had to suffer through degrading work just to get by, do we believe that others should have to do likewise to achieve the same standard of living? Is work supposed to be a carrot, or a stick? We need to think long and hard about this question, because any societal change is going to have an uneven impact — so even as we move towards a better future for everyone, we might find ourselves slower to reap personal benefits than our neighbours. Can we handle that? Or would we genuinely prefer that everyone else be miserable, too, if we’re miserable right now?

And if we need to come first in any policy decision that stands to improve society on the whole, we’d do well to ask next what community means to us. Do we want to live in a society that optimizes safety and security for all, or do we believe that safety and security for some — i.e. those we regard as closest to us, ideologically and ethnically — matters most, even if it comes at great cost to the safety and security of others? And if the latter, how much social disparity do we think we can accommodate before the safety and security of our in-group becomes contingent upon the safety and security of everyone else? Is our rational self-interest bubble the size of a city? The size of a province? The size of a country? Can even a country sustain a given level of safety and security if it thrives while the surrounding world burns?

These aren’t easy questions, especially as they extend from theory to praxis. Even if we acknowledge that we do work — desperately, degradingly — only because we must, how would we go about changing our circumstances? Even if we feel that work should be more carrot than stick, how much of an obligation does that give us to shoulder the load of our struggling fellow citizens? And even if a country can never enjoy long-term safety and security while the greater world burns, how do we go about effecting change on a global level, as citizens limited by our immediate and local responsibilities?

What I want to suggest is that, even if these aren’t easy questions, they remain questions we can raise, and should raise, across political-allegiance lines in the coming months. Irrespective of whether your family members or friends advocate political candidates or positions that you regard as society-enders, the social contract only ends when we give up on the conversation. And even then — even if someone has gone and soiled themselves because they were having a meltdown instead of using the facilities at the last pitstop, leaving a huge stink in the car for everyone else to deal with until the next exit — the road-trip isn’t over. The world goes on. All that changes is how much will need doing, whenever we’re ready to come together again, and rebuild what we’ve lost.

Who You Gonna Hashtag? Ghostbusters in an Age of Online Cynicism


Before I talk about Paul Feig’s 2016 reboot of Ghostbusters, I want to talk about Stranger Things (2016), the sure-footed Netflix original series with a Freaks-&-Geeks-meets-X-Files vibe. Stranger Things, created and directed by the Duffer Brothers, is set in 1983 — and not just in terms of product placement, fashion, and set design. There’s one plot-line in particular, involving teen romance, that establishes the series not as ’80s nostalgia, but as a sincere attempt to replicate a different era, warts and all. When a smooth-talking young man uses persistent, invasive behaviours to wear down a young woman he wants to sleep with and maybe date, I half-expected the show to take the more recent thematic approach, of escalating his persistence into outright sexual assault — but it didn’t, because it’s 1983, and relentless male overtures that women patiently tolerate and politely decline (right up until they don’t) are written into the culture.

They sure as heck are written into the original Ghostbusters (1984), a movie I passed on to two nephews before they turned four — dancing to the theme music together, making s’mores for the critical fight scene, and otherwise teaching them to delight in all its haphazardly paced inanity. I love Ghostbusters. But as an adult, I also recognize the complicated line it toes around Dana Barrett (Sigourney Weaver), an elegant, accomplished woman we see putting up with the over-friendliness of her neighbour, Louis Tully (Rick Moranis), and then the more aggressive encroachments of Dr. Peter Venkman (Bill Murray). Ultimately, the Gatekeeper and the Keymaster unite — which means that the most diminutive man in the film “lands” the bombshell through non-consensual sex — but everyone is fine in the end. No emotional consequences ensue.

You can’t have a plot point like that in 2016, for two reasons: 1) sneaking a sexual “win” is tough to pull off without sending an unwelcome message about consent, and 2) polite toleration is no longer a culturally celebrated mode of female response to relentless male overtures. Even writing the above sounds uncomfortably permissive — and yet, my point is that media representations do change. Today, female persons receive more messaging about shutting down the Venkmans and the Tullys in their lives than they did 30 years ago, which means you’ve got a different set of male-female relationships on screen, and a different, more self-determining set of expectations for women in general. And of course you’re going to set a new Ghostbusters movie in the present. For suspense-building purposes, Stranger Things had to be set in the 1980s, when communications technology abounded, but remained limited to wall phones, walkie-talkies, and ham radios. However, it makes no sense for a reboot of Ghostbusters, which delights in tech and animated spectacle, not to make full use of the advances in the years since the original.

No wonder, then, that this film has been the subject of so much scorn. At its heart, the 2016 Ghostbusters is a slapstick, zinger-ridden fun ride — with all the same pacing dilemmas as the original (because it hits all the major plot beats of the original) — but it is also of its time, and being of its time means it’s not enough for Ghostbusters to be a fun ride anymore. Not in an age of relentless pre-release click-bait speculation, trailer deconstructions, Twitter abuse, and surrounding social anxieties about casting, political turf wars, and Hollywood reboots in general. All the hype around this film has been especially entrenched in gendered and racialized discourse — and as Ghostbusters emerges from its first weekend, those same lines in the sand reverberate in a staggering number of hostile audience reviews, especially driving down its IMDb rating.

How could a movie set in the present, and produced from day one in the worst of that present, not absorb all this online debate? The film’s references to Reddit, and to gendered hostility in a YouTube review, are just surface indicators of its self-awareness in this regard, but that self-awareness is also at the thematic core of the film. The question of legitimacy, for instance, provides the narrative arc for two lead characters, and after making terrible compromises in an effort to please other people, our main protagonist (Erin Gilbert, played by Kristen Wiig) learns to be secure in her own accomplishments without external affirmation. The film itself is a larger embodiment of this same lesson: conscious of the original’s shadow, and of society breathing down its neck over any changes, even as it seeks to stake out a place all its own, in the now.

The critical question then becomes, does all this self-awareness detract from the film’s entertainment value? I was afraid it might, when I read reviews claiming that the principal bad guy is an MRA caricature, or that every single male character in the film is “stupid” or “evil”. But the story of the Ghostbusters has always been a tale of social outcasts trying to find their place in the world, and very few people aren’t the butt of some joke or another in the original. Also, Ghostbusters II (1989) has creepy Dr. Janosz Poha (Peter MacNicol) trying to reincarnate a brutal tyrant for Reasons, which may or may not have to do with his accent? So it was time to let this latest movie speak for itself.

And to answer some of the hyperbole first: no, not every male character is “stupid” or “evil”. Bennie the delivery guy (Karan Soni) clearly has the upper hand when messing with lead Ghostbuster Abby Yates (Melissa McCarthy); there’s a street artist who disregards fourth-Ghostbuster Patty Tolan (Leslie Jones) and in the process inspires the iconic Ghostbusters logo for Jillian Holtzmann (Kate McKinnon), a.k.a. Top Halloween Costume 2016; and the Homeland Security agents are never arrogant idiots, unlike EPA-wonk Walter Peck from the original, who goes and releases all the ghosts Venkman, Stantz, Spengler, and Zeddemore have captured and contained. In fact, in the 2016 Ghostbusters, it’s one of the team herself who lets pride get the best of her, leading to this film’s fatal variation on the original scene. And did I mention how often our main protagonist is made to look like a fool, from prepping for class in an awkward sexual position, to treating restaurant windows like sliding doors before the final show-down, to being at a loss for her faculties at the mere sight of the team’s secretary, Kevin (Chris Hemsworth)?

It’s a Ghostbusters movie, for heck’s sake. All the original Ghostbusters get cameos (even the late Harold Ramis), we see all our beloved figures and set pieces from the original playfully subverted (Stay Puft, Slimer, the firehouse, the Ghostmobile), and as Rowan North (Neil Casey) steps up his evil plan to use the dead to make all of humanity* pay for failing to recognize his genius, we see ghost sightings increase to an epic showdown that differs from (while also building on) the original. Even variations to classic scenes (like the sighting at a heavy metal concert, in lieu of a hotel banquet hall) are all about incorporating the old and the new. At every turn, the film pays tribute to its beginnings while laying groundwork for a body of canon that better reflects the world today.

Similarly, 2016’s Ghostbusters fill quirky personality niches and riff off each other for the bulk of the film’s comedy, just as they did in the original. Holtzmann is appropriately brainy in ways that routinely almost get her colleagues killed, while offering up a wealth of ever-improving gadgets that should make the film’s merchandising department sing. Abby provides some food and slapstick humour, but mostly spends her time verbally sparring with Erin in the way only old friends with old hurts can. And Patty, contrary to fears propagated by the unfortunate trailer, does not uphold “every black stereotype.” Rather, she is a book-smart, self-confident, under-stimulated employee of the New York Transit System who finally has a chance to put her extensive knowledge of the history of the city to good use when she nominates herself to the team. The energy between these four plays well, and it’s clear that the whole cast had fun making this film (if in doubt, just watch the credits) — which is as it should be for a goofy summer blockbuster.

The real clincher for me, in mourning the ugliness engulfing this light-hearted film, and what that ugliness says about the state of our storytelling culture despite all surrounding progress, came from a few movie references Abby and Erin throw around in a moment of crisis. When this pair of grown women bonded in hushed, reverent tones over Road House (1989) and Point Break (1991), it occurred to me how often the rifts we experience online today are treated as the result of a long-standing cultural divide. As if female persons never enjoyed kick-ass action flicks when they were kids, and are just recently discovering them in order to subvert and/or ruin them. As if there isn’t a whole body of female persons with a special, eternal place in their hearts for the original Ghostbusters. As if most of us didn’t carry that film’s sense of off-kilter, nerdy joy well into the 21st century, eager to fill the world with more works that celebrate our love for sci-fi, paranormal and otherwise. We may be less tolerant of invasive behaviours today, and the 2016 Ghostbusters’s plot is most definitely not oriented around trying to get into a client’s pants, but what remains universal to the Ghostbusters franchise is the quest for a good sliming.

And yet, when the cleanest slime a team of Ghostbusters has to endure is the ectoplasmic residue spattered on screen, you know there’s something strange in the neighbourhood. The question is–oh, you know what the question is. What we could really use, as a culture allowing online cynicism and vitriol to dictate the lens through which we tell stories about ourselves in the present, is better answers: answers that will lead to a 2040 society as likely to cringe affectionately at media artifacts from our time, as we do when looking back to 1984.


*I find the MRA claim super confusing for Rowan’s character, because he clearly hates everyone, endangers the life of at least one male character with utter indifference, and plays with the armed forces that come to oppose him like they’re so many toy soldiers. His spiel about being an under-appreciated genius is a classic villain speech, and I suspect if he weren’t preaching to female Ghostbusters, he wouldn’t be read the same way at all by audiences. I find him much more compelling as an embodiment of the “lone wolf” mentality that plagues our culture right now, especially in light of one important, violent step in his plans. (But even then, again, we haven’t exactly expected the most nuanced backstories for Ghostbuster villains in the past. Why start now?)

What We Talk About When We Talk About Superheroes


I promised myself I’d wait to watch Batman v. Superman: Dawn of Justice. Major critical reviews were scathing, but many graphic-novel fans wildly adored Zack Snyder’s latest, and I wanted to review the film at a remove from the hype either way. It might seem absurd to take the viewing of a superhero film so seriously–it’s just action fluff, right?–but the genre is so mainstream that I find these films offer a meaningful distillation of our major cultural concerns. Between Captain America: Civil War and Batman v. Superman: Dawn of Justice–two films dealing with superheroes in direct conflict–we can especially expect to see a) major cultural ideologies in tension, b) how our most immediate sociopolitical issues are being repackaged in accessible narratives for the widest possible adult audience, and c) how different franchises approach the same general themes.

Sure enough, on the surface, CA:CW and BvS:DoJ share a body of thematic concerns. Both foreground the problem of civilian casualties in superhero battles (a cipher for socioeconomic and military conflicts the world over). Both question the appropriateness of aggregated superpower (corporations, vigilante groups, specific nation-states) operating without bureaucratic, ostensibly democratic oversight. And both… all too predictably orient male choices and perspectives around the desire to live up to a standard of goodness embodied or otherwise influenced by female persons of note in their lives.

And yet, CA:CW and BvS:DoJ are staggeringly different beasts. Even months after BvS:DoJ’s release, I found the film difficult to watch from a basic coherence perspective–the characters, from very early on, operating in bizarre relationships to cause and effect[1]–and I quickly had to adjust my sense of Snyder’s directorial priorities. Since the very first scene was part dream sequence, I decided that Snyder’s interest lay more with depicting raw emotions–vengeance, grief, helplessness, spiritual disillusionment–in all their self- and communally destructive glory, and thus approached the rest of the film as an impressionistic spectacle. (Imagine Terrence Malick directing a superhero film based on someone else’s source material, and BvS:DoJ comes pretty close.)

This approach to the film mediated most moments of ensuing bafflement with character actions and motivations,[2] but it did not change the fact that Snyder’s film has a remarkably different relationship to the above list of thematic concerns than the Russo Bros’ Marvel flick. For instance, although both films address civilian casualties, and both films even use a grieving black woman to drive home the message that white-Western superpowers hurt others, CA:CW gives these same civilians narrative agency to retaliate and dismantle that power base from the inside out. In BvS:DoJ, there is an eerily false-flag feel to the whole issue of civilian casualties, as time and again the film weaponizes victims or otherwise aligns them with covert government or private-interest plots.

Are there any real victims in this film? Maybe the average citizens that Batman seems to show no concern about terrorizing while he points fingers at Superman for his own impact on civilian lives. Even here, though, Snyder seems to side against the common people of Gotham, because he has a blind black man chided for being worried about “the Bat.” If you’re truly innocent, the counter goes, you should have nothing to fear from vigilante justice. Instead, Batman and Lex Luthor’s childhood victimhood, alongside Superman’s Christ-like salvation spectacle of victimhood, stand above all other suffering in this film, both in scope and narrative agency. This is a far cry from the Black Panther, in CA:CW, rising from the ruins of his ruling-class father’s death to sympathize, in the end, with the average citizen who orchestrated the film’s central plot because his own, average family died in the wake of a superhero battle.

CA:CW and BvS:DoJ also differ in their approach to bureaucratic, democratic oversight, with the former maintaining a slippery ambivalence in its ongoing working relationship with the UN, and the latter… blowing up the relevant (national) government structure, and thereafter disengaging with all notions of power being determined by anything but vigilante beings who bend agencies like the CIA and organizations like the prison system and Gotham’s police force to suit personal whims and agendas. Neither approach is ideal, but while CA:CW at least keeps the line open between real-world international governance and Marvel superpowers, BvS:DoJ amply hints at Snyder’s next major dream project, a remake of The Fountainhead, by firmly entrenching the exercise of legitimate power at the level of private enterprise, legal and otherwise.

When it comes to the role of women, though, CA:CW and BvS:DoJ are both mixed bags. It is, as I noted above, a bizarre facet of both films that male actions seem to be dictated by either strangely distorted notions of innate feminine goodness, or else by the trauma of female persons in male lives being harmed by other male persons. For Iron Man and Captain America, Pepper Potts and Peggy Carter respectively influence the critical ideological division these two men embody when confronted with the expectation of consenting to UN oversight. For Batman and Superman, trauma related to their mothers has a powerful, central role in driving them into (and out of) one-on-one confrontation. Of all the major players in these films, only Lex Luthor is otherwise directed, by a tyrant of a father-figure who supplies notions of Old Testament godhood that Lex longs to see murdered in all subsequent forms. Can any of these dudes establish a coherent ethical code from within? Can any modern superhero film establish a meaningful social contract built on anything outside the loss of beloved women?

A razor-thin Wonder Woman offers the only portrait in BvS:DoJ of a superhero whose call to action arises, in keeping with her canon, from a simple need to fight against immediate forces of destruction. (Even Superman is presented in a much more emotionally partial light, and then bafflingly has the gall to criticize his editor, Perry, for his own biases. Is everyone a hypocrite in this film?) And here’s where BvS:DoJ especially fascinates, because Snyder’s been criticized extensively for his rewriting of major characters: Batman murders indifferently and to excess, a huge leap even over Frank Miller’s The Dark Knight Returns (1986), Paul Pope’s Batman, and other deeply troubled portraits; Lex Luthor is a highly unstable billionaire-heir with angel, demon, and godhead fixations abounding; Jimmy Olsen is a CIA spy murdered very early in the film; and Superman’s just plain inconsistent in his use of his powers, conscience, and reliance on father-figures. And to what end? I sure as heck felt little to no empathy with any of the nihilism on display in these rewrites,[3] and felt little interest in each character’s aspirations as the film progressed.

Nonetheless, what stood out most was what didn’t change: namely, the canon for the female characters. I sincerely wonder if fans who defend this film’s relentless, estranging rewriting of Batman, Lex Luthor, Jimmy Olsen, and Superman would be quite as keen in their defense if the women in this film were rewritten to be just as ugly as the men. As it stands, Lois Lane remains locked in her original canon–another character even points out that what makes her such a good reporter is her ongoing surprise (in lieu of cynicism) at the world’s capacity to go bad–and Clark Kent’s mother is an abidingly loyal, gentle, Christian woman who just wants the best for her son, while Bruce Wayne’s mother is pure victim, as unmarked as the pearls ruthlessly shed in her death. If Lois and both Marthas turned as nasty as Snyder makes his male heroes, would this film have any emotional sway at all?

At heart, though, both CA:CW and BvS:DoJ are comic-book movies that celebrate their source medium, so as much as I found BvS:DoJ incoherent on the level of character construction and progression, I vehemently disagree with criticism about the efficacy of Snyder’s cinematography. BvS:DoJ is absolutely a graphic novel spectacle–from its disarmingly subtle interweaving of dream sequences, to its pacing and plot sequencing, to the grittiness of its colour/light/texture palettes, to its casual leaping from local to far-flung settings with minimal transitional guidance. CA:CW also achieves a level of comic-book spectacle in its cinematography (most notable during major fight scenes, in the staging of specific superheroes in formation or visual contrast; and weakest in its attempt to emulate fantastical superhero pursuits and combat cadence near the outset), but Marvel and DC are very different tonal vehicles. From a visual perspective, both films plainly honoured their respective points of origin; I don’t see much value in debating which perspective is better.

What matters more to me–that is, what leads me to reflect on both movies at a fair temporal remove from their release dates–is how differently the major crises of our time can be figured for the big screen. CA:CW is decidedly a secular film in this regard: the superheroes–goofy, inept, wise-cracking, inquisitive, sentient science/magic hybrids–are too plentiful to be regarded in light of Judeo-Christian mythology, with its singular dominant force lording over humankind. Rather, these superheroes are flawed, if also greatly enhanced beings negotiating appropriate limits to their individualism in a globalized society. Where the movie fails as a means of advancing a coherent message about our particular, real-world body of socioeconomic issues (if fails is the correct term; I’m not expecting a superhero movie to solve all matters pertaining to international war- and peace-time politics) is in reinforcing the primacy of that individualism without really earning it: by playing slippery, covert games with the dominant global bureaucracies right up to the movie’s close.

Conversely, BvS:DoJ is a deeply religious affair. It’s all about the necessary audacity of reaching for levels of exceptionalism that defy the seeming chaos of one’s universe (in the case of Batman), or accepting the exceptionalism thrust upon oneself, though it might leave one reviled by fellow man (in the case of Superman), or else raging against the domineering godheads of the past, and any who seem poised to fill similar roles today (in the case of Lex Luthor). Snyder’s world of men is brutish and sociopathic because, in this filmic universe, that is nothing less than the state of humankind–a cowering enterprise of might striking down lesser might ad infinitum–until a saviour figure emerges, to redeem through the ultimate personal sacrifice the hatefulness in Batman’s heart, and get him to concede that man has the capacity for goodness after all. Lex Luthor brings demons in many forms to challenge that goodness by the end of the film (and beyond), but any Justice League movie that stems from this effort is going to have a difficult transition ahead: namely, moving from Snyder’s narrative of godlike superheroes to a more complex interplay of terrestrial beings with varying powers, working to defeat a common foe.

As an effective analogy for the real superpower issues that plague us today, I plainly side with the narrative coherence of CA:CW. However, I suspect we’d all be remiss in not attending to the deeply emotive character of Snyder’s BvS:DoJ, especially in a year that has seen staggering shifts in political discourse towards a highly charged populist register guided by gut feelings of tribalist fear, outrage, and loss. These, too, are metrics of the real world. These, too, guide the shape of international, national, and local power structures. The emotional core of BvS:DoJ thus does not have to follow even an internal logic in order to be true to life–and as such, Snyder’s complicated film proves every bit as much a cautionary tale as CA:CW, when outlining the possible limits to (and unravelling of) global power structures and individual agency in the years to come.


[1] Bruce Wayne, tearing through Metropolis during Superman’s epic Man of Steel battle, calls his workplace to tell his employees to leave. We then cut to one office floor with a clear, proximate view of the battle, where the notion of evacuation has apparently struck no one: the on-site manager only initiates it because Mr. Wayne has called (which resonates with the film’s overarching notions of human helplessness without the presence of individual exceptionalism). Not long after this, Bruce looks up from the devastation around his building to see Superman fighting Zod in the sky, and… immediately assumes Superman is the problem? How does Batman, a figure we later realize is similarly feared as a force of indiscriminate vengeance by average, helpless citizens, not even for one second entertain the idea that he and Superman are similarly misunderstood last lines of defense against those who sow chaos and destruction? This is all in the opening beats, so from the outset, Snyder’s film establishes human beings whose actions raise more questions than answers.

[2] I took notes throughout the film, in order to quell my bafflement. Here are a few of the other moments that raised more questions than answers: Why does the beat cop have a shotgun? Why is Batman a sociopath towards criminals? Why does Superman totally dismiss Lois’s concerns that love for her makes him biased at his job, instead of taking the charge seriously? Why does this movie treat all orphans as sociopaths? Why does Bruce Wayne have no qualms about the rest of Gotham being in a state of fear over the Bat? Where is his social charm? Why does the Bat brand have to lead to in-prison murder? Why do we need a literally emasculated man to drive home civilian consequences? Do penises literally need to be at stake if Superman is left unchecked? Why can’t Clark Kent use his super-speed to spit out sports copy and still pursue his other story, so as not to attract Perry’s ire? What in blazes was the point of that second dream sequence? Why is there a Bat signal in Gotham when average citizens fear him? Doesn’t that seem a bizarre policy on the part of the police, to openly invoke a force that makes citizens terrified, not reassured, about the night to come? In a movie where Lex Luthor and Batman are both mentally unstable, but Superman is a secondary presence in the narrative through-line, where are our loyalties as viewers supposed to lie? Why does Lex go to such incredible, unnecessarily elaborate lengths to frame Superman? Why would he need to, after the devastation in Metropolis? Why doesn’t this movie just pick up where Man of Steel left off, and escalate public pressures that way? Why is Superman only thinking about his earthly father? Where is Jor-El in all this angsting? Why is that alien ship so ridiculously accommodating of a human presence? Why is Zod worthy of Lex’s admiration and sympathy, when Lex seems to despise Superman? Why is Superman such a poor communicator on the battlefield with Batman? Why doesn’t Superman just pin Batman and talk to him while he has the upper hand? Why does the pair have to spin so excessively into “bro” territory over the inane Martha business? Why did Lex make a spare demon? Why must the ending drag on so much? When is this over?

[3] The only exception being Alfred, who offers a welcome reprieve from this film’s overall tone with his performance as a cynical, heavily drinking companion to Bruce: half-heartedly attempting to stop his employer’s next deranged impulse before prepping the Batman bling for subsequent use in the field, and sincerely mourning the end of the Wayne dynasty at every turn.

Silliness, Sentimentality, and Depth: Orange Is The New Black, Season 4


I knew I should write about Season 4 of Orange Is The New Black when I realized I was ashamed to talk about this show, and why I find its material so affecting. This is by no means the first time I’ve self-censored discourse about things I deeply care about, but since my major hurdle with the thesis right now is learning to write in a manner my committee will regard as joyful, not defensive, this self-censoring is a trait I’m trying to weed out. And OItNB, as a body of significant social commentary, seems like a hell of a good place to start.

For context, I also recently appreciated Lady Dynamite, Maria Bamford’s frenetic comedy about life before, during, and after her mental breakdown from bipolar II. I didn’t evaluate that show here either; at the time I reasoned that I simply didn’t want to draw too many parallels to my own experiences in the process. However, scenes in OItNB also brought me to tears on multiple occasions, and I still didn’t want to acknowledge such a TV show’s impact on me in a public forum. That’s when I realized something else was afoot. After all, both shows use an incredible amount of silliness to navigate worlds of tremendous pain–and worse, they play overtly on the sentimentality involved in silliness and suffering alike–so it’s quite likely that I didn’t want to appear silly and sentimental in turn. And how silly is that?

Instead, I was going to write about my latest reading list, a mix of books picked up to reclaim joy in the thesis by studying other writers’ passion for non-fiction (e.g. Zizek’s Violence, Iris Murdoch’s literary biography, Sartre, and Charles Taylor’s The Malaise of Modernity) or as research for the novel-in-progress (Timothy Williams’s undervalued, minimalist Another Sun, Inger Ash Wolfe’s two-thirds-interesting The Calling, Peter Lovesey’s structurally dynamic The Last Detective). In other words: serious works, dealing with serious issues, or at least tonally heavy works read with serious purpose.

And yet, OItNB has always addressed serious issues. In the first season, we were introduced to a great many flawed human beings (prisoners and prison staff alike) involved in the all-too-often-abstracted institution of prison. We learned, too, about the many forms of exploitation and neglect that drive people into this system, and how difficult it is to avoid recidivism in a culture that uses prison for punishment, not rehabilitation. Moreover, whatever Piper gleaned from her fellow inmates over the course of the season, in the familiar mode of “other people exist so rich white people can grow,” always paled in comparison to what viewers learned about the depth of other lives. The audience, unlike Piper, had access to flashbacks and conversations that would pass a variation of the Bechdel test specific to non-white people, and for all these reasons (plus, yes, overt queer sexuality), the show proved distinct from its outset.

In the second season, a broader central cast was then confronted by violence in many forms: the violence of having to choose whether to betray a loved one for personal freedom; the violence of having one’s physical safety become contingent upon the whims of bruised egos in positions of power; the violence of trying to extricate oneself and one’s community from destructive family ties; and the violence of framing people in positions of power for the transgressions of other people in positions of power. In the backdrop, we also saw how old and sick prisoners are treated as they approach their respective releases, and these were perhaps the only forms of brutality in the season guided more by neglect than by the prejudiced nature of human emotional investments.

In the third season, the show relaxed its overarching narrative arc to show how violence works on a personal level in these inmates’ and correctional officers’ lives, while also introducing the idea of the private prison, and the miserly “for profit” economy it cultivates. Backstories and present-day narratives explored a range of prison economies, as well as the struggle of pregnant women and mothers in general behind bars, but one narrative arc in particular also offered the most devastating moment of the season: an inversion of the guard/prisoner romance plot in the preceding season, and with it a painful reminder that power imbalances always disrupt “normal” human interactions in prison.

For me, Season 3 felt by far the fluffiest–even with that one, excruciating character arc–but its ominous closer (Litchfield inmates celebrating a minor prison miracle in the nearby lake while a whole troop of new inmates march in, doubling the prison population in an afternoon) also suggested that all of the season’s background prison politics (the privatization of the prison, the rise of for-profit ventures, the major change-ups in management) had been gently set up for a difficult narrative arc to come.

And difficult it was. Almost immediately, Season 4 introduces a new kind of prison guard. No more is Litchfield overseen primarily by yokels from the general public, men and women with limited education, job experience, and aspirational chops; now a cohesive “brotherhood” of military veterans with damaged moral codes is headed up by a correctional officer from max who values dominance at any cost. Everything that overcrowding does to the general well-being of Litchfield’s inmates, in terms of entrenching ethnic factions, fomenting white supremacist and Dominican-nationalist turf wars, and generally hardening or breaking people who seek to “win” at prison, is nothing compared to the gradual degradation of all prisoners at the hands of an unchecked guard population interested only in protecting its own.

But even then, the show’s commitment to nuance persists. Not all the guards belong to this new order, and among the old is the rapist from Season 3, a man granted a rare opportunity to look himself in the mirror, acknowledge what he is and what he did, and start the long, hard work of recognizing the ugly culture of exploitative prison guards all around him. I was staggered to see the show–again, known best as a goofy, lesbian-happy dramedy–take this direction, and then to go one further: to show how the victim of the rape, when deciding to forgive her rapist for her own well-being, then has to stand up to the bullying of a fellow female prisoner who uses ultimatums to try to dictate her path to recovery. In so doing, OItNB eschews double standards: all its characters are human beings, and even if the wrong offenders are trapped behind bars, the show reminds its viewers that we as a culture have a responsibility to reflect on how to prioritize rehabilitation and restorative justice across the board–not simply to flip and therefore recapitulate the binary of good/bad, redeemable/irredeemable human beings that the current system already imposes on society’s most vulnerable.

This complexity becomes especially important when the season’s overarching arc culminates [SPOILER] in an inmate’s horrifically preventable death–not at the hands of the sociopathic guards, but accidentally, at the hand of a relatively decent human being in a situation created by these sociopathic guards, and by the prison’s cost-cutting corporate owners, and by a society that prefers to sit back and wash its hands of the whole broken legal system. The person who takes the life of the smallest, most passive, most criminally over-charged inmate in Litchfield–an inmate with love in her heart and a promising future ahead–is the same guard who earlier tried to report a brutal cage-match forced on a mentally unwell inmate by two of the sociopathic guards. He tried to change the status quo that eventually places him in the role of murderer, and a bright young woman (whose radiant smile and sense of wonder with the universe is given the last word in this season) in the role of murdered, unnamed victim.

This humanization matters, because it also explains–explains, not excuses–why the prison warden in turn makes a terrible decision, after wavering all season between false promises from his fellow board members and a desire to make a significant difference in the lives of the women in his custody. Even though this decision comes at the cost of his inmates’ already degraded humanity, the warden acts, in the wake of this terrible event, to protect a generally “good” person from being thrown to the wolves by corporate. The “brotherhood” of the season’s new guards thus becomes no different–only a bit more overt, and extreme–than the “brotherhood” that was always present among the prison staff, and the similar tribalism that manifests everywhere in society, when the right stressors are applied without any hope for imminent release.

In fact, injustice has abounded all season, with perhaps the most egregious example involving our trans character, thrown into solitary near the end of Season 3 for a ruinous amount of time, and almost lost in that system because private prisons play by different rules when it comes to the American Freedom of Information Act. It is a small but important triumph when our warden finally acts–in the wake of prisoner appeals and the relentlessness of the trans inmate’s wife–to force corporate to release her into the general prison population. But past acts of humanity do not secure future behaviour, so when that same warden chooses to protect a colleague instead of honouring the brutally taken, the whole prison population rallies (literally, in this case) for justice. And so the season ends on a cliffhanger, with a young prisoner–not yet doomed to spend her whole life in this barbaric system–holding the future in her hands. [/SPOILER]

Suffice it to say, then: Season 4 runs bleaker than its predecessors, and hits close to home with a wide range of pressing social issues in the American justice system (with secondary relevance to other models the world over). Granted, the presence of a celebrity prisoner, whose very existence at Litchfield also highlights the classist hypocrisies inherent in the whole legal system, offers a reliable fount of levity to the season’s proceedings, and the show makes sure to throw in a few amorous scenes between prisoners (among a wealth of “average” naked women’s bodies–always a pleasant surprise on TV) for similar tonal relief. Still, the season is dominated by inmates and prison staff alike who have been, or else are soon to be, broken by a difficult mix of circumstances beyond their control and choices made without full awareness of the consequences. Everyone hurts. Everything hurts, by the end.

So maybe Zizek’s Violence comes in handy after all, in reflecting on what this show challenges viewers to think about themselves, their choices, and their circumstances. Writing on the response to the “French suburban riots of autumn 2005,” Zizek asks,

What kind of universe is it that we inhabit, which can celebrate itself as a society of choice, but in which the only option available to enforced democratic consensus is a blind acting out? The sad fact that opposition to the system cannot articulate itself in the guise of a realistic alternative, or at least a meaningful utopian project, but only take the shape of a meaningless outburst, is a grave illustration of our predicament. What does our celebration of freedom of choice serve, when the only choice is between playing by the rules and (self)-destructive violence? (75-6)

Over four seasons, OItNB has tightened its focus on the ongoing horrors of American incarceration (though, as a Canadian, I also read this in light of my country’s own systemic issues). By the end of Season 4, we see how private prisons can sidestep federal protections to institute forms of modern slavery, upend the proper chain of events in criminal investigations, withhold information about federal prisoners, and otherwise underwrite a social contract that, on the surface, claims to restore safety and security to society by applying fitting punishments to convicted offenders, without being cruel.

In the process, OItNB also highlights the implicit violence in the corporate language so often employed to help individual human beings shirk responsibility for this state of affairs. To this reassuring cover of a socially acceptable language of ethical transference is then added the myriad personal factors, affecting flawed human beings at every level of institutional authority, that allow an already broken system to compound its injustices in the lives of the living, as well as the dead.

A broader social complicity, one that gently indicts and challenges the world at large, grows out from that language: the background noise, every bit as banal as it is evil, of a culture that reinforces the value of human life in socioeconomic terms. But as Zizek notes above, the real issue is less the horror of this reality than the horror of the absence of meaningful, clearly defined cultural alternatives. Where can we possibly go from here?

OItNB will be back for three more seasons–that’s the contract it’s signed with Netflix–so in a way, the show’s producers have already decided on an answer to the problem their compelling, nuanced story poses: an answer that reinforces the importance of wealth accumulation above all else. But honestly? Season 4 ended at a zenith of incoherent rage and resultant apathy and tribalism, because that’s where we as a global society find ourselves, when confronted time and again with news of systemic injustice we know about but can’t seem to change.

So I’m inclined to suggest that we don’t deserve a Season 5. The next steps–in general–are up to us. They always have been, but how especially tragic would it be if a silly, sincere TV show were to drop its next slate of episodes a year from now, replete with social change built meaningfully from its current anger, and in so doing, leave the real world in the dust. Then it really would be a silly TV show, I suppose, settled firmly in the clouds, but right now it’s something special.

Right now it’s silly, and it’s sentimental, and it’s as deep in the mud as us.

Post-Mortem of Two Stories

One of my strongest convictions as a writer is that a good theme will outlive a bad story–if you let it. If you aren’t so enamoured with a specific attempt to depict a cherished idea in a concrete way that you can’t let it go. The biggest mistake I see novice writers make lies in fixating on a single story, to the point that they would rather edit their baby to death than take constructive feedback on that story and apply it to the next story, and the next, and the next.

In some ways this issue arises from a misunderstanding of the discipline’s demands; for novice writers, just completing that first story can be a Herculean task, and the idea that one might have to start over, from scratch, can feel like resigning oneself to that same level of work all over again. In reality, the ease and difficulty of writing stories will always fluctuate, but this nuisance fact can only be learned by doing. Some days, weeks, and months, I achieve huge word counts for a wide range of material. Other days, weeks, and months, though, just reaching my minimum daily word count is a slog that makes me ashamed of wasting time. (I’ve heard, too, from authors of very successful books, that this doesn’t change upon “arrival” at any particular level, either; the brain remains fickle every step of the way. Goodie!)

Above all else, though, the inability to let go of a specific story is a trust issue–with yourself. If you believe you have a body of interests, experiences, and concerns worth sharing, worth being used to spin the fabric of whole new universes, you will have an easier time accepting that your future work, just by virtue of being written by you, will manifest those values again. With any luck, too, you’ll be surprised by how they emerge in future work, because unconscious manifestation is the surest sign that you’ve found more natural terrain through which to explore those ideas and experiences in full.

Just yesterday, I sent out two stories that then caught me by surprise, when I realized that they shared certain narrative concerns despite significant differences in subject matter and style. This was a heartening discovery, after weeks of feeling ineffectual and incoherent in my academic work.[1] In consequence–and without falling into the tedious trap of writing much about the stories themselves–I want to discuss these shared concerns. On a selfish level, I hope to affirm for myself that, even if these stories fail, the ideas will persist. On a broader level, I hope the example illustrates, for any novice writers reading this, the importance of seeing past the individual story in general.

To begin, I should mention that a common well-spring for my SFF story ideas is “tired tropes”: ideas so overdone that I enjoy the challenge of trying to make them, if not new, at least interesting again. Time travel, civilization-at-the-end-of-the-world, virtual reality: occasionally, my takes on worn-out concepts sell (“A Gift in Time,” “The Last Lawsuit,” “The Individuality Clause”), which further incentivizes a return to the well.

The stories I sent out yesterday were also “well” stories. I had succeeded with an alien-abduction story before (“The Aftermath”), but that version had been so deeply personal, written in the wake of a serious mental health crisis, that I wanted to explore the initial trope from a different perspective–namely, the social mythology around alien abduction. So, I wrote “O Mothers! O Fathers!”, which follows a pregnant teenager in a highly vulnerable social situation, and her interactions with an unusual community that seems to need her to connect with their own, missing child in another time and context.

What struck me, in writing this story, was the difficulty of negotiating a world of vulnerable people in a way that could reconcile my caution, as an author who knows how easy it is to reinforce cultural stereotypes via poor terminology or flat characterization, with the importance of being true to the protagonist’s own voice. And this protagonist is 15. Rough around the edges. Not fully realized, and constrained by the vocabulary of her surroundings. What I wanted was a story in which the readers could sympathize with her while also recognizing the complexity of the people in her surroundings–the teenaged father of the unborn child, the protagonist’s mother, the homeless persons who flock to the strange, cult-like community. I wanted, in short, a fully fleshed-out world.

I’m not going to try to evaluate the success of my attempt here–that’s for the editors of SF&F journals to determine–but I am going to say that, after I’d sent out this story, I realized I hadn’t been writing about “aliens” at all; I was writing through my discomfort with a culture that makes fun of alien-abductee stories as a safe way to deride “trailer trash”: human beings who might as well be taken to other worlds, for all that our culture does to extend notions of communal safety and security to the full range of vulnerable human beings. If I had known this was a major thematic concern while writing the story, I probably wouldn’t have been satisfied with the end result, but as it turned out, certain characters’ representations can now serve as a benchmark for my progress to date, in learning to convey these concerns above all else as effective narratives.

(NB: I should also give huge props to Independence Day, which I re-watched a few weeks back, for helping me finish this story. For all that Roland Emmerich profoundly screwed up with the white-washed Stonewall, his campy 1996 SF action flick offered dignity to a diverse range of characters and life circumstances, including a PTSD-ridden war vet and his self-supporting trailer-park family, who are presented in a way that never once uses their social vulnerability as a punchline, even if the war vet’s alien abduction story is a point of merciless derision from other characters in his community. I continue to be baffled that Emmerich could get it so right in 1996, and so wrong in 2015, but that, too, is a good lesson in being careful not to rest on one’s past successes.)

The other “well” story had been started earlier in the year, then dropped like a hot potato once I realized how uncomfortable the trope made me. This was a traditional explorer story, an historical fantasy following 1920s anthropologists on a mission of discovery. On the surface, this story didn’t seem far removed from the usual SF trope of human exploration of a wild, unknown universe–the likes of which offered plots for my first published SF story, “Saying the Names,” as well as the later “A Plague of Zhe.”

However, that similarity is precisely what made the historical fantasy version so grotesque: There’s a complacent, self-congratulatory aspect to shifting this horrific Western-colonial tradition into the allegory of “all humanity” (still usually white, with Western values) meeting, exploiting, colonizing, warring with, and even destroying alien species. In this science-fictional context, writers often call attention to the brutality of Western history, but the trick (and it is a tough one; I haven’t managed it yet) is to do so without also flattening the alien “other” (and by association, the real peoples brutalized by colonial practice) into monocultural stereotypes. And that complacency reared its head right away when I attempted an historical-fantasy variant of this classic SF trope.

Was there any way to write the story I wanted to tell without recapitulating all the grotesque abuses of non-white, non-Western people inherent in the traditional explorer trope? It’s been argued that writing anti-war military fiction is a contradiction in terms (though I’d counter with All Quiet on the Western Front and Catch-22 in a heartbeat); in the same way, even works of explorer fiction that highlight the monstrosity of the whole Western enterprise, like Joseph Conrad’s Heart of Darkness, still reinforce a profound number of racist and otherwise self-entitled cultural views.[2]

So I put the story aside for many months, until I read a work of alternate history that made me realize discomfort was not only necessary, but central to the writing of “For Fear of Little Men.” I won’t name the published story; these days, I am trying very hard to avoid fixating on the deficiencies in any writing but my own. However, I will say that, in reading the story, I was reminded why I dislike most revisionist SF&F histories: When an author appropriates the niftier elements of a given historical period, then rejiggers surrounding social elements to be more egalitarian, less classist, less sexist, less racist, the work itself ceases, for me, to be an interesting exploration of that historical context. You might as well write about old-timey revivalism in the present–which would, I suspect, offer a more thought-provoking exercise than the historical revisionist variant.

This kind of writing, I must emphasize, is very different from writing historical fiction from the vantage point of lesser-known participants–female persons, non-white persons, differently abled persons, cash-poor citizens–or writing alt-histories that keep major cultural issues at the fore. Damned good work has come from both veins of historical fiction (e.g. Nicola Griffith’s Hild, Kim Stanley Robinson’s The Years of Rice and Salt, Michael Chabon’s The Yiddish Policemen’s Union, Keith Roberts’ “The Lady Margaret”) precisely because such writers are fleshing out our understanding of humanity with all its warts left in.

The story I wrote is epistolary–had to be epistolary–so that the racism of both protagonists, two cultural anthropologists on the hunt for fairies, would be self-incriminating, as opposed to imposed by a lecturing 21st-century narrator. One protagonist is overtly racist, the other a “polite” racist, and both, by virtue of their scholastic context, are also the products of cultural and institutional racism. This self-incrimination is important because one of the protagonists is also a marginalized figure–a female person in academia–but as history plainly reports, it’s possible to be marginalized and to marginalize in turn, and I’m not going to pretend that, by virtue of being female, any protagonist becomes a bastion of enlightenment in all regards.

As with the aliens story, though, my choice of protagonists made it super tricky to write in a way that kept character biases distinct from authorial biases. For example, I made damned sure that each indigenous group appropriated by my narrators was named explicitly, so readers could go off to learn more about these peoples past and present from other sources, but because the story is told through two diaries, as read by the anthropologists’ supervisor months later, the vantage point of the indigenous guide still gets sidelined. And maybe that’s because, ultimately, this is a story about Western culture confronting the depths of its own monstrosity, worse than that of any fantastical creature we could possibly dream up–but maybe it’s also a fatal flaw in the tale.

Again, SF&F journal editors will decide whether these two stories succeed as stories. What remains for me to ascertain, at this juncture, is the range of my own preoccupations, for a few reasons:

  1. Because sometimes it really does help to affirm the reason one writes. Yes, in a general sense, I know it’s because I want to connect, to contribute to a longstanding storytelling tradition that blends a wide range of exciting and inspiring voices to explore how weird and wonderful and sometimes devastating our shared humanity is–but in a more specific sense, what exactly am I hoping to contribute? And how vastly different are these hopes from the nature of the output itself?
  2. Because, no matter what its subject matter and themes might be, a story needs to stand on its own, so if I’m getting heavy-handed in my narration, knowing why–knowing what’s at stake for me in a given sort of story–should prove useful in helping to change a behaviour that can only destroy complexity in the work itself; and
  3. Because it helps to know how coherent (or incoherent) one’s preoccupations currently are. If the best I can say, after reviewing completed work, is that clearly I like talking about X but can’t for the life of me explain why, dollars to doughnuts those narrative concerns aren’t going to manifest coherently in future work either. Just knowing that I’m incoherent when I try to write about X doesn’t mean I’m going to be able to address the matter right away, but it should help me figure out when to shelve a given story, and move on until I can.

Moving forward from these two stories, I can already see that the novel-in-progress is going to be trouble, unless I can more seamlessly naturalize a few of my representational issues in the short fiction first. The SF mystery, which I’m hoping to complete by summer’s end, is rife with potential narrative traps: a planet populated by indigenous aliens, a human underclass, and a human overclass; a nearby space-station filled with people from familiar (Earth-based) ethnic backgrounds; a protagonist-couple that could oh-so-easily fall into the “men act, women dream” stereotype, among others.

Obviously, my aim is to tell a good story, and I do think I’m getting to be a sure hand at sustaining suspense and building socially significant mystery plots. My issue is going to be stopping myself from trying to impose structural fixes after the fact, from a place of authorial bias, and instead simply trusting that my characters’ complexities–their smallnesses and their surprising depths of insight–will emerge of their own accord. If I let them. If I’m willing to let any failed attempts in the long process go.

Trust in oneself; trust in one’s readers; trust in the potential of any world, any concept, any trope, to rise above its surface limitations: These have to be the most difficult elements of growing as a writer–more so than self-discipline, more so than resilience, and more so than a persistence of joy in the work. Without trust, all your hammering away at daily word counts and submissions queues is still only going to produce a body of didactic and/or reductive work. And sometimes didactic or reductive work gets published! (I would certainly write some of my stories differently today.) But I can’t think of many writers who are satisfied with stagnation, or many who are rewarded long-term in this business for resting on the laurel of “good enough.”

The trouble is, like anything else that relies on our nuisance brains, trust ebbs and flows. The important thing, then, is to work towards its reclamation–story by story, word by word, day by day–whenever it falls away. And oh, it always falls away. But when it’s present, you’ve got a golden opportunity to develop as a writer with greater intensity than usual.

If you love what you do, and life doesn’t get in the way, you will take it.


[1] Technically, I’m still in thesis-mode, but I’ve slowed down a bit, extending my revision deadline to the end of July so that I can reclaim the joy and self-confidence I’ve plainly lost in my academic work. This, I’m doing by reading (passionate writers of non-fiction and philosophy, mostly) and writing every day–sometimes thesis, sometimes essays, sometimes poetry and prose. It’s all useful if it keeps me in daily writing practice.

[2] Chinua Achebe writes best on Conrad’s self-defeating racist underpinnings in “An Image of Africa: Racism in Conrad’s ‘Heart of Darkness’.” I would add that I think Conrad’s more aware of certain forms of racism in his work than is suggested here, but writing his way out of them (even if he wanted to) is another matter entirely.

How We Go Forward, and When, and Why

The following is a reflective essay that begins by invoking a terrible event of sweeping reach: geographically, temporally, and culturally. I move to other matters, both abstract and personal, soon after, so first let me call your attention to this fundraiser, hosted by Equality Florida in conjunction with the National Center for Victims of Crime, for survivors and surviving families of the Pulse Nightclub Shooting. If you’re in a position to donate, please do. Whether or not you’re able to donate, please continue to spread kindness in other ways in your respective worlds. The helpers and the lovers of the world must triumph.

Okay. Reflection time:

From Real Trauma to Abstraction: A Philosophical Cop-out

I only needed a few minutes to process the initial data surrounding the June 12 mass shooting in Orlando, Florida: the number of casualties, fatal and otherwise; the number of difficult labels ascribed to the shooter; the number of people lined up to donate blood and funds; the number of pieces of legislation against queer persons in the last six months; the number and size of mass shootings and related attacks since 9/11; the number of foiled pieces of legislation surrounding firearms regulation. Numbers are easy. Implications are tough.

After, I went for a short bike ride in a vicious wind and picked up Slavoj Zizek’s Event (2014), a Philosophy in Transit text that addresses (in Zizek’s usual, breathlessly free-associative way) a range of ideological engagements with the idea of an “event,” an “effect that seems to exceed its causes” (3). What I wanted to understand, I realized halfway through the frenetic prose,[1] was my own relationship to this latest event, and others like it: how some situations in the news, as in my personal life, offer stimulus for re-evaluation of the self and notions of community, while others do not.


If the impulse to read philosophy after learning about a devastating world event sounds cold and abstracted, I agree. On June 11, I read and laughed at a year-old article describing the practice of bibliotherapy–and yet, the very next day, there I was, “self-medicating.” I will add, though, that I haven’t felt like myself in the last two weeks: or, rather, I haven’t felt I can trust the verdicts my brain throws out right now. For instance, in the relative isolation of dissertation lock-down, I have felt with absolute conviction that a) I have nothing to offer the world as writer, researcher, or teacher, so b) I should give up entirely, and c) these are perfectly reasonable convictions because I am a well-trained reasoner, so there. At the same time, I have developed a thin skin for criticism in the wake of some tough literary feedback, and now have a nuisance habit of bursting into tears at the slightest corrective received via text. This suggests that I have entered a period of unreliable self-narration, and inclines me to want a second opinion about anything I encounter that seems deeply affecting.

In particular, when the immensity of this mass shooting struck me, my first thought was simply: I am wasting my life. Followed quickly by: Well, that was selfish of me. When I parsed these thoughts, I realized the logic of the first went: “These people were killed for being present in their lives. Their lives were brutally cut short for being lived first with a fullness I doubt I can lay claim to myself. Life is precious and I should be doing more with the time I still have.” While this is all true in some sense, the second thought came from extreme discomfort with a split-second rationalization that turned a senseless waste of human life into something instructive, something ultimately about me.

So I got to wondering. I wondered at how often all of us–not just “the government” or “the media,” but people, individuals even in the middle of active engagement with a given situation–inhabit these moments with an expectation of personal and social transformation. Does the rhetoric of disaster, past, present, and imminent, disproportionately guide our decisions to reshape ourselves and our communities? Is disaster, or at least the threat of disaster, a precondition to seeking meaningful and sustained change? And if so, is that a cultural expectation, or something intrinsic to being a human being?

Legislative discourse is the most obvious ingrained association: Horrible thing happens. Public outcry asserts, “Something must be done!” Something is then done or not done. But is the same true on other levels of social activity and routine? New Year’s Day happens. A new year prompts New Year’s Resolutions. Those resolutions are kept or (more often) not kept. Subsequent holidays then prompt other event-based challenges to notions of self and community, such as your stance on romantic conventions; or the obligatory celebration of family; or highly secularized and commercialized religious days. And these are just the official events that mark the course of our lives. Just how many of our decisions are driven by less obvious Pavlovian cues, and would we make better, more ethical choices if we could somehow make personal and social decisions without specific inciting events in mind?

Too Abstract: Rein It in a Bit

I began this reflective exercise with a news event of great social significance, knowing full well the perfunctory conversations that have already reared up around race, nationality, immigration status, masculinity, creed, firearms regulation, homophobia, and mental illness. As easy as it is to get caught up in knee-jerk responses to any one of these, though, I know that my current schism precedes this difficult slew of social discourses–which is something of a relief, because it means that instead of talking about, say, the throwaway use of the word “bipolar” in relation to the shooter, and how this framework may or may not limit perceptions of personal agency, I can steal a trick from Zizek, and speak through older cultural references (though I promise, I’ll only use one[2]):

In particular, every Star Trek series requires at least one standby who embodies all our anxieties about humanity through Hegelian negation: sentient containers of all things “non-human” that first complicate our initial definitions of humanity, then resolve into a shared questioning about the nature of being, which itself serves as our new definition of “being human.” For passionate Kirk, there is logical Spock; for lover-of-the-fine-arts Picard, there is too-many-decimal-points Data; for hot-blooded Sisko, there is golden-ooze Odo; and for tough-love-cures-all Janeway, there is the ship’s Emergency Medical Holographic program, The Doctor.

In “Latent Image,” a season-five episode of Star Trek: Voyager, The Doctor comes to realize that his memory files have been tampered with, but the reason why is both human and non-human in ethical scope: It emerges that during an emergency medical situation, The Doctor had to make a choice between treating one of two patients of equal need, viability, and professional importance, and he selected the one who happened to be a personal friend, leaving the other to die. Attempting to construct a deeper logic to this decision traps him in a processing loop, which leads to questions among the human crew about whether The Doctor, as an AI, is entitled to the time a human would need to sort out a similar “Sophie’s Choice” trauma.


The complexity of this episode lies in its negotiation of certitude as a possible component of the human condition. Is The Doctor’s struggle for a perfect understanding of his biases and motivations really any different from our own? I, for one, don’t know why I expect there to be a deeper logic to many of my own decisions; I know enough about the way the mind wavers between possibilities to recognize the power of habit in tipping the scales without my conscious knowledge more often than not. Still, in a few situations over the last two months, I have found myself troubled by “final straws” that led me to sever ties or otherwise instate new, firm boundaries between myself and draining people in my life. What baffles me is the knowledge, on one level, that these “final straws” emerged many times before in my interactions with certain people–but at those earlier junctures, I chose different actions in response. Why? What transformed my trial runs into fully realized course corrections?

Some would argue that what matters is simply that I did make the choice to move away from more destructive interactions. But incomprehension regarding the whens and whys leaves me with less confidence, not more, in my ability to avoid making the same mistakes down the line. If I don’t understand the mechanics of my own transformation, how can I trust in its sustainability? Doesn’t the habit of persisting in unhealthy situations still remain more fully formed than this new routine of letting go?

The Academic Angle

This self-doubt has been amplified by academic feedback that, at its core, challenges my competence as a literary critic (and thinker in general), and my charity as a writer. These are tough accusations to swallow–meanness, pettiness, arrogance, condescension–and I have spent the bulk of the last two weeks trying to process them on a deeper level. I certainly don’t mean to come off as any of the above, but that’s even more alarming: the fact that my writing would veer so far from my intentions suggests a profound lack of authorial control. So how did this happen? I know I spent most of the last year trying to establish a baseline of understanding for critical readers who struggled with my writing on a sentence-by-sentence level, and what I produced next seems to have become overly particular, “rigid,” “uptight,” heavy-handed and defensive in the process.

Or was it always? At the proposal phase for this dissertation, I was criticized for constructing a “negative imprint” of my argument: for establishing my work through gaps in other research in ways that were read as actively hostile to preceding scholars. I was seen as quarrelling with, instead of building out from, the work of others in my field. Again, not my intention. The pressure on a graduate student at that juncture in the process is simply (beaten in over and over) to prove that one’s work is of the utmost importance, even if it’s a project about 19th-century histories of science. Why does it matter now? What gaps in the extant scholarship does it fill? Be strong. Be confident. And I thought I was. I thought I was no more decisive than other writers I’d read in my field.

Regardless of my intentions, though, that decisiveness came out as the wrong kind of stridency then, and still seems to be the wrong kind of stridency now. And that is unsettling to hear: an awareness of my subconscious tendencies, made manifest on the page, that I keep rediscovering second-hand. So what is it about me–my thought processes, my internal debates–that emerges in so reactive, so contrapuntal a written manner in the first place?

Now that these behaviours have been so thoroughly pointed out in my academic writing, I also hear them in my casual interactions all the time: How, even when I’m asked my opinion about something I enjoyed–a book, a movie, a song, an experience–I begin with the negative before immersing myself in the positive. I think I’m simply solidifying the strength of my ultimate endorsement by initially ceding ground, allowing for the possibility of error, and generally pre-empting accusations of credulous, wholehearted acceptance of a given product or idea. In actuality, I end up giving the impression of being overly critical, or of disliking the topic of discussion entirely. More often than not, I then spend most of my time defending my initially critical remarks instead of my ultimate praise. And that’s for the stuff I do enjoy.

So how did this habit of thought and written word arise, if it isn’t my intention to be disagreeable? A charitable reading suggests that I gained this tendency from spending most of my formative years on the argumentative defence–which likely isn’t inaccurate, considering how ugly the verbal sparring in my household could be while I was growing up. However, a less charitable reading suggests that, irrespective of childhood circumstances, I as an independent adult remain partial to the easy way through philosophical and cultural discourse. Is it intellectual laziness? Am I more interested in dissecting what is wrong about others’ arguments than in championing what is right?

Whatever the answer, I find myself at a current loss as to solutions. I do not wish to be disagreeable without just cause. I do not wish to be regarded as mean, pedantic, uptight, patronizing, defensive, or otherwise arrogant in either the construction of formal arguments or my response to arguments that have come before. I want to write–for academics, for fiction readers, for poetry readers–from a place of wonder, a place that doesn’t punish a mutually felt desire to connect.

But right now, something about the cadence of my thoughts seems to turn destructive too often in speech and in print. My fiction depresses without offering opportunities for growth. My academic work scorches where I want only to spark. And so the same brain that turns against its host all too often–that turns against its host even now–is breeding confrontation in all the wrong externalities, too, in a world that already has plenty to grieve, plenty to lash out against, and plenty of reasons to champ at the bit for radical transformation.

I am wasting my life, in other words. However, this is a truth that should be confronted on its own merits, and not merely in relation to the news of the day. Waste is simply a quality of my existence, which I should be making strides, always, to reform. And I am, I hope. But the going is slow right now, and perhaps it’s so slow because I know I’m a creature who seems to long for more definitive and external metrics of change than ever exist outside movies, and TV, and even the messy, meandering works of Zizek.

How silly it is, to expect that a personal desire for narrative coherence would align with the dread cadence of the world at large.

But then again, how silly it is, and how wonderful, how precious, how intrinsically and relentlessly transformative, to be here and to be human at all.

[1] I suppose I should note that Zizek would likely align this gradual arrival at an understanding of why I sought out this particular text at this particular time (though it has always been readily available in the bookstore where I work) with his discourse on how all our decisions are unconscious states to which we can only ever be acclimated after the fact. In Event, his example of this occurrence relates to the discovery of being in love, which always succeeds the actual instance of falling in love (149-50). Here, I wrangle with the inverse: an awareness of needing to fall out of X, which also exists at a remove from the actual instance when I will, or do.

[2] The golden rule stands: Never go full Zizek.

[3] Final word goes, again, to this wonderful fundraising page. Be kind to yourselves, and each other.

Goals for the Rest of 2016

Progress comes in many forms. Personal growth does, too. In the past five months I’ve waged battles for balance between dissertation work and fiction work, by alternating long runs of academic writing in “lock-down” mode (limited social contact, closed social media accounts, and a number of nights working until 5am) with shorter, frantic bursts of fiction and poetry. June 1-30 is just such a lock-down month, with the dissertation requiring major revisions in the next four weeks. This schedule seems the most efficient way to complete my PhD without burning out, but the compartmentalization has made it easy for me to feel like a failure in other ways.

In particular, I have made a mess of my writing practice by lurching so dramatically between lifestyles, and the infrequency with which I get to submit new short stories (plus the limited range of markets available for the works I do write, because of story size and type) means that I’ve taken rejection this year harder than I should. I know I have a lot of growing to do with the writing itself, too; I keep churning out short novelettes set in heavily researched environments, which groan when read by editors–a few of whom have noted that the stories are at once overburdened for their size and yet fail to satisfy all their secondary-character and side-plot teasers.

The problem, I suspect? They’re suspiciously similar in word length–around 8,000 to 10,000–which is also the size of most of my dissertation chapters. Dense, uneven prose isn’t the aim even of my dissertation (“hah,” you might say, but it’s true–most of my rewrites have to do with “softening” my language and context-building), and that kind of density doesn’t often work well in gripping narrative, either.

Now, this failure might be promising, in its own way; I’m inclined to think that the level of research and heft of storytelling in these 10,000-word pieces are an indication that I should get cracking on a proper novel. I technically have one on the go, but in July I hope to complete a slightly easier project, by turning an SF-mystery novelette into a book-length read–from 9,000 to 90,000 words. In this way, I hope to start transitioning my failures from the first half of the year into real growth for the second.

Other goals for the rest of the year will include a great many more non-fiction pitches (especially while I’m working on academic article submissions), and a return to shorter fiction (real short stories, ideally 3,000-5,000 words in length!).

What I also need to do, though, is nurture more solitude for my writing process. Drama makes me sick at heart, engenders resentment and rudeness, and fosters other social behaviours I truly despise in myself, like expressing grievances to anyone but the person who has left me so aggrieved in the first place.

At its best, writing is a reflection of my desire to connect earnestly and openly with the world. When the processes by which I seek to better my writing instead generate turmoil, and when they distract me from the One True Teacher (i.e. the submissions process itself), I need to take a good hard look at whether I’m becoming a worse human being as well as a worse writer, for all my efforts to improve at both.

So! I have one month to revise my dissertation into a work that shows more generosity of spirit, more joy, less defensiveness, and more “softness” in general. After that, I have the rest of the year to apply those lessons to the rest of my writing practice: to foster a more exploratory approach to improving as a writer, to reclaim a sense of wonder in everything I do, to diminish interactions that introduce destructive negativity into my day-to-day life, and to reconnect with the inevitability of rejection in a way that coaxes strength enough to weather the many and long writing years to come.

Wherever your own writing and “better human being” practice finds you, I hope this summer is terrifically productive for you as well. Best wishes and good skill to all.

On Being Pigeonholed

A funny thing happened last night, which I’m hoping to articulate here in a way that pays no disrespect to the other writers in the local SF/F/Horror group I run. What I want to discuss is an issue that comes up frequently for writers, and which simply seemed to find a particular voice last night, in relation to commentary on a story segment I’d submitted.

Now, I should mention that there are a few oddities about my engagement with this group, because I generally dislike workshops; because I maintain that the best teachers are the short-story markets themselves; and because I’m easily the most experienced in the group in terms of writing with editorial preference in mind. As such, I usually share only the opening scene of a story with the group, to see if I can hook lay-readers with the concept. The story is fully drafted or nearly so before the group meets, and sent to its first potential market before or soon after our meeting. This routine is critical for me, because the group is not made up of professional editors, so I have to be careful not to prioritize other tastes over my impression of what works for a given market. (The group’s feedback becomes most useful when I’m generating my next story.)

In the case of our most recent meeting, though, passive impressions cohered in some thought-provoking ways. First, a week ago I’d posted to our private channel a segment of “fantasy, of all things.” Then a fellow in the group cheered triumphantly, claiming that the fantasy writers of the group had finally pulled me over to the dark side. I didn’t correct his impression that this was my first fantasy story, or even the first fantasy story I’d shared with the group. I’d submitted one medieval piece before, about a world in which the dead literally sang, but some members had joined the group after that story, so my current medieval work was, to them, my first foray into the genre.

Criticism is, of course, perfectly welcome. I am 100% comfortable with someone not liking a work, and the segment’s dialogue certainly did need the polish I’d given it between excerpt-submission and completed final draft. A reader thrown out of the narrative early is always a bad sign, too, as I learned from the “bovine face” incident with another work, so a lay-reader’s hesitations always have merit on some level, even if the solution is not always clear.

What struck me last night, though, were the concerns about authenticity–namely, that after reading so many excerpts from my SF, this story wasn’t written in “my” voice, whatever that might be, and so lacked a certain je ne sais quoi despite being technically strong. There really wasn’t anything to do with that comment, except to toss the story entirely (which, of course, I didn’t; it’s going out today in pretty much the form I had prior to the meeting), which is why I find myself reflecting on that comment in another way.

In particular, I say I was “struck” because this comment touched upon a common fear: the fear of being pigeonholed; of being seen as the wrong authorial “fit” for a given story. At a time when SF/F/Horror celebrates the rightful inclusion of its full range of writers and experiences, I find myself struggling with how to maintain my personal dislike of being evaluated on anything other than the level of the story itself. I am a female-sexed person with a queer sexual orientation, and I write. Being treated as “queer female writer” feels to me like being put in a subordinate box. Others happily embrace these boxes, for reasons that make perfect sense: sometimes when people put you in boxes, you own those boxes. But that’s their perfectly legitimate choice, not mine.

And I routinely wonder how best to navigate the world with my choice in mind.

For the last seven years, for instance, I’ve achieved a certain number of SF sales, but in the last ten years I have also submitted a great deal of realist fiction, and struggled (with varying degrees of success) to have patience through universal rejection. I have come so close so many times, receiving very kind third-round rejections, but when I study what gets published I’m often surprised I’ve managed to get even that far. I do not write MFA-style fiction, with its focus on the lyrical sentence as the fundamental storytelling unit. I also dislike the popularity of relentless interiority, a kind of “telling” done deftly in major publications by relaying background detail in more provocative and dreamlike ways. The stories I favour–externalized; building impressions of a situation mostly through character actions viewed at a narrative remove–certainly exist in print, but for whatever reason my own attempts have never been up to snuff.

To make matters worse, because of the culture in which I live, when I write in a less internal, lyrical manner, I keep pushing against a nuisance fear that my perceived gender is working against my perceived “authenticity,” especially since these starker, more direct tales most often take a male protagonist.

I waver in taking this fear seriously, of course. Absolutely, there is a gender bias in fiction–propagated by male and female editors alike–against considering the writing of female persons in the same light as writing by male persons. And absolutely, I grow heartsick over stories like Catherine Nichols’, who wrote of her experience using a male pseudonym as follows:

So, on a dim Saturday morning, I copy-pasted my cover letter and the opening pages of my novel from my regular e-mail into George’s account. I put in the address of one of the agents I’d intended to query under my own name. I didn’t expect to hear back for a few weeks, if at all. It would only be a few queries and then I’d close out my experiment. I began preparing another query, checking the submission requirements on the agency web site. When I clicked back, there was already a new message, the first one in the empty inbox. Mr. Leyer. Delighted. Excited. Please send the manuscript.

I sent the six queries I had planned to send that day. Within 24 hours George had five responses—three manuscript requests and two warm rejections praising his exciting project. For contrast, under my own name, the same letter and pages sent 50 times had netted me a total of two manuscript requests. The responses gave me a little frisson of delight at being called “Mr.” and then I got mad. Three manuscript requests on a Saturday, not even during business hours! The judgments about my work that had seemed as solid as the walls of my house had turned out to be meaningless. My novel wasn’t the problem, it was me—Catherine.

I wanted to know more of how the Georges of the world live, so I sent more. Total data: George sent out 50 queries, and had his manuscript requested 17 times. He is eight and a half times better than me at writing the same book. Fully a third of the agents who saw his query wanted to see more, where my numbers never did shift from one in 25.

But for all that I sometimes wonder if a male pseudonym would sell my realist fiction, I am too stubborn for the experiment. If the writing is good enough, my name and sex shouldn’t matter, so I just have to keep improving my stories. If the writing is good enough, and the market conditions are favourable, something of mine will sell in that field eventually. And if it doesn’t? Well, writing practice is still writing practice, in any and all forms. That’s my perspective, and even if it is holding me back, I’m sticking to it.

(This is also why I do not submit stories to calls limited by gender or queer status, even though I have absolutely no problem with those who do, because I understand that activism towards a more equal society comes in many forms.)

But comments like the ones I received last night, about the authentic fantasy voice for a writer of published science fiction, reminded me of the other nuisance fear I have: the fear I confront every time I write an author’s bio for a cover letter. Do I or don’t I mention, when submitting realist fiction, that I have been published before–extensively, but in a more prominent genre category? Or do I play into the belief that SF/F/Horror writing is somehow lesser, and that a writer who publishes in those realms is somehow not as serious as those who publish realist or magical realist fiction alone? That a writer of SF is somehow less suited, less appropriate, for the writing of mainstream fiction, too?

I’ve played it both ways to date: Sometimes I don’t mention any writing credits; sometimes I mention my latest SF works, and try not to sound defensive. No bites with either approach yet, which should be an indication right there that I need to stop worrying about how other people will evaluate my work through knowledge of me, the author. But of course, worries don’t tend to respond to reason. We don’t live in an age of anonymous publication anymore; the author is everywhere! The author sells the vision of their work in extensive interviews and readings and online persona management. So how on earth am I to avoid being read in certain ways, with certain preferences and identities?

The answer, of course, is that I can’t–but I can limit how much I internalize such pigeonholing. Whether or not others make judgments about my writing based on biological sex and sexual orientation, I don’t have to (and I choose not to) focus on anything but writing better stories going forward than any I’ve written to date. Similarly, whether or not others make assessments of my authenticity based on the “lane” of writing in which I predominantly publish, I can and will keep writing the stories that matter to me, in whatever genre I prefer.

After all, at the end of the day, my hope is to write stories that resonate with strangers. If that moment of triumph occurs, they’ll obviously know more about me than I’ll ever know about them–because they’ll have all my vulnerabilities plainly writ upon the page, and can make whatever determinations about me that they like. But if I’ve done my job right, if I’ve written a story right, then I’ll have touched some chord of shared humanity that goes beyond all other labels. That’s all I’m really after. That’s the core of why I write.

With that in mind… I guess I still have a lot of work to do, and a lot of frivolous worries to put aside, in order to get it done.

Best wishes to all of you, in yours.

Two Paths to Short-Story Publication

This is an amusing post for me, because I haven’t submitted much writing this year (The Year In Which I Will Finish My PhD), and in consequence, I only have one poetry acceptance under my belt for 2016.

Nevertheless, I have enough publication credits for short stories to feel comfortable offering some strong suggestions to those seeking to publish their fiction in magazines.

I’m offering these suggestions because I know the sting of rejection–a normal part of the process at any stage of authorial development–and I hate to see others setting themselves up for more rejection than necessary.

In my experience, there are two paths to publication in magazines. Well, three if you count the rigged contests for friends (which I have seen, and which suck)–but two for us regular folk, playing the game with some measure of integrity:

  1. You write the story you love, and after, submit it to magazines that seem a close stylistic fit; or
  2. You study the magazines you love to read, and you write a story you love to match.

There are pros and cons to both positions.

If you take the first approach, finding the right magazine for your story means being flexible with other factors–like whether the magazine is a pro-rated publication, or well-established. If you take the first approach, you keep submitting that story to as many places as possible, however little they might pay or promote your work, until someone believes in your vision.

If you take the second approach, you are writing for a specific market, with the knowledge that there will be maybe a couple of other markets that fit the story if/when the first market rejects your work. Likely as not, the market you’re targeting is pro-rated and highly reputable, so it’s a great boon if you get in–but you run a much higher risk of binning your stories if you do not.

I have taken the second approach far more often than the first. As such, I get rejected a great deal. I run out of markets a great deal, too, and then have to bin stories I love. Nevertheless, the rewards (when I write a story that gets accepted by my target market) have made all the preceding hardships worthwhile.

I used to take the first approach, but quickly discovered its secondary cons. In particular, whenever I’ve been accepted by smaller publications, I’ve risked the publication folding before releasing my story (it happens!), or the editors getting squirrelly about payment (also a sad reality). I’ve also encountered editors who were far more likely to want to rewrite my stories–and not just for legitimate incoherencies in the text, but over extremely persnickety matters of personal preference and vocabulary.

Good editors–editors who take only the work they truly believe in, and respect the author in the process–certainly exist at all levels of the publishing world, but in my experience your odds of finding a good editor are much higher among the pro- and semi-pro ‘zines. Because I wanted to work for and with professionals, I moved to the second option. I read pro- and semi-pro magazines avidly, decided which ones I liked, and started keeping specific publications in mind from the first planning stages of future stories.

Now, I have to laugh (or I’d cry) at how many people get upset when I tell them that I write a story with a specific publication in mind. First of all, it’s my writing, so it’s my choice. Full stop.

But secondly, there is no rigid binary here: It’s not a choice between “writing the story I love” and “writing the story that will sell.” Sometimes I write a story I love that doesn’t fit with any market, and that’s hard, but the crux of why I write is to convey meaningful ideas–about what being human is, about the struggles human beings face–and these ideas are all present in the “story that will sell.” I’m not writing things I don’t believe in “just to get published.” Indeed, I suspect it’s almost impossible to write something publishable that doesn’t also resonate somehow with a thematic preoccupation near and dear to me.

Thirdly, my way is not the be-all and end-all of ways. If you’ve written a story you believe in, absolutely, stand by your story! Just don’t expect that it’s going to get published in a top-tier magazine. It might be, and cool beans if it is! But if you love your story, also be prepared to go to bat for it at any number of smaller, lesser known ‘zines until someone groks your style.

Remember:

  1. Whichever path you choose, you need to read the submissions guidelines. An editor is a human being, with specific preferences in general, and even more specific preferences for the publications they’re running. Honour their explicit requests not to see, say, any werewolf-themed YA romances until the year 2053. 
  2. Whichever path you choose, you need to read the publication itself. (This is CRITICAL, because submissions guidelines are often vague and misleading; a magazine that says it wants “experimental” fiction might mean “the kind of experimental fiction that is actually status quo in writing workshops right now” or it might mean “genuinely weird and risky work, the likes of which you’re not going to find anywhere else.” Only reading the magazine will give you a sense of what the editors mean by certain adjectives.) And finally:
  3. Dumping your story in all submissions queues is not a neutral move. If you do not demonstrate knowledge of a given magazine’s explicit or implicit preferences, just relentlessly submitting modern-dystopia zombie stories to a second-world fantasy magazine, you’re going to become known by the slush pile readers and editor, and not in a good way. Not in a way that encourages any of them to take a chance on your future work.

Now, occasionally, I do take risks on unknown publications, but these risks tend to uphold my underlying preferences when submitting fiction.

At the end of 2015, for instance, I submitted to Liminal Stories, which had yet to release a first issue at the time. Since there was no issue to read, I looked instead at the editors of this new publication, who all had significant professional backgrounds in the discipline, and I read their submissions requirements carefully. I’m not sure if what they currently have on the site is the same as what was present in December, but the description below is still a pretty good example of why submissions guidelines alone are not enough:

Liminal is searching for stories of a particular tone and tenor, regardless of form.  We like stories that are strange and unsettling, sharp-edged and evocative.  Although we will consider any genre, we have a soft spot for weird fiction, magical realism, soft science fiction, and those uncatagorizable stories that straddle the line between genres.  Liminal stories should linger in the mind and evoke emotion in the reader.

That first line says it all: The editors are looking for a “particular tone and tenor,” meaning that they’ll know what works for them when they see it. They offer a range of possibilities, but it’s rather open-ended: “weird fiction, magical realism, soft science fiction, and those uncatagorizable [sic] stories that straddle the line between genres.” You can get a sense from this list as to what won’t fit right off the bat–hard military SF, rigid secondary-world fantasy, traditional horror–but what will fit is, well, up to the editors’ preference. 

My story got to one of their final rounds of consideration, and received an extremely positive rejection that only proved bittersweet because this publication had been my last hope for a story (now binned). But–amusingly–when I saw the works they’d accepted for their first issue, I realized that if I’d had samples of their preferences on hand at the outset, I’d never have submitted to them in the first place. Abstracted, atmospheric, outside of time and place even when immersed in history: their preferred stories remind me of work from Shimmer, another place I rarely think a story of mine would make a good fit.

To be perfectly clear, though: My assessment of “fit” is by no means a slight against the publication! These editors found the works that they loved, and they went to bat for them, which is the mark of true professionals, and bodes well for the continuation of another excellent SF&F market. Nonetheless, the experience proved a good reminder of why I prefer to have read actual work from a magazine before submitting–because there’s always a limit to what submissions guidelines can tell you about editorial preference.

I suppose I should also address the elephant in the room when it comes to these paths for submission: You need to be honest with yourself about what you want from publication. Do you truly, in your heart of hearts, just want to see a story you love and take pride in find a home? Then take the first path: pitch it to places that seem a good stylistic fit, irrespective of all other considerations.

But if you have deeper motives–if you want to be paid for your work, if you want to be published in top-tier magazines, if you want to become a member of a professional writers’ association–then own those motives, and accept the more complicated path that those motives entail. Take the second approach. Read professional magazines carefully, and write with specific venues in mind.

Yes, I know this goes against the cherished belief in a higher spirit of artistic practice that will simply compel readership and accolades, but tough cookies. Any artistic outlet is also a community. It’s an ongoing conversation–a wonderful, wild rumpus of a discourse!–between readers and writers and editors who share a vision of how best to interpret and delight in and otherwise navigate the world.

So if you want to tell the stories that matter most to you in a way that reaches the widest possible audience, you need to behave precisely as you would if you were standing in a room filled with ongoing, in-person conversations: Either hold true to the conversation you want to have, and keep circulating the room until you find the group to match, or listen in to the group you’d most like to join, and find your own way to lend your voice to its established flow.

It’s that simple, and that relentlessly complex.

Good luck, good skill, and above all else, good reading!

The Lobster: A Tale of Frying Pans and Fires


When I first heard about The Lobster, in a December 2013 article hyping movies of note in the year to come, I had no idea I’d have to wait two more years to see this one in theatre. I also had no idea how tenderly comic its misanthropy would be. The original blurb simply noted, as most summaries do now, that The Lobster told the story of a world where people had 45 days to find true love, or else be turned into other animals and released into the wild. Now, before I go any further, I must emphasize that this film is not for everyone: A wild donkey is shot in the opening scene. Rabbits die, and a dog, and there’s a failed suicide attempt played for laughs in the middle, too. But I adored this strange little film.

The Lobster, written and directed by Yorgos Lanthimos, has a surprisingly Hollywood-ized cast, including Colin Farrell as our recently divorced protagonist, David; Rachel Weisz as an unnamed but critical character, providing voice-over narration as well as stage presence in the second and third acts; and John C. Reilly as a fellow hopeful at the Hotel where David first stays. Nevertheless, the film has a steadfastness, a playfully erratic approach to narrative voice, and a relentlessness with the difficult ambiguities of our interactions with one another that betray its true European heritage.

Despite its eccentricities, though, The Lobster also follows a fairly secure three-act structure–which, in combination with Colin Farrell’s tremulous sincerity, bleeding through even his own deadpan lines, gives the viewer plenty to cling to even as the plot twists in unexpected directions. So, yes, the film’s first act absolutely takes place in the Hotel, which regulates singles attempting to find true love in 45 days (with extensions granted to hotel guests who successfully tranquilize wild loners on routine hunting trips). However, the film’s second act then takes place among the loners in the wild, who listen to electronic music because they’re loners (obviously!), and who plot dramatic schemes to reveal, 1984-style, the fraudulent harmony of known couples in the Hotel and the City.

The last act then follows an attempt to break from both extremes (in keeping with the film’s constant teasing of rigid cultural binaries), and in so doing offers the case of a single relationship, where two people think they’ve found their soul mates because they have something in common. When this commonality is threatened, the film poses a splendidly difficult question about how far a person can/should/will go to restore that sense of harmony. I certainly won’t spoil the ending, but I will say that the lead-up in that third act reminded me very much of Solzhenitsyn’s Cancer Ward, in which a microcosm of Russia is gathered to be sick and maybe to get well; and in which one beautiful young woman, fated to lose a breast to cancer, builds a tender connection with our protagonist while in hospital; and in which the final third of the novel leaves the reader in the highest suspense as to whether the young man will keep his promise to reunite with her once he, too, passes in triumph from the shadow of death.

I should of course note that The Lobster does not paint all relationships as false enterprises; at least one older couple seems quite content and in sync, and remains unblemished by the many absurd, sometimes violent, often demoralizing deceits and betrayals that mark the rest of the film. However, what the film does do–quite splendidly–is highlight and gently pillory our cultural anxieties not just about our own partnership statuses, but also about the partnership statuses of everyone around us. In the meantime, the film also notes how friendships, alliances, and other bonds are marginalized despite their equal importance to anyone’s attempt to survive in this world. No man is an island, even if he can dig his own grave, but neither can a man simply will himself into bliss among fellow human beings. What a quandary these two facts provoke.

In consequence, though becoming another animal seems to many in the film a sign of defeat, The Lobster itself suggests that there are worse things, far worse things, we do to ourselves and each other in an effort to find our place, than might ever befall a hapless crustacean. If this film is misanthropic, then–and I strongly believe it is–it remains the best kind of misanthropy: The Lobster sees us at our worst, and hates the whole, strange, messy project of humanity enough to play us straight. Amid that brutal honesty, though, any flicker of surviving light becomes all the dearer for what we’ve seen it filtered through.

Maybe that’s not enough for most viewers, but it proved a feast for me.
