A short documentary film on the first month of the anti-Trump resistance. This is the first of a series that I call “History on the Run.”
I don’t know what kind of a person Woody Allen is. Actually, I don’t even believe in “kinds” of people – you know, racists, sexists, rapists, good, bad, ugly. I have seen people I otherwise like engage in awful behavior, and people I find difficult have impressed me with unexpected acts of kindness. So let me rephrase it like this: I don’t know what Woody Allen has done, aside from his work in cinema. I find some of it compelling, and some of it boring, and some of it self-indulgent.
Allen’s talent was initially a large part of what some internet commentators are disingenuously calling a “controversy” over his daughter Dylan Farrow’s accusations of sexual abuse. (If you are somehow unaware of this, come out from under your rock and do a quick Google search.)
At first, the issue was whether one can appreciate the films of a sexual predator. (We’ve been here before, with Roman Polanski and similarly ambiguous conclusions.) We were asked to contemplate whether Woody Allen deserved his Lifetime Achievement award at the Golden Globes, following tweets from two members of the Farrow clan excoriating the decision to honor him. The world seemed focused on the moral conundrum of praising a flawed genius, or supporting the work of a likely criminal, who was investigated but never charged. It was about the man and his actions. For some, it was an assault on his character, whether it was merited or not.
Then, after Dylan Farrow wrote an open letter to the New York Times, the dialogue turned from Allen-shaming to slut-shaming. The problem is, one can’t easily slut-shame a seven-year-old, even when she becomes a grown woman. So it is open season on her memory, and her maternal family’s visibility.
Attacks on the Fallacious Farrows have been most pointed at the Guardian, where Suzanne Moore writes off the social-media discussion of the case as little more than an ill-informed “kangaroo court.” Michael Wolff suggests that the entire situation was manufactured by the Farrow family to improve their profile and return them to the ranks of Real Celebrity. Even Dylan’s first-person account, Wolff argues, is carefully crafted to appeal to famous young women who will provide public (read: impersonal) moral support in return for some undefined increase in their own popularity. He says that Dylan’s story has resurfaced at a convenient moment for the careers of her mother and brother, and that her allies are swayed by emotion rather than “outside facts” – whatever those would look like.
Neither journalist is accusing her of lying; they’re just telling her that her story only matters in its ability to make and break public lives.
All of these might well be valid philosophical points, and the extreme culturalist in me wants to acknowledge them. All the while, the extreme feminist in me is trying to scream louder than the cacophony of Allen defenders and Farrow detractors. BULLSHIT. BULL FUCKING SHIT, and please don’t pardon my language.
It is the same routine we see nearly every time anyone comes forward with their story of sexual assault. I haven’t yet heard anyone questioning Dylan’s own morals, but that is a small victory. Instead, they are calling out her mother’s family history, mental health, and public profile. They are finding fault in her brother’s recent ascent to the fishbowl. They can’t very well say that a seven-year-old child was drunk at the frat party, or call her a slut, so they reach for the other words so often used to dismiss women’s complaints and aim them at the people around her. Crazy. Manipulative. Fame-obsessed. Desperate.
It is another way to silence people we find inconvenient. When we insist that the voices in question are coming from people who are mentally unstable (in itself, another rant for another day), who have another agenda, or who are vindictive, their complaints can’t be legitimate, and we don’t have to listen to them. Telling the Farrows not to mar a supposedly brilliant director’s career because they are barmy third-rate celebrities – or, as it seems now, telling the Farrows that they are lying because barmy third-rate celebrities couldn’t possibly have real grievances against a supposedly brilliant director – is part of an old refrain. Don’t talk back to Holy Father. Don’t ruin the young man’s life. Don’t destroy his football (lacrosse, soccer, hockey, accounting) career. Don’t hurt your mother. Don’t you know what kind of pain this would bring upon your family? Don’t tell anyone, and if you do, nobody will believe you anyway.
It takes a good deal of courage to speak up. I imagine that it takes even more to do so knowing that every word will be subject to media scrutiny, amplified by Facebook and Twitter.
If you have a vagina, or your parts don’t conform to your soul, or you are brown, or you act in any way that the herd finds difficult, you probably know what I am talking about. (I am not saying that normative white men can’t understand this, because I know many normative white men who are compassionate, caring, and capable of great empathy… but they also tend to be aware that they’re playing with a stacked deck.) I am sure every person reading this has experienced a moment in which your words were taken with a heaping tablespoon of salt because you were _________ (insert adjective describing other-ness here). Those with vaginas and melanin and non-normative gender identities don’t get a free pass here, though, because we vagina-wearers and non-normative folks are often as guilty as anyone else of slut-shaming, crazy-calling, and manipulation-card-waving.
This is not about determining whether Woody Allen raped his daughter. There were two people in that room. There are no other witnesses. There are no “outside facts.” This is about allowing people to recount their stories of victimization (which is not the same as victimhood) AND TAKING THEM SERIOUSLY.
We could have a long academic conversation about memory and celebrity, or the relationship between narrative and political investment. We could interrogate the idea of consent. But today those things make me feel like we’re running around the problem and allowing its perpetuation. When we question Dylan Farrow’s narrative, we’re not just circling the wagons around the perpetrator. We’re telling another generation of women (and many men) to sit down, shut up, and hide their pain. Don’t ruin a beloved person’s life. Be a good girl and take it.
We can’t know what happened in that room. We don’t know what happened in a billion other rooms on a billion other days to billions of other people. But we can at least try to create a safe space for telling, because while silence is painful, pushing back against the crushing, relentless public doubt that greets a broken silence is much, much worse.
I cried when my mother died in the winter of 2006. I cried when my father died in the spring of 2012. I cried when Nelson Mandela died this week. I felt as if I had lost a close friend, a mentor, a member of my own family.
At first, I was puzzled by the intensity of my grief and my sense of loss. Mandela’s death, though sad, was hardly a shock. He was 95 years old. He had been gravely ill since last June, and had passed into a coma in July. Part of me hoped, as I did when each of my parents fell ill, that the great man would come through, that he would defy the odds, his age, and medicine, and make a full recovery. The world needed Madiba, and I could not imagine it without him. He didn’t, of course, and his death on Friday was simply the last page of a months-long denouement to an extraordinary life.
I never met Mandela. I watched his release on television in February 1990. Though I did not attend the rally in his honour at Jarry Park in Montreal the following summer, I listened to it on the radio and watched the extended coverage on the news. I never shook his hand. He was not a personal friend, a colleague, or a comrade. Yet, in so many ways, I felt closer to him than I do many of my friends, colleagues, and comrades.
Nelson Mandela has occupied a place at the centre of my politics and sense of justice since my father told me his story. I was in grade school, and I had been assigned South Africa for a United Nations day. The night before the exercise, my father sat me down and told me about apartheid, the Bantustans, the pass system, Sharpeville, Soweto, Stephen Biko, the African National Congress… and the long years Mandela had spent in prison because of his struggle for freedom and democracy. “You will be South Africa tomorrow; I know you will do the right thing.”
My first acts of activism were against the apartheid regime. I helped organize demonstrations in CEGEP; I stood along with thousands calling for Mandela’s release, and to protest the Canadian government’s continued engagement – like the uranium shipments from the port of Montreal – with Pretoria. I boycotted FBI orange juice, and any other product remotely tainted by its connections to the Rembrandt Group.
When I danced the high-step to the Special AKA’s “Nelson Mandela” in some long-forgotten punk club in 1985, or to Johnny Clegg and Savuka in the streets at the Montreal International Jazz Festival in 1988, it was with passion and determination. I was not dancing alone; I danced with my friends and comrades, and with Mandela in our imaginations. It was an ecstatic act of musical and political solidarity.
Madiba has always been there with me. I guess I thought he always would be. Now that he is gone, I feel the weight of his absence.
What puzzles me more, however, is the representation of Mandela in the news and social media over the last couple of days. I have had an eerie feeling that I have been watching the “Savage Curtain” episode from the 1969 season of Star Trek played out over and over again. That’s the episode where the ever-intrepid Kirk and Spock do battle with some of the worst villains of galactic history, aided by the two greatest paragons of justice and courage: Abraham Lincoln and the Vulcan philosopher Surak. They’re not really Lincoln and Surak, but mysterious doppelgangers; Spock insists on addressing the double as “Image of Surak.”
It seems as if so much of the media have been addressing Image of Mandela, rather than the man himself. In fairness, I recognize that the Mandela who I lost this week is as much an image as anyone else’s. Yet what puzzles me is how many – though not all – of the Mandelas depart from any reasonable reading of the man’s life and work.
The most common is the whitewashed or right-washed Mandela. This is the one on display in the American corporate media, shorn of his radicalism and revolutionary politics. Most of these reports emphasize his courage, strength and his insistence on peace and reconciliation in post-apartheid South Africa. These were essential parts of his politics and character, to be sure, but when his radicalism is mentioned at all – he was a life-long socialist, close to members of the South African Communist Party, and committed, even after his release, to the armed struggle – it is as an afterthought.
It is as if few people in the media want to ruin their celebration of the great man’s life by mentioning that he was on the US government’s terrorist watch list until 2008, remained an incisive critic of US imperialism, and was close to some of the great villains of the American Right’s worst fantasies. In fact, one of the few remotely-mainstream American commentators who has mentioned any of this is the repulsive reactionary gasbag David Horowitz – and then only to denounce him.
Rick Santorum has even gone so far as to enlist Mandela – a socialist and a vocal advocate for government-run, free, national health insurance – in his ongoing campaign against the Affordable Care Act. According to Santorum, his efforts to deny affordable health care to the majority of Americans are just like the great man’s resistance to the brutal, racist, genocidal, apartheid regime. You can’t make this stuff up.
More common is the kind of embroidered sampler sentimentality that bloomed all over Facebook, Twitter and other social media sources. We’ve seen this kind of thing before with Martin Luther King, Jr., Mohandas Gandhi, and the Dalai Lama – a focus on the kind of comforting, unchallenging platitudes that can be printed over a soft-focus photo on an Internet meme. It’s the kind of thing American and Canadian liberals go in for in a big way; quite like all those pictures of a gentle, smiling Louis Armstrong over the text “What a wonderful world it would be.”
In this, white liberals are able to domesticate Mandela – the revolutionary – like a fuzzy Lolcat in a profoundly racist dynamic. He has become, to so many doubtless well-intentioned people, a kind of “magical Negro.” This is a recurring media figure, closely related to the “Uncle” and “Mammy” of the blackface minstrel show, whose whole purpose is to nudge white characters toward spiritual salvation and reconciliation. He is virtually always portrayed as a wise, gentle, usually older Black man who is somehow in touch with a deeper, mystical reality. Think of the John Coffey character in The Green Mile, or Morgan Freeman in virtually anything. While it might give middle-class white people – like myself – the warm-and-fuzzies, and a momentary respite from interrogating our privilege, it robs a man like Mandela of his agency and his power.
It makes him safe.
On the other side, I have noted puzzling critique emerging from the radical Left. This first became apparent in an exchange on my Facebook wall where a friend – a committed activist whom I deeply respect and admire – suggested that, for all his radical efforts as a young man, the post-release Mandela was simply a “bourgeois pacifist” and what, in “Marxist circles,” might be called a “revisionist.”
I’m not so sure about Marxist circles; Communist circles, certainly. “Revisionism” suggests that there is an explicit and pure party line, and that to deviate from it in any way is a revision of the original intent. Considering that (a) Marx’s analysis and critique apply to a radically different form of capitalism than exists today and (b) he called for a “ruthless criticism of everything existing” including his own work, I would have to say that any “Marxist” who condemns anyone as a “revisionist” should go back and actually read Marx.
The most common Left criticism goes something like this: While we should respect Mandela’s revolutionary work up to his arrest and trial in 1962, the man who emerged from Victor Verster Prison on 11 February 1990 was no longer a revolutionary committed to the armed struggle. He had been tamed. His efforts to seek a peaceful transformation of South African society, his insistence on reconciliation with the White population and his willingness to engage with neo-liberal, global capitalism as president are all evidence of “capitulation.” The whites did not feel the force of African vengeance, nor did Mandela immediately transform South Africa into a workers’ paradise. Instead, he played the game of bourgeois liberal democracy and betrayed the Revolution.
Western/Northern/Euro-American (hereafter “Western”) radicals are impatient people. We want The Revolution to be made right now. We want to storm the barricades and, considering that we are virtually all bourgeois intellectuals, we can’t understand why the oppressed proletariat doesn’t rise up right now. We can’t understand why Mandela, a Marxist, allowed himself to be “coopted” and “capitulated” when he finally had the power of the state in his own hands.
This kind of criticism of Mandela is deeply colonialist. It presumes the universality of a Western revolutionary agenda rather than acknowledging that a revolution in another part of the world might address other kinds of issues and look quite a bit different. Mandela’s “failure” to live up to the revolutionary standards of comfortable bourgeois intellectuals like us in the United States and Canada is characterized as a betrayal.
I am sure that Mandela felt that he had some larger issues to contend with than sticking to the Western revolutionary playbook. Like maybe dismantling a century-old regime built on systematic racism and racial violence. This was a society where the minority White population held a monopoly on political and economic power. It was a place where non-whites did not have the right to vote, where Blacks could not legally own or manage businesses, where the education system spent ten times more on teaching white children than Black children. It was a place where Afrikaans writer Breyten Breytenbach was arrested and imprisoned, in part for violating miscegenation laws, when he returned to his country of birth with his Franco-Vietnamese bride in 1975. I suppose that, if you ignore all of that and more, you could argue that Mandela’s “failure” to bring about an immediate and complete social revolution might look like “capitulation.”
Could Mandela have acted differently? More “radically”? Maybe he could have expelled the entire white population, like Robert Mugabe did in Zimbabwe. We can see how well that turned out. Perhaps, when he spoke from the steps of the Cape Town city hall hours after his release, Mandela could have called for a mass uprising in the streets, and simply accepted that the hundreds of thousands who would inevitably have been slaughtered by Africa’s largest, best trained, and best-equipped army had died in the cause of revolution.
Perhaps, when elected president, he could have used the powers of his office and his phenomenal political capital to nationalize all industries and economically disenfranchise non-Black capitalists, as Mugabe is doing in Zimbabwe right now. He would have then had to accept the embargo that the Western democracies would inevitably have imposed. Millions would have starved, of course, and it would have been impossible for Mandela’s government to fund desperately-pressing programs to address basic needs like housing, food, and education.
Moreover, he could not have done any of this and expected any support in the South African Parliament. The Whites would have opposed it for sure, as would have the South Africans of Asian and South-Asian descent. And despite the fact that, in the romantic imagination of Western intellectual revolutionaries, all oppressed people are a single, undifferentiated mass with a unified hive mind and will, it wasn’t that way at all. Black voters and political leaders would not automatically have agreed. Mangosuthu Buthelezi’s Inkatha Freedom Party held 43 of the 400 seats in the 1994 Parliament, and they were never inclined to support Mandela. Even in the ANC itself, there was – and is – a vast range of opinions. It is highly unlikely that a majority could be mobilized to pass such legislation.
But what difference does democracy make to a true revolutionary? The good revolutionary knows that democracy is a sham designed to maintain bourgeois power.
It would be better to ask what democracy meant to a man who committed his life to “one-man-one-vote,” who was prepared to die for the principle (we should ask ourselves what we are prepared to die for), who served 27 years in prison for it. For Nelson Mandela, the simple idea that all people must have free and equal access to political expression and power was the first principle. Everything followed from there. If that was not secured, then nothing could be secured.
Joe Slovo, Mandela’s comrade, friend, the minister for housing in his government, and the General Secretary of the South African Communist Party, no less, said this in 1990:
“Our party’s programme holds firmly to a post-apartheid state which will guarantee all citizens the basic rights and freedoms of organisation, speech, thought, press, movement, residence, conscience and religion; full trade union rights for all workers including the right to strike, and one person one vote in free and democratic elections. These freedoms constitute the very essence of our national liberation and socialist objectives and they clearly imply political pluralism…
It follows that, in truly democratic conditions, it is perfectly legitimate and desirable for a party claiming to be the political instrument of the working class to attempt to lead its constituency in democratic contest for political power against other parties and groups representing other social forces. And if it wins, it must be constitutionally required, from time to time, to go back to the people for a renewed mandate. The alternative to this is self-perpetuating power with all its implications for corruption and dictatorship…”
As for Mandela’s pacifism, it beggars the imagination that anyone would think that it was easy for a man whose friends had been murdered, whose people had been confined in a social prison of racial segregation and forced labour, and who had spent 27 years in prison, to seek peace and reconciliation with the murderers and jailers. When Western radicals, as well-intentioned as we might be, speak of “bourgeois pacifism,” we articulate a notion that pacifism is an easy way out. A real revolutionary would stand and fight to the bloody death.
It is the infantile – and dare I say it, the bourgeois – fantasy of the Left, a Left of which I count myself a part, that resistance and revolution must be a moment of immediate apocalyptic reckoning, singed by flames and bathed in blood. That is the great romance in seminar rooms and drum circles. That’s how it looks from behind the gas mask and bandanna as we all line up – students, professors, hipsters and bloggers – with our fists raised and our banners flying, ready to take pepper spray in the face, a knock on the head, and a night in jail, to put our “bodies on the line” and fight the power. And sometimes that is how it has to be. But after we’re booked at the police station, or spend a night in jail, we get to go home to our classes, our jobs and our families afterward to lick our wounds.
Neither Mandela, nor any of the other activists of the ANC had anywhere else to go. There was no retreat. Mandela knew that the rubble left after an apocalyptic confrontation would be the rubble of his own home, in the largest sense of the word. He also knew that there could be no freedom for his people unless that included all the people – Black, “coloured,” and White – in South Africa.
Peace, reconciliation, democracy were neither a “capitulation,” nor an easy way out, nor a failure. They were the most difficult things in the world, and they were the only way to ensure that his personal freedom, and the freedom of his people, meant something.
Mandela did not storm the barricades and crush neo-liberal neo-colonialism. He did not establish a worker’s paradise-on-the-Cape. South Africa today remains a troubled land, in many ways, and it seems that, to some people, that is an indictment of Mandela’s “failure.” But that ignores something that the great man knew better than almost anyone: that revolutionary transformation is a process and not an event, that the path to change is rarely short and straight. “I have walked that long road to freedom,” he wrote in his memoirs. “I have tried not to falter; I have made missteps along the way. But I have discovered the secret that after climbing a great hill, one only finds that there are many more hills to climb.” South Africa has a long way to go, but thanks to Mandela, it is on its way.
Despite all the soft-focus samplers and aphoristic remembrances, he was not a saint, but a practical man deeply committed to a political ideal. Shortly after Gandhi was assassinated in 1948, George Orwell wrote, “No doubt alcohol, tobacco, and so forth, are things that a saint must avoid, but sainthood is also a thing that human beings must avoid.” Orwell, ever prickly, had no stomach for Gandhi the saint, but he expressed deep admiration for the human being.
Although I know that my Image of Mandela is as problematic as anyone else’s, that I never knew the man apart from his mark on history, that is how I choose to remember him: as a human being. But what a human being! He was a great man, the greatest I have ever known. I feel privileged to have lived in his times. If all of our leaders – if all of us – were as committed to peace and decency, as intolerant of oppression, as courageous, as principled – as human – then we would not need people like Mandela to change the world.
Thank you, Madiba.
I had to take a moment to digest that idea. Photography has been an essential part of the newspaper business since the New York Daily Graphic ran the first half-tone photo reproduction (of New York’s Steinway Hall) on its front page in 1873. William Randolph Hearst apocryphally sent his photographers and illustrators to Cuba in 1898 with the message “you supply the pictures, and I’ll supply the war.” For more than a century, photography and journalism have been virtually one and the same. This was a shock!
Yet, in another breath, it is not such a shock. The photographers of the 19th century – like Alexander Gardner and Mathew Brady – and the 20th century – Robert Capa, Weegee and many, many others – were highly-trained craftsmen in the fullest sense of the term. Photography was an enormously difficult and very costly calling. Photographers had specialist knowledge in artistic composition, optics and chemistry, and had to make an investment in expensive equipment. A Graflex 4×5 Speed Graphic – the standard-issue photojournalist camera of the time – sold for more than $300 in the mid-1930s. That would be about $5,000 in current coin.
Today, everyone has a camera. They’re part of every smart-and-dumb-phone, computer, tablet and iPod. And they’re ridiculously easy to use. You point and shoot; there’s no need to focus, take a light meter reading, or stand over an enlarger easel in a stuffy darkroom choking on chemical fumes. Few people, even with the priciest pro-sumer or professional equipment, can produce images as striking, beautiful or illustrative as the work of Dorothea Lange, but – brace yourself for this – it doesn’t matter anymore, not even for journalism.
The same kind of thing happened to the craft of typesetting. There was a time when typesetters and compositors were the aristocrats of newspaper production. They served years of apprenticeship before finally being allowed to lay hands on the linotype machines that produced newspaper galleys out of hot lead. Then along came computers, laser printers and PageMaker, Quark and – gasp! – Microsoft Publisher. Any moron who could type, could set type. (In the interests of full-disclosure, I worked in a typesetting shop during the transition to desktop publishing. I’m still bitter.)
It didn’t take newspaper publishers long to realize that they could do without those annoying, uppity, typesetters and compositors who had the most powerful and assertive union in the business – the ITU – and always seemed to be threatening to strike. If reporters were typing their copy into computers anyway, and any moron could set the type, then why not just let the editors and a tiny remnant cadre of graphics professionals do the typesetting and composition? (You might detect a note of bitterness there.) Whole composition departments disappeared virtually overnight.
Newspaper text started to look like crap to those of us who knew something about type, and layouts became almost universally modular, boring, and pretty much identical to those of every other newspaper out there. No one noticed because, as the newspaper business began to contract in earnest in the 1990s, most markets were left with a single daily and nothing to compare it to except the New York Times and USA Today. With less money to be made in the news business – which became increasingly about the business rather than the news as a result – you didn’t need your type or design to be good, just good enough.
Reporters were next. Once most of the production staff had been made redundant, journalists found themselves competing for fewer and fewer full-time jobs. Journalism schools and communications studies programs were churning out – and continue to churn out – many, many thousands of eager, under-employed graduates annually, each one with fantastical ambitions of being the next Upton Sinclair or Ida Tarbell. They provided a competitive pool of “talent” (the “reserve army of journalism,” to cop a phrase from Karl Marx) that would work for peanuts or less, for a “special to” byline. Look at your daily newspaper: the vast majority of news stories that don’t come from AP or Reuters are probably the work of freelancers.
Readers who cared about such things noticed the drop in journalistic standards, but most readers didn’t seem to care. Newspaper addicts were – and are – a dying breed, as news consumers gravitated to different sources, where the quality of the prose was much less important than the immediacy of the raw information. It doesn’t have to be good, just good enough. News consumers shifted first to radio and television in the 60s and 70s and, more recently, the Internet. Newspaper and magazine circulation figures have been in a free-fall ever since.
Now it’s the photographers’ turn – and why not? If anyone can snap a picture… if no one has to develop it… if most news consumers are going to see it as a 72-ppi thumbnail, anyway, then why does the Sun-Times, or any newspaper for that matter, need to keep photographers on staff? In an era of declining circulation, shrinking ad revenues and intensifying competition for consumers, full-time professional photographers – like typesetters, reporters and editors before them – have become an unnecessary extravagance. Images can be sourced from readers themselves, and there’s no reason why freelance writers can’t take their own pictures when they write their articles. For free. Like in “freelancer.”
They don’t have to be good pictures, they just have to be good enough.
Most journalists know the anecdote of the sign which reads “Fresh Fish Sold Here Today.” We heard it in journalism school, or from our editors as we were just starting out. It is an object lesson in literary economy, in which unnecessary text is whittled away to the bare essentials. The joke – and there is always a joke in journalism anecdotes – is that, shorn of inessential verbiage, the sign is ultimately left blank.
This is what has happened to the news business. Whittled to the bare essentials, newspaper newsrooms will soon be left empty.
For those of us who have worked or, in rare cases, continue to work in the newspaper business, the prospect is deeply depressing. We were drawn into the work at least in part by the romance of The Front Page and All the President’s Men and we justified the ridiculous hours and crummy pay by telling ourselves that a free press is essential to democracy, and we were essential to a free press. We were “serving the public interest!” Even those of us who left, or were pushed out of, the profession still feel connected to it. Once an ink-stained wretch, always an ink-stained wretch.
Part of that identity is tied into a powerful declension narrative. “The business,” we say, “ain’t what it used to be.” Our editors and other old-timers told us that on our first day in the newsroom, just as their editors did before them. And we have repeated the mantra ourselves… to the dwindling handful of young journalists who come after us. I’m sure Joseph Addison said it to his cousin Eustace Budgell when the latter took over The Spectator in 1714.
The profession of journalism and the news business inhabit a perpetual golden age in our imaginations. It is a time when journalists were respected, when our work mattered, when people actually read newspapers, when publishers cared about journalism and not just the bottom line – or, in any event, were restrained by the heroic efforts of reporters and editors (invariably fortified by shots of whiskey from the bottle in the lower left desk drawer). It was the time when we were just starting out… or just before… or at the peak of our careers. Whatever the case, we say, that time has passed, and the news business has gone to shit.
And that’s the point. The news business has been going to shit since Pheidippides dropped dead on the steps of the Parthenon. It has changed, and continued to change throughout its history, and it will change some more. Horace Greeley could never have imagined Edward Murrow’s See it Now, and if he had, he would probably have considered it a travesty. And Murrow himself, shortly after See it Now was canceled in 1958, expressed horror at the medium he helped to shape.
“One of the basic troubles with radio and television news is that both instruments have grown up as an incompatible combination of show business, advertising and news,” Murrow told the Radio-Television News Directors Association. “Each of the three is a rather bizarre and demanding profession. And when you get all three under one roof, the dust never settles.” Indeed, the RTNDA is now the RTDNA, Radio-Television Digital News Association. “The only constant in news is change,” an editor once told me. “That’s why we called it news, shithead.”
So, while I feel nothing but sympathy for the photographers who have lost their jobs at the Sun-Times, and the ones who will probably follow at other newspapers, I can also see that this was probably inevitable. While I cringe at the pathetic quality of much of the prose I read in newspapers and online, and shudder at the thought of what newspapers will look like with amateur snapshots and video frame captures on the front page, I know I can look elsewhere. The world is full of news, it just takes more effort to find than it used to.
Part of me wonders if maybe all of this is a good thing. Maybe we are on the cusp of a new paradigm, when journalism will finally be liberated from the news business. It’s a thought that I have been musing on for years. I first addressed the question of the future of journalism and the challenge posed by new media more than a decade ago, when I was teaching journalism at Concordia University in Montreal.
I gave a talk on “Journalism in an Age of Participatory News” in January 2003. The challenges were already there – no longer distant specks beyond the horizon, but clearly in the offing. But as much as new practices and technologies posed a threat to the news business, I saw then – and still see – a great opportunity for reporters and journalists of all kinds to better serve the public interest. “To do this, we have to readjust our understanding of the news, to see it as the protean thing that it is, a chorus of many, contradictory voices in a complex counterpoint,” I concluded. “We also have to surrender some of our control over making the news, to allow its participants to tell their own stories and news consumers to construct their own narratives. But I think we implicitly accept that idea when we argue that the public interest is best served by a multiplicity of news sources.”
In the wake of the slow decline and agonizing death of the news business, I still hold great hope for the news.
I have been reading the comments sections of CBC News reports of the ongoing Idle No More protests in Canada lately. It hasn’t been easy.
Before I moved to the United States in the summer of 2005 (to pursue a PhD in history at Rutgers University), I was one of those insufferable, sanctimonious Canadians who viewed Americans as our energetic, creative, though somewhat pathetic, immature and problematic cousins south of the World’s Longest Undefended Border. Sure, the United States pretty much runs the world. Sure, the best movies and TV shows (remember Coming Up Rosie? King of Kensington?) were all American. Sure, rock and roll, jazz and baseball were all American innovations…
… At least we Canadians were more civilized, tolerant and peaceful than the Yankees. (And we had hockey, but more on that some other time.)
Canadians have long presented themselves as the kinder, gentler Americans. We are the peace-loving honest brokers whom everyone loves when we’re backpacking across Europe (maple-leaf flags on our backpacks); who hug trees and cuddle bunnies and admire David Suzuki; who don’t invade other countries but invented UN Peacekeeping, and never pursued a policy of genocide against aboriginal people (“you call them ‘Indians,’ we call them ‘First Nations'”). American liberals have, for the most part, bought into the story for years. They look North to our national healthcare system, relatively uncontroversial abortion rights, and universal marriage equality as models for a more civilized (we’d like to think more Canadian) America.
Canadians are civilized and peaceful because Americans are not. That’s what the Molson Canadian ad told us. Canadians are intelligent, thoughtful and well-informed because Americans are not. That’s what Rick Mercer told us in Talking to Americans.
However, one need only look at the CBC News comments to see that there is a sharp disconnect between the representation and the reality. The level and virulence of anti-First Nations racism on display are truly astonishing – even worse than the Tea Party crypto-fascism that CBC viewers and readers spewed all over reports of the Quebec student protests last winter and spring.
It’s really quite dismaying, but it’s not surprising. Even when triumphal Canadian exceptionalism was one of the most dominant elements of my national identity, I always had a sinking feeling – that inchoate, inarticulate queasiness that you try to ignore – that it really wasn’t quite so.
It was easy to ignore. My circle of friends and colleagues in Canada was made up mostly of well-educated, progressively-minded academics and journalists. They all fit the representation – intelligent, tolerant, queer, queer-friendly, feminist, lefties – so I never had to interrogate the discontinuity between representation and reality. Then I moved to the United States. I married an American. My Canadian exceptionalism had been constructed on my knowledge of the United States as the anti-Canada – the land of guns, violence, racism and imperialism. Yet I see none of that among my circle of American friends and colleagues, who are mostly well-educated, progressively-minded academics (and a few journalists). They are all intelligent, tolerant, queer, queer-friendly, feminist, lefties, none of whom fit the representation of the American Other.
Something had to give… And it did.
At no time has this been more apparent than during the last few months. Most of my Canadian friends, and much of the Canadian media, have been rhetorically going to town over the agonizing gun-control debate now raging in the US. The narratives are complex, but they mostly boil down to a frequently-expressed sentiment that can be characterized as: “We (Canadians) have gun control, and are therefore civilized, while they (Americans) do not, which is evidence of their lack of civilized rationality.” Setting aside for the moment that it’s not quite true – Canadian gun control laws are not that much more restrictive than those in the US, and the sitting Canadian government is in the process of dismantling what does exist – it does reflect the persistence and pervasiveness of the Canadian myths of the United States and American society that are so central to Canadian exceptionalism.
These narratives were crystallized in a post that one of my Canadian friends made on Facebook yesterday. I have nothing but the greatest respect for Pete. He is a brilliant journalist, editor, blogger, comedian and commentator on the foibles of Quebec politics. In the thirty years that I have known him, I have learned a great deal from him about journalism, the craft of writing (he was one of my first editors) and progressive politics. I find that we agree on most political questions and, even in those rare instances in which we don’t, I am virtually always impressed by his humanity, compassion and clear-headedness.
But yesterday, he posted the image of a T-shirt with a logo that reads “CANADA: living the American Dream without the violence since 1867.” I protested. Quite apart from its factual inaccuracy – I’m sure the Metis who were violently suppressed in 1870 and 1885, the Inuit and First Nations people who were forcibly relocated, kidnapped and controlled from Grise Fiord to Alert Bay, and many, many other Canadians might have something to say about the violence in Canadian history – I found its smug exceptionalism frankly offensive.
Pete responded by posting links to recent opinion polls that have found that a disproportionate number of American respondents believe that President Barack Obama was not born in the US and that Saddam Hussein was stockpiling weapons of mass destruction. Hell, Pete noted, most Americans believe that they won the War of 1812!
Though not directly germane to the question of violence, the polls reinforced the notion that Americans are dumber than Canadians – and thus presumably culturally inferior – which, I presume, explains their inability to be as rational and civilized as their northern neighbours.
One of the problems with polls is that they only provide answers to the questions you ask. And because polls are expensive and resource-intensive to conduct, which questions get asked depends on the agendas of the people who commission them. I suspect that’s why there aren’t any polls (that I’ve seen) that ask Canadians about their opinions of the US President’s background or whether Saddam Hussein had weapons of mass destruction. These are not significant questions in Canadian public and political culture. Consequently, mobilizing poll results about what Americans believe about President Obama’s birth or the justifications for the American invasion of Iraq as evidence of Americans’ ignorance or violent proclivities relative to Canadians’ is specious.
There are, however, some interesting polls about Canada, like the recent Ipsos Reid poll that indicates that the vast majority of Canadians (something like 2/3) believe that the Canadian government spends too much on the First Nations, and that aboriginal Canadians are “treated well.” The obvious corollary, since many aboriginal Canadians are currently involved in a political struggle with the Canadian government over treaty obligations and land claims, is that the vast majority of Canadians believe that First Nations people are spoiled, coddled malcontents. This certainly seems to be confirmed by the comments sections of CBC reports on Idle No More.
So… One could conclude from this that “Canadians are racists,” certainly with regard to First Nations people.
One could argue, in fact, that Canadians are a deeply racist people. I remember a poll from a few years ago that found that a majority of Quebecois – and about half of all Canadians – admitted to being racist. It is certainly possible to narrate Canadian history that way. Canadian Chinese exclusion policies (like the Chinese Immigration Act of 1923) were far more restrictive and explicitly racist than American policies of the time (see the National Origins Act of 1924). The Canadian government began the internment of Japanese Canadians a full month before the US government imposed internment on Japanese Americans in 1942. Indeed, the internment of Japanese Canadians was, if anything, more restrictive and more explicitly racist than the internment of Japanese Americans, and the conditions in the internment camps were far worse.
And there are many other ways that we can compare the racism of Canadian society to the United States:
1. There has never been a non-white Prime Minister of Canada. The United States has elected an African-American president twice. There has never been a Black premier of a Canadian province; the first African-American governor in the US was P.B.S. Pinchback in Louisiana in 1872. There has never been a non-white Canadian in a senior cabinet position, while Patricia Roberts Harris served as Secretary of Housing and Urban Development in the US in the 1970s and, more recently, Colin Powell and Condoleezza Rice served as Secretary of State, arguably the most senior cabinet post.
2. Canadian historical memory has never admitted – let alone come to terms with – the sad history of slavery in Canada. Canadians prefer to represent their history as one of multicultural benevolence. The mythology of the Underground Railroad elides the narratives of Canadian slaves. (My ancestor owned slaves in the US and brought them with him when he decamped to Halifax during the American Revolution.) Sure, slavery was abolished in Canada in the 1830s (a generation or more after it had been abolished in most of the northern United States), and it was never as pervasive as it was in the southern United States. But it’s something we never talk about. I can’t teach American history without discussing slavery or racial politics, but the topic never came up during my education – grade school, high school, CEGEP, BA and MA – in Canada. The last segregated school in Canada only closed in 1983, nearly thirty years after the Brown v. Board of Education decision in the US.
3. The Canadian Immigration Act of 1910 was adopted 14 years before the US National Origins Act, and was, unlike the latter, explicitly racist. It formed the basis of the so-called “White Canada Policy” that prevented the admission of Jewish refugees from Europe during the Holocaust.
4. There was a higher rate of membership in the Ku Klux Klan in Saskatchewan in the 1920s than anywhere in the United States during its entire history.
If you look at the polls and the historical record, you could easily conclude that Canada is a far more – and unrepentantly – racist society than the United States. Hell, if you look at the polls, you might even conclude that Canadians – 75% of whom have no idea who their head of state is – are also a bunch of politically-ignorant simpletons.
But that’s not my point. My point is that it’s all bullshit. Polls are interesting, and useful for journalists eager to fill in the space between advertisements (it’s a tough job, but someone has to do it) but they don’t really say anything outside of the very narrow parameters of the questions they ask, who asks the questions and who answers the questions.
My point is also that constructing exceptionalist narratives like “living the American Dream without the violence since 1867” is a specious project. One could produce similar anti-Canadian, American-exceptionalist narratives just as well. How about, “America – yeah, we have a gun problem, but at least our murderers aren’t perverse sexual predators who take pleasure in raping, mutilating and dismembering their victims – including their own sisters! – before burying their remains in pig farms like those sick Canadians.”
I have to say… I find the kind of sanctimonious Canadian exceptionalism that has been on display lately pretty offensive. I understand the Canadian knee-jerk reaction; the desire to produce a more humane and civilized Canadian self-representation opposed to the great Other south of the border. It’s the consolation prize for Canadian insecurity, the Canadian inferiority complex and the recognition that, in the final analysis, Canada is largely impotent in global geopolitics and economically little more than a vassal branch-plant of the American Empire. I get it.
But I think that informed, educated people on both sides of the border need to step back a bit and resist the urge to trade in this kind of frankly chauvinistic exceptionalist rubbish.
[I wrote this several months ago, as I was preparing to leave Paris. I had only accomplished a fraction of what I had planned to do during my year abroad, but the imminent digitization of many of my archives promised to make my academic life easier once I returned to the United States. It will be a convenient way to conduct research, but the prospect of working that way makes me unspeakably sad. I love my field in part because I get to interact with old things. Digital collections, for all their accessibility, just aren’t the same.]
I’m sitting in an archive. Today is one of those rare days when everything works – I arrived on time, the building was open, my documents were available. I have not had many of those days, here in France.
My documents are spread out in front of me, on a too-small table, threatening to crowd out the elderly researcher to my left. He’ll do the same to me as soon as I move.
With these pages, bits of information, figures, photos, and other flat things, I am trying to create a three-dimensional, living world. Some of the information jumps off the page – things related to China. Porcelain. Lacquer. Sometimes I go back to look over what I’ve read before, and the characters are dead again. I don’t always know how to revive them. Sometimes they haunt me. Maybe I haunt them.
Someone’s short brown hair is in the fold, almost one with the binding. I wonder how long it has been there, whether it is a relic or just the body-print of another researcher.
I suppose that’s a relic, too.
The document is a palimpsest of bodily leavings.
When they digitize it, I won’t see the hair.
The old-man smell, too, will disappear. I’ll forget that real people touched the paper. The dust won’t invade my nostrils.
Someone – a student, an overworked librarian – will scan it, too quickly, and fail to notice the black blotches and white patches that even technology can’t rectify. I will see it and skip over those parts. I’ll mourn their loss, but nobody will re-scan the original, because it will be “incommunicable,” locked in a temperature-controlled vault so that it can’t decompose too quickly.
The democratic, easily-accessible version won’t be legible.
Much of what I work with is illegible anyway. There are days when I can decipher one out of every hundred words, and when I come to the moment when the secretary changes, when suddenly the letters are shaped carefully and lovingly, as if to call my attention, I want to hug this being with the lovely handwriting.
Monsieur De La Tour had good penmanship. Monsieur Edan did not. Guess whose work I’ll be using more frequently.
I’m collecting pressed flowers. There is nothing essential here. Gather the dust, add water, shape it as I like, knowing it might fall apart, disintegrate, blow away. Sometimes the documents themselves do. The archivist looks at me suspiciously when I return the book with its cover half gone and the pages crumbling. This is partly why they’re digitizing everything, to keep my American hands from destroying French patrimony.
It’s a strange irony – I’m trying to preserve this, or at least tell a story with it, and I’m killing it in the process.
Social networking is a powerful medium. It has allowed me to reconnect with old friends with whom I had lost touch, even though we would never, or would rarely, have the opportunity to encounter each other in “real life.” It has also allowed me to connect with people whom I have never met but who, through the networks of friendship and acquaintance, I have found both delightful and interesting.
However, social networking friendships can be quite unlike off-line friendships in many ways, and I am always a little uneasy about calling my Facebook contacts “friends.” Our affinities are sometimes superficial and, even when they go deeper, I find that the modalities of social networking are such that we are sometimes willing to tolerate differences that would seriously strain what most of us would consider “friendship.”
I’m sure that most of us have, at some point in our lives, discovered that someone for whom we have great affection and respect holds opinions and convictions that we consider repulsive and reprehensible; not disagreements on religion, philosophy, politics or taste that we recognize as reasonable opinions, but serious differences. For example: think of the coworker who has always been kind and friendly, but who has deep homophobic views, or the relative who lets slip a bigoted comment at Thanksgiving, which you later learn is only the tip of a racist iceberg.
This is a profound ethical problem. Is it possible to call someone who holds repulsive beliefs a friend? Even if that person is otherwise a decent, caring person? What is our responsibility; does our personal relationship legitimize our friends’ beliefs?
Like all ethical questions, these ones are not easy to solve and, for the most part, it’s common enough to find ways to simply not solve them by avoiding the issue: You don’t invite your racist friend to a dinner party where there will be Black or Asian guests. You don’t watch a John Waters movie with your homophobic cousin. You don’t talk politics with your reactionary uncle. Everyone I know has done this at some point. There might be an initial moment in which you assume your friend was just “kidding” or mistaken. Then, confident that reason will work, you try to engage with your friend to convince him that he’s wrong.
Hell, sometimes it even works. But not always – in fact, rarely. And then you are confronted with a choice: do you find a way to tolerate your friend’s intolerable opinions, making excuses along the way, do you minimize contact with your friend so you don’t have to think about it, or do you end the friendship? This can be an extremely thorny situation in “real life” because our connections are usually more than superficial. Your coworker is not going to go away, and your cousin will still turn up at Thanksgiving. Moreover, because of the complexity of our (real) social networks, how you answer the question will inevitably have consequences in your relationships with mutual friends.
At the same time, we are ethically responsible for our friends’ beliefs. Our endorsement of our friends legitimizes them in the social gaze: “If Fred is a staunch anti-racist, but he’s friends with Bob, who always makes those comments about ‘kikes,’ then maybe Bob isn’t so bad.” We add our social capital not only to our friends, but also to their beliefs.
Moreover, by failing to confront or denounce repulsive beliefs we legitimize the false notion that everything is equal in the marketplace of ideas and that it all comes down to a question of personal preference. I’m not comfortable with that. While I can accept that which sports team you support or which political candidate you endorse might be a question of personal preference, I’m not willing to extend that to all other ideas.
At the end of the day, no one is hurt if you prefer the Yankees over the Mets and I won’t be offended if you take a little dig at me if I’m a Mets fan. (After all, Mets fans are used to it.) But that is not the case if you think that people with dark skin are inferior, that people who love differently are abominations who do not deserve rights, that women need to keep in their place, or that it is ever, ever okay to inflict violence on another human being. These are beliefs with very real human consequences and we are responsible for them through our relationships.
Social networking simultaneously diminishes and amplifies the ethical problem. On one hand, most of us recognize that our online “friends” are somehow not quite the same kind of friends as those we have offline. I have something like 300 “friends” on Facebook; I am fairly sure that I could not list 300 friends that I have in real life, even if the majority of them are on Facebook. And of those Facebook “friends,” I only ever pay attention to what a couple of dozen of them are doing and saying. Every now and then I look at my “friends” list and marvel at how many of them are people I don’t really know that well at all.
Yet, they are “friends” on Facebook, with my explicit endorsement. Their “friends” – particularly ones that we do not share – can connect them to me simply by paging through their “friends” lists or reading my occasional comments on their Timelines, or theirs on mine. Sometimes, this leads to serendipitous connections, like when a “friend” of a “friend” discovers that we have certain affinities.
However, there is also peril, because a billion Facebook users can see that I have a “friend” who believes that Barack Obama is a communist, or that global warming is a conspiracy and it’s okay to foul the environment, or that it is perfectly legitimate to inflict violence on people you don’t like… and then conclude that I’m okay with it.
But I’m not. And if I’m not okay with it, I have an ethical responsibility to communicate that and refuse to legitimize ideas that I find abhorrent. So I have pruned my “friends” list. This is a lot more difficult than it sounds, because I have considerable affection for some of the people I am cutting off. I also know that there will be consequences – hurt feelings and insult – and I do feel great regret for hurting my erstwhile Facebook “friends’” feelings – particularly because some of them are also friends in the real world.
And I don’t doubt that some “friends” will return the favour because they find my atheist, socialist, pacifist, postmodernist, postcolonialist, queer beliefs just as offensive. They are entitled to do so, because they should not have to call someone who has beliefs they find repulsive a friend.