In the first season of the Showtime psychological drama Homeland, audiences meet Carrie, a bipolar CIA agent whose investigation into the loyalties of Brody, a former prisoner of war, is called into question throughout the season. Is she paranoid? Insane? We get a glimpse of antipsychotic medication from the get-go. Although she has hidden her pills in an aspirin bottle, she is outed when an operative with a headache finds it.
Unlike the sensationalism with which Six Feet Under handled brother Billy’s bipolar disorder, Homeland feels more accurate—intensity that bothers other people, sleeplessness, untethering of signs from meaning, rapidity of speech, flights of thought, heaviness, sorrow, and the slow, thick movements of the depressed person who realizes she needs correction to function in society.
By the last episode, the audience knows that Carrie’s not crazy. Sadly, she doesn’t.
So, desperate to correct her bipolar disorder and revive the life it has wrecked, Carrie opts for ECT. As she goes under, she forgets the critical evidence linking Brody to the terrorist Abu Nazir (evidence that would have proven she was not crazy).
Homeland is the best paranoid fantasy I’ve ever seen, a suspenseful, well-plotted narrative interpretation that seems to say, “you’re not paranoid if they’re really out to get you.” It is also one of the best depictions of bipolar disorder I’ve seen onscreen, the brilliance savaged by uncontrollable emotion and fear. I say this as somebody whose break with reality at age twenty-seven was similar, even though it was not relevant to a larger narrative, the question of national security.
Nobody understands the mechanism by which electroconvulsive therapy works. It erases memories, a narrative device that Homeland takes full advantage of. It’s a way to start over, to wipe the slate clean. I wonder if erasing memories is merely a side effect of ECT, or an integral part of the remaking of oneself that occurs when trying to recover from a deeply disquieting psychotic episode.
The concept of the magic bullet was developed by Paul Ehrlich, a German physician and scientist who died in 1915. He believed it should be possible to create a compound that selectively targeted a disease-causing organism and delivered a toxin to kill it.
If you’re a psychiatric patient resistant to the use of pharmaceutical drugs, you’ll soon discover that most psychiatrists require you to be on medication as part of treatment. Psychiatric drugs are the answer almost all the time. “If you don’t like the way we do it here, you can find another doctor,” said a prominent psychiatrist at Stanford to a patient concerned about weight gain, citing the story of a patient who had been functioning well on lithium, went off it, and a year later was on the roof of the hospital, wild-haired and ready to jump.
Antipsychotics treat the entire person as a disease-causing organism. They select the entire person for regeneration or for destruction. For as many people as they help to function in society, they sedate, squash, and destroy the spirit of others. As far back as 1979, the American psychiatric community was aware of a WHO study showing that people who suffered a schizophrenic break in India, Nigeria, or Colombia had a higher chance of fully recovering than those who suffered the same in the United States.
In his book Mad in America, Robert Whitaker argues that these very different countries with good rates of recovery are different from the United States in one important respect—the medical care. The poor countries can’t afford those magic bullets, the psychiatric drugs that American culture demands you get on if you want to be respectable.
Of course, it’s possible that the sampling in other countries is skewed. In India, anyway, there is less awareness of mental illness as a physical condition. Dualism is taken for granted. Mental illness is a soul sickness. And people hide it, at least from people outside their families.
Organizations in America to reduce stigma have done incredible work, creating a kind of water cooler around mental illness, a space in which it’s mostly safe to out yourself… unless of course you actually want to be one of the leaders of the fight against stigma, a voice in the debate about how mental illness is treated and talked about.
“Well, it’s not really a disease,” noted one kind-hearted, elderly psychiatrist who was, unbeknownst to him, in mixed company at a meeting for one of these organizations. “But it’s what we need to call it.” It’s the only way insurers will pay for the expensive treatments available.
In November 2004, Proposition 63 was passed, allowing the California Department of Mental Health to direct increased funding to county mental health programs. In Santa Clara County, where I live, the most vocal people at the Mental Health Stakeholders’ meetings are family members. I don’t feel like anything gets accomplished at these debates about how the Proposition 63 money is allocated, but evidently things get decided eventually. If you go to these meetings, everyone is as nice as can be.
But, it’s dumbfounding to me how much the discussion is dominated by family members as stakeholders, rather than “consumers” of mental health services. The consumers do speak, but there’s a subtle layer of condescension if they announce themselves as consumers. If a family member speaks, however, he or she gets unbridled sympathy. The participation levels highlight how many of the leaders in the movement to remove stigma from mental illness are family members of people with illnesses (or claim to be family members), rather than the individuals themselves.
Perhaps I am wrong, perhaps these are people in hiding, people for whom the stigma is so significant and painful and personal that they imagine everyone else holds the same kind of stigma around the topic. And therefore they don’t out themselves, but focus instead on telling a story about a family member.
But I don’t think I’m wrong. The assumption is that people with mental illnesses are voiceless, can’t speak for themselves in a way that is reliable, in a way that other people want to hear or be led by. People want to hear stories of mental illness, but they don’t want to hear it from the people on the frontlines, the ones being devastated. Those, apparently, are too depressing.
The same bias operates in the publishing industry. Readers eat up stories that come from a sufferer like Kay Redfield Jamison who is also an authority figure, or a celebrity like Catherine Zeta-Jones, but they don’t want the story of the man sitting next to them on the bus who shifts in and out of reality as he speaks. Readers love the well-crafted, newsy piece written by a sister or father quietly or poignantly observing what schizophrenia looks like from the outside. Or the finely observed novel written by someone who has worked among the mentally ill.
Don’t get me wrong. There is wisdom and beauty that emerges from the stories of people who are exposed to mental illness, but do not suffer it. But those aren’t the only stories, the only truths.
Right now, the medical model is on the rise, and very few people talk about the existential aspects of mood disorders and schizophrenia. As the controversial psychiatrist R.D. Laing wrote in The Politics of Experience, “Existential thinking offers no security, no home for the homeless. It addresses no one except you and me. It finds its validation when, across the gulf of our idioms and styles, our mistakes, errings and perversities, we find in the other’s communication an experience of relationship established, lost, destroyed, or regained. We hope to share the experience of a relationship, but the only honest beginning, or even end, may be to share the experience of its absence.”
People expect those who suffer to submit to the embarrassment and pain of correction, not to muddy the literary pool with their more difficult writings, their experimentation, their grief. Be happy! the television ads for atypical antipsychotics promise us. Never mind the footnote: the nausea, the tremendous weight gain, the dizziness, the anxiety, the cognitive dulling, or the akathisia that makes you want to run to Alaska or New Orleans, whichever is farther, just to get away from yourself.
Pathology can be competitive. In a psychiatric hospital, unlike in real life, the key to socializing is to suffer more intensely than the person next to you. You usually don’t talk about the news or the last book you read. You talk about diagnosis, about pills, about the trustworthiness of the doctors (by and large, they aren’t seen as trustworthy, and this perception, too, is pathologized).
Sometimes diagnosis is how the intensity of suffering is measured. So, for example, the person branded with Bipolar I, who careens between mania and depression with layovers of psychosis, is higher in the hospital hierarchy than the person with garden-variety depression or even the person who has been diagnosed with Bipolar II who may suffer from severe depression, but only hits the mild high of hypomania. The people in the lockdown ward are more hardcore than those in the “voluntary” ward. Being hospitalized marks you as a more serious case than a person who has never been hospitalized.
It’s the patients who’ve had electroconvulsive therapy who are given the most credibility in the psych wards. Being on Prozac or Wellbutrin hardly registers these days, but if you’ve experienced electroconvulsive therapy, you win as far as being considered fucked up goes. While many patients guard this experience outside the psych ward—it’s not something you slip into casual conversation—inside the hospital it’s a different story.
I met a young woman once who had lost a couple of her kids briefly because of her illness. “Electroshock’s better than Seroquel or Abilify,” she said. “It works for a few years and then you need it again.” This in spite of the hair loss, the memory loss, the loss of dignity. (You don’t get fat from ECT.)
Perhaps the most troubling aspect of our society’s emphasis on correction over empathy and tolerance is its concurrent fascination with the life of the artist/writer suicide—the person who suffered so much, he or she stopped talking and just did it. Earlier this year, in the hubbub surrounding what would have been David Foster Wallace’s fiftieth birthday, Jonathan Franzen pointed out that Wallace was his friend, and that there were people who’d only heard Wallace’s commencement speech, rather than read his books or talked to him in person, who were now characterizing him as a saint.
In one essay, Franzen wrote, “I will pass over the question of diagnosis (it’s possible he was not simply depressive) and the question of how such a beautiful human being had come by such vividly intimate knowledge of the thoughts of hideous men.”
Surely an essay capitalizing on his friendship with another writer who committed suicide and intimating, without offering any evidence, that his friend was one of those hideous men must have struck the author of The Corrections as being just as suspect in its motivations as the belated attention of those people who paid Wallace’s work no attention during his lifetime. Or not.
We’re coming up on the fiftieth anniversary of Sylvia Plath’s death, February 11, 1963. As most readers know, on that day she put her two children in a bedroom, stuffed a towel under the door, and stuck her head in an oven. This was about a month after Knopf turned down The Bell Jar saying that following Esther’s breakdown, “the story ceases to be a novel and becomes a case history.”
Just under fifty years later, Hollywood created a glossy fiction about her, a movie of surfaces that for intellectual property reasons was unable to quote from her poetry or her novel, the very reason people should care about the more intimate details of her life. Plath, rejected so harshly by the publishing industry while alive, has been embraced, a little too much, in her afterlife. Think of the fans who pillaged her headstone, repeatedly, trying to remove the reference to Ted Hughes in her last name.
Like many other teenage girl poets, I loved Plath not only for her more controversial well-known lines, but for her quieter poems. Take these lines from “Poppies in October”: “O my God, what am I / That these late mouths should cry open / In a forest of frost, in a dawn of cornflowers.” There was something addictive about Plath’s poetry, something powerful, subversive, stunning, and catchy.
I possessed all the other attributes of the clichéd fangirl—the all-black wardrobe; the perfectionistic, withdrawn, cynical moods. I delved into Anne Sexton, a very different poet, only because of her friendship with Plath. In short, like all fangirls, I aped Plath based on my limited information and my love of The Bell Jar, a book I might not have had access to as an American reader had she lived.
While Plath’s poetry is filled with mystery and myths, a rare violent beauty, and words about motherhood, the media, her fans, and her critics have been less interested in all of these than in her death obsession and lines like “Every woman adores a Fascist” in the poem “Daddy”. We’ll never know the calculus that caused her to turn on the oven and stick her head inside, though we can sense the tortured energy that created the poems in Ariel.
We can fool ourselves that her words allow us to divine who she was, the cause and effect of her life, but to believe that we have succeeded after all these years? It’s reflective of an adolescent spirit in our culture.
Writing a biography is hard. You patch together events that taken all together may not have any sort of arc. Often you rely on the reader’s intrinsic interest in the person to propel him forward through the text, through minutiae about what movie somebody saw on a particular day or what they thought about a political topic. In order to get inside the person to whom you can’t speak, you comb his or her letters, diaries, emails, grocery lists, searching for the sparks of insight that string together the events, making sense of them. With artists and writers, there is an illusion that you can get closer because there is always the work.
Consider how much more difficult it is to construct a biography of someone who has been marked as mentally ill. The mark that psychiatrists leave is a mark of unreliable narration. The process of memory even among the sane is a process of reconstruction, of drawing from actual recollection as well as one’s expectations, biases, prior knowledge and even later-acquired knowledge. We are, all of us, fictionalizing our own lives all the time, though most of us do not interrogate our memories, do not question how much of what we remember is real and how much construction.
Now imagine the mentally ill person, sitting across from a psychiatrist who tells him his perception of the world is wrong. The impact of being branded that way crosses over into memories. You are remembering and reconstructing from a vastly different position than the person who is not the subject of interrogation. It crosses over into your diaries, your emails, your letters, yes, sometimes even your grocery list.
David Foster Wallace experienced multiple courses of ECT over the course of his life. According to his mother, after six rounds in Arizona, he emerged “fragile as a child”. There is no way for a family member, for a biographer, for anyone outside to get inside that, to offer the truth.
On Salon, a writer debated the good and bad points of the ending of the audiobook of D.T. Max’s David Foster Wallace biography, Every Love Story Is a Ghost Story, which I had just listened to and which seems to embrace the medical model. After a few unadorned lines that state Wallace hanged himself and a few lines from a character’s death in Infinite Jest, the narrator ends, “This is not an ending anyone would have wanted for him, but it was the one he had chosen.” As the writer points out, this ending leaves the listener with a (weak) facsimile of the emotion that hit Wallace’s friends and family when they heard of his suicide.
The writer on Salon noted that by ending starkly rather than with the details that might satisfy some sordid gossipy urge, D.T. Max chose not to mythologize Wallace. I think that in fact there was simply no way to write about Wallace’s death in a way that could do the reality justice, no way to avoid the mythmaking, the fictionalizing, that occurs around a break in ordinary emotions. There is no way, within the genre of literary biography, to capture anything more than the potentially thin connections between a life’s events and a mind’s work. It is the absence of concrete details that creates myth. Good as the biography is, it takes an easy way out of Wallace’s story by allowing us to fill in whatever details we want; whatever prejudices and biases we bring to his story, we leave with. There is nothing wrong with that. Mythologizing is the way we make meaning.
I wonder if we have no magic bullet for mental illness because current psychiatric diagnosis cannot separate the “sickness”, the “malady”, from the person. ECT approximates a magic bullet by erasing memories, erasing the many narrative connections that make up the self.
Unlike biographies that end in death at the hands of others or by natural causes, literary biographies of people who have killed themselves are often faulted for their handling of the death—either for being overdramatic or for not offering enough. I fall into the camp of wanting the biographer to offer more, rather than less. Our only connections to tragedy and disorder and the darker spaces of the human soul shouldn’t be television characters. As writers, too often we shy away from the messy, dark, uncomfortable reality, opting instead for an elegance in our endings, as D.T. Max did in his biography of David Foster Wallace. It’s the kind of ending that allows the reader to create a reality in the unsaid. But as an audience, we seek the connection that a strong narrative, whether in biography or fiction, provides. We live for it.
Listen to Anita read this essay: