When It Comes to Eating Disorders, Do We as a Society Know What We’re Dealing With?
This week is National Eating Disorders Awareness Week. Not that you’d know it. Three days in, Angelina Jolie’s leg at Sunday’s Oscars is still getting a lot more attention than the eating disorder usually associated with unnaturally thin and awkwardly poky limbs.
There’s a reason for that. Eating disorders make for an uncomfortable conversation topic. For one, they are incredibly insidious and prevalent. About 8 million individuals living in the United States have an eating disorder (7 million of them women and girls), and most never get the treatment they need. Essentially, this means everyone knows at least one person with a mental illness. Most people don’t like to think about that.
And then there’s the fact that eating disorders are deadly. Literally. Eating disorders are the mental illness with the highest mortality rate in the United States, and it is estimated that some 20 percent of those suffering from anorexia will die prematurely.
But mostly, many of us don’t like to talk about eating disorders because we’ve been very close to a sufferer.
When I was 16, I arrived in the United States as a foreign exchange student from Denmark with a complicated, but manageable, case of bulimia, weighing approximately 110 pounds. I went home a year later with full-blown anorexia, weighing about 80 pounds (which, considering my height, indicates a Body Mass Index of about 9.5, i.e., severely underweight). It took fifteen years, three bouts of therapy, and a pregnancy to finally root out all obsessive thinking about food.
Now, when I think about food, it’s not obsessive. I may really want a cinnamon roll or crave salt. And I definitely get so hungry that thoughts of food take over until I eat. But I no longer think about the quantities I have eaten, or about whether or not I deserve food.
To say that this is “liberating” is beside the point. On the one hand, it feels like I am finally alive, that I can now concentrate on the real colors and textures of life. On the other, to be honest, it’s so basic that it’s mundane. When I don’t think about my past, I forget I ever obsessed over food. I eat when I am hungry and don’t when I am not. It seems uncomplicated, somehow. Yet, of course, I know it’s not.
There are any number of books and articles linking eating disorders to the depiction of boyishly (and unnaturally) thin women in fashion, movies, and television. In my experience, fashion has very little to do with it. Fashion will make perfectly well-proportioned and healthy women and girls want to lose five pounds, go on fad diets, and be miserable. Fashion dictates clothing that looks better on a stick-insect than on a person.
Fashion alone, however, does not make you live off half an apple a day. Fashion does not make you develop so much scar tissue in your throat that your gag reflex is inhibited and you need to use toothbrushes and pens to make yourself throw up. Fashion may be an impetus to lose weight. But it’s something else that makes you stop eating altogether.
And that something else is control. At least in my case.
I am not a reductionist, and years of working closely with victims of all kinds have taught me that while we are all experts on our own suffering, we can be woefully blind to the solutions that are necessary to deal with that of others.
Even so, I believe letting go of control is key to recovery for many. When my mother instituted a system of rewards for pounds gained with target body weight on certain dates, I’d carefully gain the required weight on the required date, then proceed to lose it all and more in the days after. In short, I never let go of control.
By contrast, when I returned to therapy briefly while dealing with a bout of bulimia during a particularly stressful time in my life, my therapist told me not to worry about it. “In the grand scheme of things,” he said, “you are just sticking a finger down your throat. Really, is that so bad? Just remember to brush your teeth.” My purging immediately got less frequent and then disappeared—it didn’t feel so imperative after I stopped worrying about it. In short, I gave myself license to let go.
The point is not that a cavalier attitude always generates the desired change. In this case, my therapist knew from previous interactions who I was and what my reaction was likely to be.
The point is that the perfection those suffering from eating disorders are longing for in themselves in most cases is neither physical nor real. It’s not that I wanted to be thin, or even that I liked my emaciated body. It’s that I was scared out of my wits of failing.
We will need to overcome our societal inability to see errors for what they are—an opportunity to learn—if we want to deal with eating disorders. So if you want to do something to counter eating disorders this week, sure, eat healthily and don’t assume that anyone who’s not a size 2 is unhealthy, stupid, or both.
But more than that: show through your actions that you appreciate effort and honest errors more than caution and control. It may seem three steps removed from eating disorders and food. Trust me, it is not.
As a chronically ill, chronically poor person, I have feelings about when, why, and how the phrase “self-care” is invoked. When International Self-Care Day came to my attention, I realized that while I laud the effort to prevent some of the 16 million premature deaths the World Health Organization reports occur every year from noncommunicable diseases, the American notion of self-care—ironically—needs some work.
I propose a shift in the use of “self-care” that creates space for actual care apart from the extra kindnesses and important, small indulgences that may be part of our self-care rituals, depending on our ability to access such activities. How we think about what constitutes vital versus optional care affects whether/when we do those things we should for our health and well-being. Some of what we have come to designate as self-care—getting sufficient sleep, treating chronic illness, allowing ourselves needed sick days—shouldn’t be seen as optional; our culture should prioritize these things rather than praising us when we scrape by without them.
International Self-Care Day began in China and has spread over the past few years to other countries, along with an effort to win official recognition at the United Nations for July 24 (get it? 7/24: 24 hours a day, 7 days a week) as an important advocacy day. The online academic journal SelfCare calls its namesake “a very broad concept” that by definition varies from person to person.
“Self-care means different things to different people: to the person with a headache it might mean buying a tablet, but to the person with a chronic illness it can mean every element of self-management that takes place outside the doctor’s office,” according to SelfCare. “[I]n the broadest sense of the term, self-care is a philosophy that transcends national boundaries and the healthcare systems which they contain.”
In short, self-care was never intended to be the health version of duct tape—a way to patch ourselves up when we’re in pieces from the outrageous demands of our work-centric society. It’s supposed to be part of our preventive care plan alongside working out, eating right, getting enough sleep, and/or other activities that are important for our personalized needs.
The notion of self-care has gotten a recent visibility boost as those of us who work in human rights and/or are activists encourage each other publicly to recharge. Most of the people I know who remind themselves and those in our movements to take time off do so to combat the productivity anxiety embedded in our work. We’re underpaid and overworked, but still feel guilty taking a break or, worse, spending money on ourselves when it could go to something movement- or bill-related.
The guilt is intensified by our capitalist system having infected the self-care philosophy, much as it seems to have infected everything else. Our bootstrap, do-it-yourself culture demands we work to the point of exhaustion—some of us because it’s the only way to come close to making ends meet, and others because putting work or career first is expected and applauded. Our previous president, George W. Bush, called it “uniquely American” that someone at his Omaha, Nebraska, event promoting “reform” of (aka cuts to) Social Security worked three jobs.
“Uniquely American, isn’t it?” he said. “I mean, that is fantastic that you’re doing that. (Applause.) Get any sleep? (Laughter.)”
The audience was applauding working hours that are disastrous for health and well-being, laughing at sleep as though our bodies don’t require it to function properly. Bush actually nailed it: Throughout our country, we hold Who Worked the Most Hours This Week competitions and attempt to one-up the people at the coffee shop, bar, gym, or book club with what we accomplished. We have reached a point where we consider getting more than five or six hours of sleep a night to be “self-care” even though it should simply be part of regular care.
Most of us know intuitively that, in general, we don’t take good enough care of ourselves on a day-to-day basis. This isn’t something that just happened; it’s a function of our work culture. Don’t let the statistic that we work on average 34.4 hours per week fool you—that includes people working part time by choice or necessity, which distorts the reality for those of us who work full time. (Full time is defined by the Internal Revenue Service as 30 or more hours per week.) Gallup’s annual Work and Education Survey conducted in 2014 found that 39 percent of us work 50 or more hours per week. Only 8 percent of us on average work less than 40 hours per week. Millennials are projected to enjoy a lifetime of multiple jobs or a full-time job with one or more side hustles via the “gig economy.”
Despite worker productivity skyrocketing during the past 40 years, we don’t work fewer hours or make more money once cost of living is factored in. As Gillian White outlined at the Atlantic last year, despite politicians and “job creators” blaming financial crises for wage stagnation, it’s more about priorities:
Though productivity (defined as the output of goods and services per hours worked) grew by about 74 percent between 1973 and 2013, compensation for workers grew at a much slower rate of only 9 percent during the same time period, according to data from the Economic Policy Institute.
It’s no wonder we don’t sleep. The Centers for Disease Control and Prevention (CDC) has been sounding the alarm for some time. The American Academy of Sleep Medicine and the Sleep Research Society recommend people between 18 and 60 years old get seven or more hours of sleep each night “to promote optimal health and well-being.” The CDC website has an entire section under the heading “Insufficient Sleep Is a Public Health Problem,” outlining statistics and negative outcomes from our inability to find time to tend to this most basic need.
We also don’t get to the doctor when we should for preventive care. Roughly half of us, according to the CDC, never visit a primary care or family physician for an annual check-up. We go in when we are sick, but not to have screenings and discuss a basic wellness plan. And rarely do those of us who do go tell our doctors about all of our symptoms.
I recently had my first really wonderful check-up with a new primary care physician, who made a point of asking about all the “little things,” leading her to encourage me to consider further diagnosis for fibromyalgia. I started crying in her office, relieved that someone had finally listened, and at the idea that my headaches, difficulty sleeping and recovering from illness, exhaustion, and pain might have an actual source.
Considering our deeply-ingrained priority problems, it’s no wonder that when I post on social media that I’ve taken a sick day—a concept I’ve struggled with after 20 years of working multiple jobs, often more than 80 hours a week trying to make ends meet—people applaud me for “doing self-care.” Calling my sick day “self-care” tells me that the commenter sees my post-traumatic stress disorder or depression as something I could work through if I so chose, amplifying the stigma I’m pushing back on by owning that a mental illness is an appropriate reason to take off work. And it’s not the commenter’s fault; the notion that working constantly is a virtue is so pervasive, it affects all of us.
Things in addition to sick days and sleep that I’ve had to learn are not self-care: going to the doctor, eating, taking my meds, going to therapy, turning off my computer after a 12-hour day, drinking enough water, writing, and traveling for work. Because it’s so important, I’m going to say it separately: Preventive health care—Pap smears, check-ups, cancer screenings, follow-ups—is not self-care. We do extras and nice things for ourselves to prevent burnout, not as bandaids to put ourselves back together when we break down. You can’t bandaid over skipping doctor’s appointments, not sleeping, and working your body until it’s a breath away from collapsing. If you’re already at that point, you need straight-up care.
Plenty of activities are self-care! My absolutely not comprehensive personal list includes: brunch with friends, adult coloring (especially the swear word books and glitter pens), soy wax with essential oils, painting my toenails, reading a book that’s not for review, a glass of wine with dinner, ice cream, spending time outside, last-minute dinner with my boyfriend, the puzzle app on my iPad, Netflix, participating in Caturday, and alone time.
My someday self-care wish list includes things like vacation, concerts, the theater, regular massages, visiting my nieces, decent wine, the occasional dinner out, and so very, very many books. A lot of what constitutes self-care is rather expensive (think weekly pedicures, spa days, and hobbies with gear and/or outfit requirements)—which leads to the privilege of getting to call any part of one’s routine self-care in the first place.
It would serve us well to consciously bring an intersectional view to our enthusiasm for self-care before encouraging others to engage in activities that may be out of reach financially, may disregard disability, or may not be right for them for a variety of other reasons, including compounded oppression and violence, which affect women of color differently.
I hear the term self-care a lot, and often it is defined as practicing yoga, journaling, speaking positive affirmations, and meditating. I agree that those are successful and inspiring forms of self-care, but what we often don’t hear people talking about is self-care at the intersection of race and trauma, social justice, and, most importantly, the unawareness of repressed emotional issues that makes us victims of our past.
The often-quoted Audre Lorde wrote in A Burst of Light: “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare.”
But as we continue to talk about self-care, we need to be clear about the difference between self-care and actual care and work to bring the necessities of life within reach for everyone. Actual care should not have to be optional. It should be a priority in our culture so that it can be a priority in all our lives.
Less than two weeks after the Supreme Court rejected the anti-choice movement’s unscientific claims about how abortion restrictions make patients safer, the National Right to Life Convention hosted a slate of anti-choice “experts,” who promoted even more dubious claims that fly in the face of accepted medical science.
In a series of workshops over the three-day conference in Herndon, Virginia, self-proclaimed medical and scientific experts, including several whose false claims have been exposed by Rewire, renewed their efforts to promote the purported links between abortion and a host of negative outcomes, including breast cancer and mental health problems.
Some of those who spoke at the convention were stalwarts featured in the Rewire series “False Witnesses,” which exposed the anti-choice movement’s attempts to mislead lawmakers, courts, and the public about abortion care.
During a Thursday session titled “The Abortion-Breast Cancer Link: The Biological Basis, The Studies, and the Fraud,” Angela Lanfranchi, one of Rewire’s “False Witnesses,” pushed her debunked talking points.
Throughout the presentation, which was attended by Rewire, Lanfranchi argued that there is “widespread fraudulent behavior among scientists and medical organizations to obfuscate the link” between abortion and breast cancer.
In a statement, the irony of which may have been lost on many in the room, Lanfranchi told attendees that sometimes “scientists in the pursuit of truth can be frauds.” She went on to point to numerous studies and texts she claimed supported her theories, and lamented that textbooks that had suggested a link between abortion and breast cancer in the ’90s were later updated to exclude the claim.
Lanfranchi later pivoted to note her inclusion in Rewire’s “False Witnesses” project, which she deemed an “attack.”
“We were one of 14 people that were on this site … as liars,” said Lanfranchi as she showed a slide of the webpage. “Now when people Google my name, instead of my practice coming up,” Rewire’s story appears.
Priscilla Coleman, another “False Witness” best known for erroneously claiming that abortion causes mental health problems and drug abuse, similarly bemoaned her inclusion in Rewire’s project during her brief participation in a Thursday session, “The Conspiracy of Silence: Roadblocks to Getting Abortion Facts to the Public.”
After claiming that there is ample evidence that abortion is associated with suicide and eating disorders, Coleman suggested that many media outlets were blocking the truth by not reporting on her findings. When it came to Rewire, Coleman wrote the outlet off as a part of the “extreme left,” telling the room that “if you look deeply into their analysis of each of our backgrounds, a lot of it is lies … it’s bogus information.”
An extensive review conducted by the American Psychological Association in 2008, however, found “no evidence sufficient to support” claims such as Coleman’s that “an observed association between abortion history and mental health was caused by the abortion.”
Rounding out the medical misinformation pushed in that session was Eve Sanchez Silver, the director and founder of the International Coalition of Color for Life. According to the biography listed on her organization’s website, Silver bills herself as a “bioethicist” who focuses on “the Abortion-Breast cancer link.”
Silver, who previously worked at the Susan G. Komen Foundation but left, she said, after finding out the organization gave money to Planned Parenthood, spent much of her presentation arguing that abortion increases the risk of breast cancer. She also detailed what she referred to as the “Pink Money Cycle,” a process in which, as she explained, money is given to Komen, which in turn donates to Planned Parenthood. As Silver told it, Planned Parenthood then gives people abortions, leading to more cases of breast cancer.
The seemingly conspiracy-driven theory has popped up in several of Silver’s presentations over the years.
Though Komen does in fact provide some funding to Planned Parenthood through grants, a July 2015 press release from the breast cancer organization explains that it does “not and never [has] funded abortion or reproductive services at Planned Parenthood or any grantee.” Instead, the money Planned Parenthood receives from Komen “pays for breast health outreach and breast screenings for low-income, uninsured or under-insured individuals.”
On Saturday, another subject of Rewire’s “False Witnesses” series, endocrinologist Joel Brind, doubled down on his claims about the link between abortion and breast cancer in a workshop titled “New American Export to Asia: The Cover-Up of the Abortion-Breast Cancer Link.”
Brind described the Indian subcontinent as the ideal place to study the purported link between abortion and breast cancer. According to Brind, “The typical woman [there] has gotten married as a teenager, started having kids right away, breastfeeds all of them, has lots of them, never smokes, never drinks, what else is she going to get breast cancer from? Nothing.”
When it came to research from Asia that didn’t necessarily support his conclusions about abortion and breast cancer, Brind chalked it up to an international cover-up effort, “spearheaded, obviously, by our own National Cancer Institute.”
Although five states require counseling for abortion patients that includes the supposed link between abortion and breast cancer, Brind told Rewire that the link has become “the kind of thing that legislators don’t want to touch” because they would be going “against what all of these medical authorities say.”
Brind also dedicated a portion of his presentation to promoting the purported cancer-preventing benefits of glycine, which he sells in supplement form through his company, Natural Food Science LLC.
“If I sprain my ankle it doesn’t swell up, the injury will just heal,” Brind claimed, citing the supposed effects of glycine on inflammation.
In a Thursday session on “the rise of the DIY abortion,” panelist Randall O’Bannon questioned the U.S. Food and Drug Administration’s (FDA) March update to regulations on mifepristone, a drug also known as RU-486 that is used in medication abortions. Noting that the drug is “cheap,” O’Bannon appeared to fret that the new regulations might make abortion more accessible, going on to claim that there could be “a push to make [the drug] available over the counter.”
O’Bannon claimed there are “documented safety issues” associated with the drug, but the FDA says mifepristone is “safe and effective.” A 2011 post-market study by the agency found that more than 1.5 million women in the United States had used the drug to end a pregnancy since its approval; of those, roughly 2,200 experienced an “adverse event.” According to the Association of Reproductive Health Professionals, mifepristone “is safer than acetaminophen,” aspirin, and Viagra.
Speculating that misoprostol, another drug used in medication abortions, was less effective than medical experts say, O’Bannon later suggested that more embryos would “survive” abortions, leading to “increased numbers of births with children with club feet, webbed toes, and fingers [and] full and partial facial paralysis.”
According to the World Health Organization, “Available data regarding a potential risk of fetal abnormality after an unsuccessful medical abortion are limited and inconclusive.”