Are you unhappy? Bloated? Is it hard to concentrate? Do you have food cravings? Breast tenderness?
If you read the Diagnostic and Statistical Manual of Mental Disorders (DSM),
published by the American Psychiatric Association, you will find your
symptoms listed under “premenstrual dysphoric disorder” (PMDD). In
other words, because of those symptoms, a therapist or doctor could
label you as having a mental disorder.
DSM is the bible of psychiatric diagnosis, used by nearly every
hospital, clinic, doctor and insurance company, as well as Medicare and
Medicaid. Since PMDD first was mentioned in the DSM in 1987, people
have received the mistaken impression that it’s real and that it’s a
mental illness. With the manual’s fifth edition currently in
preparation, that notion seems likely to be strengthened rather than
dispelled. Contrary to popular
opinion, the creation and use of psychiatric categories is rarely based
on solid science, as I learned when I served on two DSM committees. The
absence of science leaves a void into which every conceivable kind of
bias has been found to flow—including sexism. The DSM’s own PMDD
committee reviewed more than 500 studies for the 1994 edition and
concluded that no high-quality research supported the existence of
PMDD, yet PMDD was placed in the manual anyway.
Do some women report feeling worse before their periods than at other
times of the month? Certainly, although in some countries and cultures
more than others. Premenstrual discomforts are also more often reported
by women who were sexually abused as children, are struggling with
abuse or harassment, or are just plain overburdened. But that is worlds
away from a mental illness.
Powerful DSM authors proposed adding PMDD to the next edition of the
manual in the mid-1980s. It would represent an
extreme form of PMS—the popularly accepted “syndrome” of physical and
emotional symptoms between ovulation and menstruation. To qualify, it
would have to include five familiar PMS-type symptoms, at least one of
them a “mood disorder” such as feeling hopeless, “on edge,”
self-deprecating, irritable, angry or tearful. No one keeps
comprehensive records of how often a PMDD diagnosis is given, but based
on PMDD committee estimates, approximately half a million American
women could be given the PMDD label.
Hundreds of researchers have tried unsuccessfully to prove that women
are more likely to have mood problems premenstrually than at other
times. University of British Columbia researcher Christine Hitchcock
says, “Something like half of women say they have premenstrual
problems, but when you ask them to keep daily ratings of their moods,
the data don’t reflect that.” Another study showed that men identified PMDD symptoms in themselves as commonly as women did.
Despite this, when Eli Lilly and Company’s patent on antidepressant
Prozac was about to expire, the pharmaceutical giant successfully asked
the Food and Drug Administration to approve it to treat PMDD, providing
a patent extension worth millions. Eli Lilly repackaged Prozac in pink
and purple and rechristened it the feminine-sounding “Sarafem.” Other
drug companies rushed to market similar products. They deliberately
listed physical problems associated with menstruation for
some women, such as breast tenderness or bloating, and added a list of
mood problems from the PMDD list that virtually every human being experiences at one time or another.
The PMDD mood symptoms are also listed for menopause, although they are supposedly caused at menopause by deficiency
in the hormones whose increase supposedly causes PMDD. I half-jokingly
predicted that we would soon hear about premenarcheal dysphoric
disorder between a baby girl’s birth and her first period, thus
pathologizing women’s moods from birth to death.
Women should be wary of believing claims that high-tech research has
now proven that PMDD is real. We should also advocate a national
conversation—even congressional hearings—about the often hidden,
devastating consequences of simply being given diagnostic labels such
as PMDD. Finally, we should stop pathologizing ourselves and other
women and help each other look at what’s really behind our feelings.
The full text of this article appears in the Summer issue of Ms. magazine, available on newsstands or by joining the Ms. community at www.msmagazine.com.
It's time for a shift in the use of “self-care” that creates space for actual care apart from the extra kindnesses and important, small indulgences that may be part of our self-care rituals, depending on our ability to access such activities.
As a chronically ill, chronically poor person, I have feelings about when, why, and how the phrase “self-care” is invoked. When International Self-Care Day came to my attention, I realized that while I laud the effort to prevent some of the 16 million people the World Health Organization reports die prematurely every year from noncommunicable diseases, the American notion of self-care—ironically—needs some work.
I propose a shift in the use of “self-care” that creates space for actual care apart from the extra kindnesses and important, small indulgences that may be part of our self-care rituals, depending on our ability to access such activities. How we think about what constitutes vital versus optional care affects whether/when we do those things we should for our health and well-being. Some of what we have come to designate as self-care—getting sufficient sleep, treating chronic illness, allowing ourselves needed sick days—shouldn’t be seen as optional; our culture should prioritize these things rather than praising us when we scrape by without them.
International Self-Care Day began in China, and it has spread over the past few years to include other countries and an effort seeking official recognition at the United Nations of July 24 (get it? 7/24: 24 hours a day, 7 days a week) as an important advocacy day. The online academic journal SelfCare calls its namesake “a very broad concept” that by definition varies from person to person.
“Self-care means different things to different people: to the person with a headache it might mean buying a tablet, but to the person with a chronic illness it can mean every element of self-management that takes place outside the doctor’s office,” according to SelfCare. “[I]n the broadest sense of the term, self-care is a philosophy that transcends national boundaries and the healthcare systems which they contain.”
In short, self-care was never intended to be the health version of duct tape—a way to patch ourselves up when we’re in pieces from the outrageous demands of our work-centric society. It’s supposed to be part of our preventive care plan alongside working out, eating right, getting enough sleep, and/or other activities that are important for our personalized needs.
The notion of self-care has gotten a recent visibility boost as those of us who work in human rights and/or are activists encourage each other publicly to recharge. Most of the people I know who remind themselves and those in our movements to take time off do so to combat the productivity anxiety embedded in our work. We’re underpaid and overworked, but still feel guilty taking a break or, worse, spending money on ourselves when it could go to something movement- or bill-related.
The guilt is intensified by our capitalist system having infected the self-care philosophy, much as it seems to have infected everything else. Our bootstrap, do-it-yourself culture demands we work to the point of exhaustion—some of us because it’s the only way to almost make ends meet and others because putting work/career first is expected and applauded. Former President George W. Bush called it “uniquely American” that someone at his Omaha, Nebraska, event promoting “reform” of (aka cuts to) Social Security worked three jobs.
“Uniquely American, isn’t it?” he said. “I mean, that is fantastic that you’re doing that. (Applause.) Get any sleep? (Laughter.)”
The audience was applauding working hours that are disastrous for health and well-being, laughing at sleep as though our bodies don’t require it to function properly. Bush actually nailed it: Throughout our country, we hold Who Worked the Most Hours This Week competitions and attempt to one-up the people at the coffee shop, bar, gym, or book club with what we accomplished. We have reached a point where we consider getting more than five or six hours of sleep a night to be “self-care” even though it should simply be part of regular care.
Most of us know intuitively that, in general, we don’t take good enough care of ourselves on a day-to-day basis. This isn’t something that just happened; it’s a function of our work culture. Don’t let the statistic that we work on average 34.4 hours per week fool you—that includes people working part time by choice or necessity, which distorts the reality for those of us who work full time. (Full time is defined by the Internal Revenue Service as 30 or more hours per week.) Gallup’s annual Work and Education Survey conducted in 2014 found that 39 percent of us work 50 or more hours per week. Only 8 percent of us on average work less than 40 hours per week. Millennials are projected to enjoy a lifetime of multiple jobs or a full-time job with one or more side hustles via the “gig economy.”
Despite worker productivity skyrocketing during the past 40 years, we don’t work fewer hours or make more money once cost of living is factored in. As Gillian White outlined at the Atlantic last year, despite politicians and “job creators” blaming financial crises for wage stagnation, it’s more about priorities:
Though productivity (defined as the output of goods and services per hours worked) grew by about 74 percent between 1973 and 2013, compensation for workers grew at a much slower rate of only 9 percent during the same time period, according to data from the Economic Policy Institute.
It’s no wonder we don’t sleep. The Centers for Disease Control and Prevention (CDC) has been sounding the alarm for some time. The American Academy of Sleep Medicine and the Sleep Research Society recommend people between 18 and 60 years old get seven or more hours sleep each night “to promote optimal health and well-being.” The CDC website has an entire section under the heading “Insufficient Sleep Is a Public Health Problem,” outlining statistics and negative outcomes from our inability to find time to tend to this most basic need.
We also don’t get to the doctor when we should for preventive care. Roughly half of us, according to the CDC, never visit a primary care or family physician for an annual check-up. We go in when we are sick, but not to have screenings and discuss a basic wellness plan. And rarely do those of us who do go tell our doctors about all of our symptoms.
I recently had my first really wonderful check-up with a new primary care physician who made a point of asking about all the “little things,” which led her to encourage me to consider further diagnosis for fibromyalgia. I started crying in her office, relieved that someone had finally listened and at the idea that my headaches, difficulty sleeping and recovering from illness, exhaustion, and pain might have an actual source.
Considering our deeply-ingrained priority problems, it’s no wonder that when I post on social media that I’ve taken a sick day—a concept I’ve struggled with after 20 years of working multiple jobs, often more than 80 hours a week trying to make ends meet—people applaud me for “doing self-care.” Calling my sick day “self-care” tells me that the commenter sees my post-traumatic stress disorder or depression as something I could work through if I so chose, amplifying the stigma I’m pushing back on by owning that a mental illness is an appropriate reason to take off work. And it’s not the commenter’s fault; the notion that working constantly is a virtue is so pervasive, it affects all of us.
Things in addition to sick days and sleep that I’ve had to learn are not self-care: going to the doctor, eating, taking my meds, going to therapy, turning off my computer after a 12-hour day, drinking enough water, writing, and traveling for work. Because it’s so important, I’m going to say it separately: Preventive health care—Pap smears, check-ups, cancer screenings, follow-ups—is not self-care. We do extras and nice things for ourselves to prevent burnout, not as band-aids to put ourselves back together when we break down. You can’t band-aid over skipping doctor’s appointments, not sleeping, and working your body until it’s a breath away from collapsing. If you’re already at that point, you need straight-up care.
Plenty of activities are self-care! My absolutely not comprehensive personal list includes: brunch with friends, adult coloring (especially the swear word books and glitter pens), soy wax with essential oils, painting my toenails, reading a book that’s not for review, a glass of wine with dinner, ice cream, spending time outside, last-minute dinner with my boyfriend, the puzzle app on my iPad, Netflix, participating in Caturday, and alone time.
My someday self-care wish list includes things like vacation, concerts, the theater, regular massages, visiting my nieces, decent wine, the occasional dinner out, and so very, very many books. A lot of what constitutes self-care is rather expensive (think weekly pedicures, spa days, and hobbies with gear and/or outfit requirements)—which leads to the privilege of getting to call any part of one’s routine self-care in the first place.
It would serve us well to consciously add an intersectional view to our enthusiasm for self-care when encouraging others to engage in activities that may be out of reach financially, may disregard disability, or may not be right for them for a variety of other reasons, including compounded oppression and violence, which affects women of color differently.
I hear the term self-care a lot and often it is defined as practicing yoga, journaling, speaking positive affirmations and meditation. I agree that those are successful and inspiring forms of self-care, but what we often don’t hear people talking about is self-care at the intersection of race and trauma, social justice and most importantly, the unawareness of repressed emotional issues that make us victims of our past.
The often-quoted Audre Lorde wrote in A Burst of Light: “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare.”
But as we continue to talk about self-care, we need to be clear about the difference between self-care and actual care and work to bring the necessities of life within reach for everyone. Actual care should not have to be optional. It should be a priority in our culture so that it can be a priority in all our lives.
When the U.S. Supreme Court sent a case about faith-based objections to the Affordable Care Act's contraceptive mandate back to lower courts, it left students at religious colleges and universities with continuing uncertainty about getting essential health care. And that's not what religious freedom is about.
Students choose which university to attend for a variety of reasons: the programs offered, the proximity of campus to home, the institution’s reputation, the financial assistance available, and so on. But young people may need to ask whether their school is likely to discriminate in the provision of health insurance, including contraceptive coverage.
In Zubik v. Burwell, a group of cases sent back to the lower courts by the U.S. Supreme Court in May, a handful of religiously affiliated universities sought the right to deny their students, faculty, and staff access to health insurance coverage for contraception.
This isn’t just a legal debate for me. It’s personal. The private university where I attend law school, Georgetown University in Washington, D.C., currently complies with provisions in the Affordable Care Act that make it possible for a third-party insurer to provide contraceptive access to those who want it. But some hope that these legal challenges to the ACA’s birth control rule will reverse that.
Georgetown University Law Center refused to provide insurance coverage for contraception before the accommodation was created in 2012. Without a real decision by the Supreme Court, my access to contraception insurance will continue to be at risk while I’m in school.
I’m not alone. Approximately 1.9 million students attend religiously affiliated universities in the United States, according to the Council for Christian Colleges and Universities. We students chose to attend these institutions for lots of reasons, many of which have nothing to do with religion. I decided to attend Georgetown University Law Center because I felt it was the right school for me to pursue my academic and professional goals, it’s in a great city, it has an excellent faculty, and it has a vibrant public-interest law community.
Like many of my fellow students, I am not Catholic and do not share my university’s views on contraception and abortion. Although I was aware of Georgetown’s history of denying students’ essential health-care benefits, I did not think I should have to sacrifice the opportunity to attend an elite law school because I am a woman of reproductive age.
That’s why, as a former law clerk for Americans United for Separation of Church and State, I helped to organize a brief before the high court on behalf of 240 students, faculty, and staff at religiously affiliated universities including Fordham, Georgetown, Loyola Marymount, and the University of Notre Dame.
Our brief defended the sensible accommodation crafted by the Obama administration. That compromise relieves religiously affiliated nonprofit organizations of any obligation to pay for or otherwise provide contraception coverage; in fact, they don’t have to pay a dime for it. Once the university informs the government that it does not want to pay for birth control, a third-party insurer steps in and provides coverage to the students, faculty, and staff who want it.
Remarkably, officials at the religious colleges still challenging the Affordable Care Act say this deal is not good enough. They’re arguing that the mere act of informing the government that they do not want to do something makes them “complicit” in the private decisions of others.
Such an argument stands religious freedom on its head in an attempt to impose one group’s theological beliefs on others by vetoing the third-party insurance providers’ distribution of essential health coverage to students, faculty, and staff.
This should not be viewed as some academic debate confined to legal textbooks and court chambers. It affects real people—most of them women. Studies by the Guttmacher Institute and other groups that study human sexuality have shown that use of artificial forms of birth control is nearly universal among sexually active women of childbearing years. That includes Catholic women, who use birth control at the same rate as non-Catholics.
Indeed, contraception is essential health care, especially for students. An overwhelming number of young people’s pregnancies are unplanned, and having children while in college or a graduate program typically delays graduation, increases the likelihood that the parent will drop out, and may affect their future professional paths.
Additionally, many menstrual disorders make it difficult to focus in class; contraception alleviates the symptoms of a variety of illnesses, and it can help women actually preserve their long-term fertility. For example, one of the students who signed our brief told the Court, “Without birth control, I experience menstrual cycles that make it hard to function in everyday life and do things like attend class.” Another woman who signed the brief told the Court, “I have a history of ovarian cysts and twice have required surgery, at ages 8 and 14. After my second surgery, the doctor informed me that I should take contraceptives, because if it happened again, I might be infertile.”
For these and many other reasons, women want and need convenient access to safe, affordable contraceptives. It is time for religiously affiliated institutions—and the Supreme Court—to acknowledge this reality.
Because we still don’t have a final decision from the Supreme Court, incoming students cannot consider ease of access to contraception in deciding where to attend college, and they may risk committing to attend a university that will be legally allowed to discriminate against them. A religiously affiliated university may be in all other regards a perfect fit for a young woman. It’s unfair that she should have to risk access to essential health care to pursue academic opportunity.
Religious liberty is an important right—and that’s why it should not be misinterpreted. Historically, religious freedom has been defined as the right to make decisions for yourself, not others. Religious freedom gives you the right to determine where, how, and if you will engage in religious activities.
It does not, nor should it ever, give one person or institution the power to meddle in the personal medical decisions of others.