When I was pregnant with my second child, I had very recently miscarried, and was terrified that there was something wrong with the new pregnancy. It was with those wary eyes that I noticed the odd behavior of the man performing my 20 week anatomy scan — how he stopped chatting with us for a while and became very intent on scanning one particular area, how he asked as we ended exactly when I would be seeing my doctor for my next prenatal appointment, and how he never offered us a picture of our baby.
I knew there was something wrong with my baby; I just didn’t know what. I waited three days until the appointment, cataloging the different potential issues and thinking to myself exactly what I could…and couldn’t…live with.
My doctor told me shortly after I arrived that there was a problem with the scan and they thought they had detected a club foot. I couldn’t have been more relieved. He suggested we wait two weeks and scan again to see whether it had just been the baby’s position or whether there really was an issue. If there was, we could set up a plan to begin treatment shortly after birth.
My son didn’t have a club foot, we learned at the next ultrasound. He was perfect in every way. And this time, I left with pictures to prove it.
If he had actually had a club foot, we would have spoken to pediatric orthopedics and done so joyfully, happy that that was all that was wrong. I had imagined every other dire diagnosis possible, some I could have accepted. And, frankly, many I could not.
That was all at 20 weeks. By the time I saw my doctor, it was 20 weeks and 3 days. By the time we did the second scan, it was 22 weeks.
If I were in Arizona, it would have all been too late.
A majority of ob-gyns prefer not to do the anatomy scan any earlier than 18 weeks, as there are still major points of fetal development that have yet to complete before then. When a physical problem is found, they often do a follow-up scan a week or two later in order to confirm the initial diagnosis. But with a 20-week ban in place, time is literally not on a woman’s side.
In order to be certain of the physical viability and health of the fetus, doctors would need to rely much more heavily on earlier tests and scans. The quad screen, a blood test that checks for elevated levels of hormones consistent with various neural tube or chromosomal disorders, would be used much more frequently, despite its tendency to produce false positives. More women would seek out nuchal translucency (NT) scans, and more would turn to amniocentesis, whether because of quad screen false positives, worrisome NT scans, or simply as a stand-alone diagnostic, even though the procedure can increase the risk of miscarriage.
All of these tests would greatly increase the medical costs involved with pregnancy, as well as put the pregnancy at risk. But what is even more disturbing is the number of pregnancies that could be lost to a rushed diagnosis.
As one doctor explained when discussing the proposed 20 week ban in Georgia:
“I saw a couple at 21 or 22 weeks. They didn’t have health insurance until then because he was transferring jobs, so they hadn’t had the screening. When they came to see me, they were distraught. The fetus had a heart defect, what turned out to have been a fixable heart defect. But we didn’t know that for sure. We had to go through amniocentesis, which takes a week. That brings us up to 22 and a half weeks. Then we had to counsel this family and convince them that this heart defect was fixable. Because we had the time, we were able to walk them through the process and they didn’t terminate.
“If I had seen them at 19 weeks and four days, and this new law gave them just three days to decide, they would have terminated that pregnancy. We told the Legislature that, but these legislators think that’s not going to happen.
“It’s going to happen.”
There are a number of things that can show up on a scan at one point, only to simply disappear as time progresses. Heart defects, cord issues, and cysts are all soft markers for a variety of trisomies and chromosomal disorders, yet many will clear by a later ultrasound. Without the ability to wait, how many women might rush to a decision just to beat an arbitrary deadline set not by medical professionals but by a group of legislators?
Anti-choice lawmakers think it is so important to make a woman “think it over” that they’ve instituted waiting periods from 24 hours to three days. Yet when it comes to deciding whether a physical or chromosomal disorder would cause too much suffering to actually bring a baby to birth, suddenly legislators want a hard-and-fast deadline, with no exceptions.
A new report confirms what millions of women already know: that women’s choices are not to blame for the gender wage gap. Instead, researchers at the Economic Policy Institute (EPI), the progressive think tank that issued the report, say that women’s unequal pay is driven by “discrimination, social norms, and other factors beyond women’s control.”
This finding—that the gender pay gap is caused by structural factors rather than women’s occupational choices—is surprisingly controversial. Indeed, in my years as a journalist covering women’s economic issues, the subject that has been most frustrating for me to write about has been the gender gap. (Full disclosure: I’ve worked as a consultant for EPI, though not on this particular report.) No other economic topic I’ve covered has been more widely misunderstood, or has been so outrageously distorted by misrepresentations, half-truths, and lies.
That’s because, for decades, conservatives have energetically promoted the myth that the gender pay gap does not exist. They’ve done such a bang-up job of it that denying the reality of the gap, like denying the reality of global warming, has become an article of faith on the right. Conservative think tanks like the Independent Women’s Forum and the American Enterprise Institute and right-wing writers at outlets like the Wall Street Journal, Breitbart, and the Daily Caller have denounced the gender pay gap as “a lie,” “not the real story,” “a fairy tale,” “a statistical delusion,” and “the myth that won’t die.” Sadly, it is not only right-wing propagandists who are gender wage gap denialists. Far more moderate types like Slate’s Hanna Rosin and the Atlantic’s Derek Thompson have also claimed that the gender wage gap statistic is misleading and exaggerates disparities in earnings.
According to the most recent figures available from the Census Bureau, for every dollar a man makes, a woman makes only 79 cents, a statistic that has barely budged in a decade. And that’s just the gap for women overall; for most women of color, it’s considerably larger. Black women earn only 61 percent of what non-Hispanic white men make, and Latinas earn only 55 percent as much. In a recent survey, U.S. women identified the pay gap as their biggest workplace concern. Yet gender wage gap denialists of a variety of political stripes contend that the gender gap statistic—which measures the difference in median annual earnings between men and women who work full-time, year-round—is inaccurate because it does not compare the pay of men and women doing the same work. They argue that when researchers control for traits like experience, type of work, education, and the like, the gender gap evaporates like breath on a window. In short, the denialists frame the gender pay gap as the product not of sexist discrimination, but of women’s freely made choices.
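For readers who want to see the arithmetic behind the headline number, here is a minimal sketch of how the statistic is computed: the ratio of women’s to men’s median annual earnings among full-time, year-round workers. The dollar figures below are hypothetical, chosen only so the ratio matches the cited 79 cents.

```python
# Hypothetical median annual earnings for full-time, year-round workers.
# These are illustrative stand-ins, not Census Bureau data.
median_earnings_men = 50_000
median_earnings_women = 39_500

# The headline statistic is simply the ratio of the two medians.
ratio = median_earnings_women / median_earnings_men
gap = 1 - ratio

print(f"Women earn {ratio:.2f} per dollar men earn")  # 0.79
print(f"Gender wage gap: {gap:.0%}")                  # 21%
```

Note that this is an “unadjusted” figure: it compares medians across the whole full-time workforce, not pay for identical jobs, which is exactly the point of contention the denialists raise.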
The EPI study’s co-author, economist Elise Gould, said in an interview with Rewire that she and her colleagues realized the need for the new report when an earlier paper generated controversy on social media. That study had uncovered an “unadjusted”—meaning that it did not control for differences in workplace and personal characteristics—$4 an hour gender wage gap among recent college graduates. Gould said she found this pay disparity “astounding”: “You’re looking at two groups of people, men and women, with virtually the same amount of experience, and yet their wages are so different.” But critics on Twitter, she said, claimed that the wage gap simply reflected the fact that women were choosing lower-paid jobs. “So we wanted to take out this one idea of occupational choice and look at that,” Gould said.
Gould and her co-author Jessica Schieder highlight two important findings in their EPI report. One is that, even within occupations, and even after controlling for observable factors such as education and work experience, the gender wage gap remains stubbornly persistent. As Gould told me, “If you take a man and a woman sitting side by side in a cubicle, doing the same exact job with the same amount of experience and the same amount of education, on average, the man is still going to be paid more than the woman.”
The EPI report cites the work of Harvard economist Claudia Goldin, who looked at the relative weight in the overall wage gap of gender-based pay differences within occupations versus those between occupations. She found that while gender pay disparities between different occupations explain 32 percent of the gap, pay differences within the same occupation account for far more—68 percent, or more than twice as much. In other words, even if we saw equal numbers of men and women in every profession, two-thirds of the gender wage gap would still remain.
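A back-of-the-envelope calculation makes the decomposition concrete. The 32/68 split is from Goldin’s work as cited above; the 21-point overall gap is illustrative (one dollar minus 79 cents).

```python
# Decomposing the gender wage gap, per the figures cited above:
# 32% of the gap comes from pay differences *between* occupations,
# 68% from unequal pay *within* the same occupation.
overall_gap = 0.21    # illustrative: 1.00 - 0.79
between_share = 0.32  # share explained by occupational sorting
within_share = 0.68   # share from unequal pay in the same job

# Even if men and women were distributed identically across occupations,
# the within-occupation portion of the gap would remain:
remaining_gap = overall_gap * within_share
print(f"Gap remaining after equalizing occupations: {remaining_gap:.3f}")  # 0.143
```

In other words, perfectly integrating every profession would still leave roughly a 14-cent shortfall on the dollar, which is why “women just choose different jobs” cannot explain the gap away.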
And yes, female-dominated professions pay less, but the reasons why are difficult to untangle. It’s a chicken-and-egg phenomenon, the EPI report explains, raising the question: Are women disproportionately nudged into low-status, low-wage occupations, or do these occupations pay low wages simply because it is women who are doing the work?
Historically, “women’s work” has always paid poorly. As scholars such as Paula England have shown, occupations that involve care work, for example, are associated with a wage penalty, even after controlling for other factors. But it’s not only care work that is systematically devalued. So, too, is work in other fields where women workers are a majority—even professions that were not initially dominated by women. The EPI study notes that when more women became park rangers, for example, overall pay in that occupation declined. Conversely, as computer programming became increasingly male-dominated, wages in that sector began to soar.
The second major point that Gould and Schieder emphasize is that a woman’s occupational choice does not occur in a vacuum. It is powerfully shaped by forces like discrimination and social norms. “By the time a woman earns her first dollar, her occupational choice is the culmination of years of education, guidance by mentors, parental expectations, hiring practices, and widespread norms and expectations about work/family balance,” Gould told Rewire. One study cited by Gould and Schieder found that in states where traditional attitudes about gender are more prevalent, girls tend to score higher in reading and lower in math, relative to boys. It’s one of many findings demonstrating that cultural attitudes wield a potent influence on women’s achievement. (Unfortunately, the EPI study does not address racism, xenophobia, or other types of bias that, like sexism, shape individuals’ work choices.)
Parental expectations also play a key role in shaping women’s occupational choices. Research reflected in the EPI study shows that parents are more likely to expect their sons to enter male-dominated science, technology, engineering, and math (often called STEM) fields, as opposed to their daughters. This expectation holds even when their daughters score just as well in math.
Another factor is the culture in male-dominated industries, which can be a huge turn-off to women, especially women of color. In one study of women working in science and technology, Latinas and Black women reported that they were often mistaken for janitors—something that none of the white women in the study had experienced. Another found that 52 percent of highly qualified women working in science and technology ended up leaving those fields, driven out by “hostile work environments and extreme job pressures.”
Among those pressures are excessively long hours, which make it difficult to balance careers with unpaid care work, for which women are disproportionately responsible. Goldin’s research, Gould said, shows that “in jobs that have more temporal flexibility instead of inflexibility and long hours, you do see a smaller gender wage gap.” Women pharmacists, for example, enjoy relatively high pay and a narrow wage gap, which Goldin has linked to flexible work schedules and a professional culture that enables work/life balance. By contrast, the gender pay gap is widest in the highest-paying fields, such as finance, which disproportionately reward those able to work brutally long hours and be on call 24/7.
Fortunately, remedies for the gender wage gap are at hand. Gould said that strong enforcement of anti-discrimination laws, greater wage transparency (which can be achieved through unions and collective bargaining), and more flexible workplace policies would all help to alleviate gender-based pay inequities. Additional solutions include raising the minimum wage, which would significantly boost the pay of the millions of women disproportionately concentrated in the low-wage sector, and enacting paid family leave, a policy that would be a boon for women struggling to combine work and family. All of these issues are looming increasingly large in our national politics.
But in order to advance these policies, it’s vital to debunk the right’s shameless, decades-long disinformation campaign about the gender gap. The fact is, in every occupation and at every level, women earn less than men doing exactly the same work. The right alleges that the official gender pay gap figure exaggerates the role of discrimination. But even statistics that adjust for occupation and other factors can, in the words of the EPI study, “radically understate the potential for gender discrimination to suppress women’s earnings.”
Contrary to conservatives’ claims, women did not choose to be paid consistently less than men for work that is every bit as valuable to society. But with the right set of policies, we can reverse the tide and bring about some measure of economic justice to the hard-working women of the United States.
As a chronically ill, chronically poor person, I have feelings about when, why, and how the phrase “self-care” is invoked. When International Self-Care Day came to my attention, I realized that while I laud the effort to prevent some of the 16 million people the World Health Organization reports die prematurely every year from noncommunicable diseases, the American notion of self-care—ironically—needs some work.
I propose a shift in the use of “self-care” that creates space for actual care apart from the extra kindnesses and important, small indulgences that may be part of our self-care rituals, depending on our ability to access such activities. How we think about what constitutes vital versus optional care affects whether/when we do those things we should for our health and well-being. Some of what we have come to designate as self-care—getting sufficient sleep, treating chronic illness, allowing ourselves needed sick days—shouldn’t be seen as optional; our culture should prioritize these things rather than praising us when we scrape by without them.
International Self-Care Day began in China, and it has spread over the past few years to include other countries and an effort seeking official recognition at the United Nations of July 24 (get it? 7/24: 24 hours a day, 7 days a week) as an important advocacy day. The online academic journal SelfCare calls its namesake “a very broad concept” that by definition varies from person to person.
“Self-care means different things to different people: to the person with a headache it might mean buying a tablet, but to the person with a chronic illness it can mean every element of self-management that takes place outside the doctor’s office,” according to SelfCare. “[I]n the broadest sense of the term, self-care is a philosophy that transcends national boundaries and the healthcare systems which they contain.”
In short, self-care was never intended to be the health version of duct tape—a way to patch ourselves up when we’re in pieces from the outrageous demands of our work-centric society. It’s supposed to be part of our preventive care plan alongside working out, eating right, getting enough sleep, and/or other activities that are important for our personalized needs.
The notion of self-care has gotten a recent visibility boost as those of us who work in human rights and/or are activists encourage each other publicly to recharge. Most of the people I know who remind themselves and those in our movements to take time off do so to combat the productivity anxiety embedded in our work. We’re underpaid and overworked, but still feel guilty taking a break or, worse, spending money on ourselves when it could go to something movement- or bill-related.
The guilt is intensified by our capitalist system having infected the self-care philosophy, much as it seems to have infected everything else. Our bootstrap, do-it-yourself culture demands we work to the point of exhaustion—some of us because it’s the only way to almost make ends meet and others because putting work/career first is expected and applauded. Our previous president called it “uniquely American” that someone at his Omaha, Nebraska, event promoting “reform” of (aka cuts to) Social Security worked three jobs.
“Uniquely American, isn’t it?” he said. “I mean, that is fantastic that you’re doing that. (Applause.) Get any sleep? (Laughter.)”
The audience was applauding working hours that are disastrous for health and well-being, laughing at sleep as though our bodies don’t require it to function properly. Bush actually nailed it: Throughout our country, we hold Who Worked the Most Hours This Week competitions and attempt to one-up the people at the coffee shop, bar, gym, or book club with what we accomplished. We have reached a point where we consider getting more than five or six hours of sleep a night to be “self-care” even though it should simply be part of regular care.
Most of us know intuitively that, in general, we don’t take good enough care of ourselves on a day-to-day basis. This isn’t something that just happened; it’s a function of our work culture. Don’t let the statistic that we work on average 34.4 hours per week fool you—that includes people working part time by choice or necessity, which distorts the reality for those of us who work full time. (Full time is defined by the Internal Revenue Service as 30 or more hours per week.) Gallup’s annual Work and Education Survey conducted in 2014 found that 39 percent of us work 50 or more hours per week, and only 8 percent of us work fewer than 40. Millennials are projected to endure a lifetime of multiple jobs or a full-time job with one or more side hustles via the “gig economy.”
Despite worker productivity skyrocketing during the past 40 years, we don’t work fewer hours or make more money once cost of living is factored in. As Gillian White outlined at the Atlantic last year, despite politicians and “job creators” blaming financial crises for wage stagnation, it’s more about priorities:
Though productivity (defined as the output of goods and services per hours worked) grew by about 74 percent between 1973 and 2013, compensation for workers grew at a much slower rate of only 9 percent during the same time period, according to data from the Economic Policy Institute.
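The scale of that divergence is easier to see when both series are indexed to the same starting point. This sketch uses only the two growth figures quoted above; the index base of 100 is an arbitrary convention.

```python
# Productivity vs. compensation growth, 1973-2013, per the EPI figures
# quoted above. Index both to 100 in 1973 and compare where they land.
productivity_growth = 0.74   # +74% over the period
compensation_growth = 0.09   # +9% over the same period

productivity_2013 = 100 * (1 + productivity_growth)   # 174.0
compensation_2013 = 100 * (1 + compensation_growth)   # 109.0

divergence = productivity_2013 - compensation_2013
print(f"Index-point divergence by 2013: {divergence:.0f}")  # 65
```

Had pay tracked productivity, the two lines would sit at the same index value; instead they end 65 points apart.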
It’s no wonder we don’t sleep. The Centers for Disease Control and Prevention (CDC) has been sounding the alarm for some time. The American Academy of Sleep Medicine and the Sleep Research Society recommend people between 18 and 60 years old get seven or more hours sleep each night “to promote optimal health and well-being.” The CDC website has an entire section under the heading “Insufficient Sleep Is a Public Health Problem,” outlining statistics and negative outcomes from our inability to find time to tend to this most basic need.
We also don’t get to the doctor when we should for preventive care. Roughly half of us, according to the CDC, never visit a primary care or family physician for an annual check-up. We go in when we are sick, but not to have screenings and discuss a basic wellness plan. And rarely do those of us who do go tell our doctors about all of our symptoms.
I recently had my first really wonderful check-up with a new primary care physician who made a point of asking about all the “little things” leading her to encourage me to consider further diagnosis for fibromyalgia. I started crying in her office, relieved that someone had finally listened and at the idea that my headaches, difficulty sleeping, recovering from illness, exhaustion, and pain might have an actual source.
Considering our deeply ingrained priority problems, it’s no wonder that when I post on social media that I’ve taken a sick day—a concept I’ve struggled with after 20 years of working multiple jobs, often more than 80 hours a week trying to make ends meet—people applaud me for “doing self-care.” Calling my sick day “self-care” tells me that the commenter sees my post-traumatic stress disorder or depression as something I could work through if I so chose, amplifying the stigma I’m pushing back on by owning that a mental illness is an appropriate reason to take off work. And it’s not the commenter’s fault; the notion that working constantly is a virtue is so pervasive, it affects all of us.
Other things besides sick days and sleep that I’ve had to learn are not self-care: going to the doctor, eating, taking my meds, going to therapy, turning off my computer after a 12-hour day, drinking enough water, writing, and traveling for work. Because it’s so important, I’m going to say it separately: Preventive health care—Pap smears, check-ups, cancer screenings, follow-ups—is not self-care. We do extras and nice things for ourselves to prevent burnout, not as bandaids to put ourselves back together when we break down. You can’t bandaid over skipping doctor’s appointments, not sleeping, and working your body until it’s a breath away from collapsing. If you’re already at that point, you need straight-up care.
Plenty of activities are self-care! My absolutely not comprehensive personal list includes: brunch with friends, adult coloring (especially the swear word books and glitter pens), soy wax with essential oils, painting my toenails, reading a book that’s not for review, a glass of wine with dinner, ice cream, spending time outside, last-minute dinner with my boyfriend, the puzzle app on my iPad, Netflix, participating in Caturday, and alone time.
My someday self-care wish list includes things like vacation, concerts, the theater, regular massages, visiting my nieces, decent wine, the occasional dinner out, and so very, very many books. A lot of what constitutes self-care is rather expensive (think weekly pedicures, spa days, and hobbies with gear and/or outfit requirements)—which leads to the privilege of getting to call any part of one’s routine self-care in the first place.
It would serve us well to consciously add an intersectional view to our enthusiasm for self-care when encouraging others to engage in activities that may be out of reach financially, may disregard disability, or may not be right for them for a variety of other reasons, including compounded oppression and violence, which affects women of color differently.
I hear the term self-care a lot, and often it is defined as practicing yoga, journaling, speaking positive affirmations, and meditating. I agree that those are successful and inspiring forms of self-care, but what we often don’t hear people talking about is self-care at the intersection of race and trauma, of social justice, and, most importantly, of the unawareness of repressed emotional issues that makes us victims of our past.
The often-quoted Audre Lorde wrote in A Burst of Light: “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare.”
But as we continue to talk about self-care, we need to be clear about the difference between self-care and actual care and work to bring the necessities of life within reach for everyone. Actual care should not have to be optional. It should be a priority in our culture so that it can be a priority in all our lives.