World AIDS Day 2031: How We Could Outsmart HIV

Brian Ackerman

In 2031, HIV will still be a reality. But if the Obama administration leads the world in promoting smart and evidence-based prevention education, it will be a disease everyone on the planet knows how to prevent.

Imagine: The date is Dec. 1, 2031. HIV/AIDS has ravaged the global population for 50 years. It is World AIDS Day again and I sit down in my futuristic flying chair to draft a blog (an ancient form of personal commentary that only middle-aged people have heard of). The theme of this year’s World AIDS Day is "Victory." Here is my blog from the future:

So we made it. 50 years. Millions of lives lost. Millions more chronically ill. But we made it. The world has fought a 50-year battle with AIDS and finally, with infection rates continuing their 22-year-long plummet around the world, nearly universal access to antiretroviral therapy, and a sustainable global healthcare infrastructure that integrates prevention, treatment, and care services, we can claim victory. Many are saying "I can’t believe it." But I can. And not only can I believe it, I would find it unacceptable to have it any other way.

At this triumphant juncture in human history I think we must reflect on the path we chose to take together to achieve this feat. While the AIDS epidemic began in 1981, before I was even born, the global response to the disease took a dramatic turn 22 years ago, in 2009. Amidst a crisis in the world economy from which we have only recently completely recovered, and amidst a massive power shift in Washington, D.C., the United States charted a new course in the global response with its five-year, $48 billion initiative to end the pandemic.

The Obama Administration and 111th Congress were charged with calculating the sum of these parts: limited financial resources + an explosive global epidemic + donor fatigue for AIDS-specific funding + the largest generation of young people in human history (3 billion people) accounting for 45 percent of new infections annually + over two decades of data collection and a wealth of evidence-based, best practice models for prevention, treatment, and care.


While many said the sum was, in fact, a negative one, the U.S. government decided that the knowledge we had about the AIDS epidemic, about how to prevent it, about how to treat it, and about how to care for those affected by it, was enough to outweigh the challenges of the other parts. And it could do such things because the U.S. government remembered that it was good at defying the odds when it employed science, as was the case with that trip to the moon back in 1969.

So what exactly did the U.S. government do in 2009 that changed the course of human history and led us on a path to victory over HIV/AIDS? The list of specific actions could fill a book, but fundamentally, what is now taught in history books as the eight-year "War on Science" came to an end, and science won! The Office of the Global AIDS Coordinator declared that, with regard to prevention, it would fund only comprehensive, medically accurate, and evidence-based programs, and emphasized that one of the only ways to outsmart HIV was to not be afraid to address socially complex issues of sexuality and reproductive health in a holistic and healthy manner. Congress amended PEPFAR reauthorizing legislation to eliminate the balanced funding requirement for abstinence and be faithful programs and to fund family planning programs within U.S. global AIDS funding.

The largest generation of young people (half the world’s population in 2009) was targeted in these programs and, like magic (but better, because it was science), a new generation of sexually educated, informed, and responsible decision-makers came of sexual and reproductive age. Their new world became the one in which we live today, where HIV is no less a reality, but it is a reality that almost everyone on the planet knows how to prevent. Today, using prevention commodities such as male and female condoms during sex is as common as putting on sunscreen before spending a day at the beach or wearing our seatbelts when we fly our futuristic cars (this comparison is particularly apt considering the invention of multicolored seatbelts in 2015 that made them so much more fun to wear!). Even further, young people know how to negotiate safe sex so they are empowered to protect themselves (even if their partner is not as excited about multicolored commodities).

Thanks to this emphasis on evidence-based prevention programming, we were finally able to catch up with the number of people newly living with HIV and ensure their access to life-extending antiretroviral therapy and other holistic interventions. The emphasis on quality prevention programming did not detract from treatment access but, in effect, amplified our ability to ensure that a greater percentage of people living with HIV and AIDS could live long, healthy lives.

Of course, the United States did not do this alone. It joined a global majority of donor countries and technical institutions that had been saying the same thing for some time. But it seems that the theme of World AIDS Day 2008, "Leadership," must have really been effective. The United States dramatically enhanced its own response to the pandemic in 2009 and led the way for a truly sustainable and effective global response, so much so that I can sit here today and claim victory over HIV along with the rest of the world. Looking back, the moon was pretty cool more than 60 years ago, but I don’t think the world has ever claimed a win as big as this one.

…And we time warp back to the present.

The gross oversimplification of my blog from the future is intended neither to "pit prevention against treatment" nor to place unreasonable pressure on our incoming policymakers. Rather, it outlines a vision of what could be instead of what is, and highlights that we have a choice as a country and as a world, instead of a dilemma. It is no secret that unprotected sex fuels AIDS. Let’s choose to use our limited resources as wisely as possible in the coming years, and fund only comprehensive prevention programs, to construct a world in which access to HIV prevention information and commodities is universal, so that universal access to treatment can also become a reality.

I expect nothing less from the incoming Administration and, like others, believe it’s time.

If you are interested in sharing your thoughts about World AIDS Day, join me for Advocates for Youth’s World AIDS Day Blog-a-thon, December 1-7, 2008, at www.amplifyyourvoice.org/WorldAidsDay.


Let’s Stop Conflating Self-Care and Actual Care

Katie Klabusich

It's time for a shift in the use of “self-care” that creates space for actual care apart from the extra kindnesses and important, small indulgences that may be part of our self-care rituals, depending on our ability to access such activities.

As a chronically ill, chronically poor person, I have feelings about when, why, and how the phrase “self-care” is invoked. When International Self-Care Day came to my attention, I realized that while I laud the effort to prevent some of the 16 million people the World Health Organization reports die prematurely every year from noncommunicable diseases, the American notion of self-care—ironically—needs some work.

I propose a shift in the use of “self-care” that creates space for actual care apart from the extra kindnesses and important, small indulgences that may be part of our self-care rituals, depending on our ability to access such activities. How we think about what constitutes vital versus optional care affects whether/when we do those things we should for our health and well-being. Some of what we have come to designate as self-care—getting sufficient sleep, treating chronic illness, allowing ourselves needed sick days—shouldn’t be seen as optional; our culture should prioritize these things rather than praising us when we scrape by without them.

International Self-Care Day began in China, and it has spread over the past few years to include other countries and an effort seeking official recognition at the United Nations of July 24 (get it? 7/24: 24 hours a day, 7 days a week) as an important advocacy day. The online academic journal SelfCare calls its namesake “a very broad concept” that by definition varies from person to person.

“Self-care means different things to different people: to the person with a headache it might mean buying a tablet, but to the person with a chronic illness it can mean every element of self-management that takes place outside the doctor’s office,” according to SelfCare. “[I]n the broadest sense of the term, self-care is a philosophy that transcends national boundaries and the healthcare systems which they contain.”


In short, self-care was never intended to be the health version of duct tape—a way to patch ourselves up when we’re in pieces from the outrageous demands of our work-centric society. It’s supposed to be part of our preventive care plan alongside working out, eating right, getting enough sleep, and/or other activities that are important for our personalized needs.

The notion of self-care has gotten a recent visibility boost as those of us who work in human rights and/or are activists encourage each other publicly to recharge. Most of the people I know who remind themselves and those in our movements to take time off do so to combat the productivity anxiety embedded in our work. We’re underpaid and overworked, but still feel guilty taking a break or, worse, spending money on ourselves when it could go to something movement- or bill-related.

The guilt is intensified by our capitalist system having infected the self-care philosophy, much as it seems to have infected everything else. Our bootstrap, do-it-yourself culture demands we work to the point of exhaustion—some of us because it’s the only way to almost make ends meet and others because putting work/career first is expected and applauded. Our previous president called it “uniquely American” that someone at his Omaha, Nebraska, event promoting “reform” of (aka cuts to) Social Security worked three jobs.

“Uniquely American, isn’t it?” he said. “I mean, that is fantastic that you’re doing that. (Applause.) Get any sleep? (Laughter.)”

The audience was applauding working hours that are disastrous for health and well-being, laughing at sleep as though our bodies don’t require it to function properly. Bush actually nailed it: Throughout our country, we hold Who Worked the Most Hours This Week competitions and attempt to one-up the people at the coffee shop, bar, gym, or book club with what we accomplished. We have reached a point where we consider getting more than five or six hours of sleep a night to be “self-care” even though it should simply be part of regular care.

Most of us know intuitively that, in general, we don’t take good enough care of ourselves on a day-to-day basis. This isn’t something that just happened; it’s a function of our work culture. Don’t let the statistic that we work on average 34.4 hours per week fool you—that includes people working part time by choice or necessity, which distorts the reality for those of us who work full time. (Full time is defined by the Internal Revenue Service as 30 or more hours per week.) Gallup’s annual Work and Education Survey conducted in 2014 found that 39 percent of us work 50 or more hours per week. Only 8 percent of us on average work less than 40 hours per week. Millennials are projected to enjoy a lifetime of multiple jobs or a full-time job with one or more side hustles via the “gig economy.”

Despite worker productivity skyrocketing during the past 40 years, we don’t work fewer hours or make more money once cost of living is factored in. As Gillian White outlined at the Atlantic last year, despite politicians and “job creators” blaming financial crises for wage stagnation, it’s more about priorities:

Though productivity (defined as the output of goods and services per hours worked) grew by about 74 percent between 1973 and 2013, compensation for workers grew at a much slower rate of only 9 percent during the same time period, according to data from the Economic Policy Institute.

It’s no wonder we don’t sleep. The Centers for Disease Control and Prevention (CDC) has been sounding the alarm for some time. The American Academy of Sleep Medicine and the Sleep Research Society recommend people between 18 and 60 years old get seven or more hours sleep each night “to promote optimal health and well-being.” The CDC website has an entire section under the heading “Insufficient Sleep Is a Public Health Problem,” outlining statistics and negative outcomes from our inability to find time to tend to this most basic need.

We also don’t get to the doctor when we should for preventive care. Roughly half of us, according to the CDC, never visit a primary care or family physician for an annual check-up. We go in when we are sick, but not to have screenings and discuss a basic wellness plan. And rarely do those of us who do go tell our doctors about all of our symptoms.

I recently had my first really wonderful check-up with a new primary care physician who made a point of asking about all the “little things” leading her to encourage me to consider further diagnosis for fibromyalgia. I started crying in her office, relieved that someone had finally listened and at the idea that my headaches, difficulty sleeping, recovering from illness, exhaustion, and pain might have an actual source.

Considering our deeply ingrained priority problems, it’s no wonder that when I post on social media that I’ve taken a sick day—a concept I’ve struggled with after 20 years of working multiple jobs, often more than 80 hours a week trying to make ends meet—people applaud me for “doing self-care.” Calling my sick day “self-care” tells me that the commenter sees my post-traumatic stress disorder or depression as something I could work through if I so chose, amplifying the stigma I’m pushing back on by owning that a mental illness is an appropriate reason to take off work. And it’s not the commenter’s fault; the notion that working constantly is a virtue is so pervasive, it affects all of us.

Things in addition to sick days and sleep that I’ve had to learn are not self-care: going to the doctor, eating, taking my meds, going to therapy, turning off my computer after a 12-hour day, drinking enough water, writing, and traveling for work. Because it’s so important, I’m going to say it separately: Preventive health care—Pap smears, check-ups, cancer screenings, follow-ups—is not self-care. We do extras and nice things for ourselves to prevent burnout, not as bandaids to put ourselves back together when we break down. You can’t bandaid over skipping doctor’s appointments, not sleeping, and working your body until it’s a breath away from collapsing. If you’re already at that point, you need straight-up care.

Plenty of activities are self-care! My absolutely not comprehensive personal list includes: brunch with friends, adult coloring (especially the swear word books and glitter pens), soy wax with essential oils, painting my toenails, reading a book that’s not for review, a glass of wine with dinner, ice cream, spending time outside, last-minute dinner with my boyfriend, the puzzle app on my iPad, Netflix, participating in Caturday, and alone time.

My someday self-care wish list includes things like vacation, concerts, the theater, regular massages, visiting my nieces, decent wine, the occasional dinner out, and so very, very many books. A lot of what constitutes self-care is rather expensive (think weekly pedicures, spa days, and hobbies with gear and/or outfit requirements)—which leads to the privilege of getting to call any part of one’s routine self-care in the first place.

It would serve us well to consciously add an intersectional view to our enthusiasm for self-care when encouraging others to engage in activities that may be out of reach financially, may disregard disability, or may not be right for them for a variety of other reasons, including compounded oppression and violence, which affects women of color differently.

Over the past year I’ve noticed a spike in articles on how much of the emotional labor burden women carry—at the Toast, the Atlantic, Slate, the Guardian, and the Huffington Post. This category of labor disproportionately affects women of color. As Minaa B described at the Huffington Post last month:

I hear the term self-care a lot and often it is defined as practicing yoga, journaling, speaking positive affirmations and meditation. I agree that those are successful and inspiring forms of self-care, but what we often don’t hear people talking about is self-care at the intersection of race and trauma, social justice and most importantly, the unawareness of repressed emotional issues that make us victims of our past.

The often-quoted Audre Lorde wrote in A Burst of Light: “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare.”

While her words ring true for me, they are certainly more weighted and applicable for those who don’t share my white and cisgender privilege. As covered at Ravishly, the Feminist Wire, Blavity, the Root, and the Crunk Feminist Collective recently, self-care for Black women will always have different expressions and roots than for white women.

But as we continue to talk about self-care, we need to be clear about the difference between self-care and actual care and work to bring the necessities of life within reach for everyone. Actual care should not have to be optional. It should be a priority in our culture so that it can be a priority in all our lives.


Debbie Wasserman Schultz Resigns as Chair of DNC, Will Not Gavel in Convention

Ally Boguhn

Donna Brazile, vice chair of the DNC, will step in as interim replacement for Wasserman Schultz as committee chair.

On the eve of the Democratic National Convention in Philadelphia, Rep. Debbie Wasserman Schultz (D-FL) resigned her position as chair of the Democratic National Committee (DNC), effective after the convention, amid controversy over leaked internal party emails and months of criticism over her handling of the Democratic primary races.

Wasserman Schultz told the Sun Sentinel on Monday that she would not gavel in this week’s convention, according to Politico.

“I know that electing Hillary Clinton as our next president is critical for America’s future,” Wasserman Schultz said in a Sunday statement announcing her decision. “Going forward, the best way for me to accomplish those goals is to step down as Party Chair at the end of this convention.”

“We have planned a great and unified Convention this week and I hope and expect that the DNC team that has worked so hard to get us to this point will have the strong support of all Democrats in making sure this is the best convention we have ever had,” Wasserman Schultz continued.

Just prior to news that Wasserman Schultz would step down, it was announced that Rep. Marcia Fudge (D-OH) would chair the DNC convention.

Donna Brazile, vice chair of the DNC, will step in as interim replacement for Wasserman Schultz as committee chair.

Wasserman Schultz’s resignation comes after WikiLeaks released more than 19,000 internal emails from the DNC, breathing new life into arguments that the Democratic Party—and Wasserman Schultz in particular—had “rigged” the primary in favor of nominating Hillary Clinton. As Vox’s Timothy B. Lee pointed out, there seem to be “no bombshells” in the released emails, though one email does show that Brad Marshall, chief financial officer of the DNC, emailed asking whether an unnamed person could be questioned about “his” religious beliefs. Many believe the email was referencing Sen. Bernie Sanders (I-VT).

Another email from Wasserman Schultz revealed the DNC chair had referred to Sanders’ campaign manager, Jeff Weaver, as a “damn liar.”

As previously reported by Rewire before the emails’ release, “Wasserman Schultz has been at the center of a string of heated criticisms directed at her handling of the DNC as well as allegations that she initially limited the number of the party’s primary debates, steadfastly refusing to add more until she came under pressure.” She also sparked controversy in January after suggesting that young women aren’t supporting Clinton because there is “a complacency among the generation” who were born after Roe v. Wade was decided.

“Debbie Wasserman Schultz has made the right decision for the future of the Democratic Party,” said Sanders in a Sunday statement. “While she deserves thanks for her years of service, the party now needs new leadership that will open the doors of the party and welcome in working people and young people. The party leadership must also always remain impartial in the presidential nominating process, something which did not occur in the 2016 race.”

Sanders had previously demanded Wasserman Schultz’s resignation in light of the leaked emails during an appearance earlier that day on ABC’s This Week.

Clinton nevertheless stood by Wasserman Schultz in a Sunday statement responding to news of the resignation. “I am grateful to Debbie for getting the Democratic Party to this year’s historic convention in Philadelphia, and I know that this week’s events will be a success thanks to her hard work and leadership,” said Clinton. “There’s simply no one better at taking the fight to the Republicans than Debbie—which is why I am glad that she has agreed to serve as honorary chair of my campaign’s 50-state program to gain ground and elect Democrats in every part of the country, and will continue to serve as a surrogate for my campaign nationally, in Florida, and in other key states.”

Clinton added that she still looks “forward to campaigning with Debbie in Florida and helping her in her re-election bid.” Wasserman Schultz faces a primary challenger, Tim Canova, for her congressional seat in Florida’s 23rd district for the first time this year.