In May 2020, after the start of the ongoing uprisings against police violence, Hacking//Hustling, a collective of sex workers, survivors, and accomplices working at the intersection of tech and social justice, set out to understand the ways that platforms’ responses to Section 230 carve-outs impact content moderation, and threaten free speech and human rights for those who trade sex and/or work in movement spaces.

More censorship, surveillance, and carve-outs to Section 230 (§ 230) of the Communications Decency Act (CDA) are on the horizon. Now more than ever, it’s important to understand exactly how these carve-outs impact social media and financial technology platforms, and their content moderation practices. Moreover, it’s important to understand how those content moderation decisions impact the humans—especially marginalized communities—who rely on those platforms every day.

Hacking//Hustling’s new report, Posting Into the Void: Studying the Impact of Shadowbanning on Sex Workers and Activists, serves as an extension of Erased, Hacking//Hustling’s study on FOSTA, and adds to the small body of research that focuses on the human impact of § 230 carve-outs and platform content moderation decisions.

The key findings were:

  1. People who identified as both a sex worker and an activist, organizer, or protestor (AOP) experienced the negative impacts of platform policing both more intensely and more frequently. Sex workers are significantly more likely to report they have been shadowbanned on social media (30.77%), while AOPs are significantly less likely to report the same (12.82%). Of those who identify as both a sex worker and an AOP, an incredible 51.28% report they have been shadowbanned.
  2. Sex workers who started doing more online work due to COVID-19 experienced significantly more punitive platform policing than other respondents. 71.14% of people who have done sex work have started doing more online work due to COVID-19. We found that nearly every form of shadowbanning and deplatforming we asked about was more prevalent among sex workers who had started doing more sex work online due to in-person COVID-19 restrictions—even more so than sex workers who already did sex work online.
  3. Sex workers who shared original tweets about Black Lives Matter from an account where they also post about sex work were significantly more likely to suffer platform policing. They say they have: noticed a difference in the visibility of their content, posts, or profile since the end of May 2020 (44.30%); and lost access to a financial technology (e.g. PayPal, Venmo, or Square Cash) (51.90%).
  4. Respondents who identified as both a sex worker and an AOP demonstrated the most chilled speech. 82.5% said they have avoided posting content for fear of being kicked off, shadowbanned, or facing legal action. Only 44.19% of AOPs said the same, significantly fewer than their sex working peers (68.75%).
  5. Movement work is restricted most severely for those who are both a sex worker and an AOP. A sentiment analysis of our qualitative data shows a compounding effect where sex workers who also identified as AOPs experienced the most severe forms of platform punishment. Sex workers and sex working AOPs consistently described severe levels of paranoia and chilling effects that non-sex working AOPs did not. 
  6. Losing access to financial technologies reduces an individual’s ability to earn a living, and disrupts movement work and mutual aid efforts. 66.13% of sex workers who had been deplatformed from a financial technology reported that it impacted their ability to do sex work. 36.67% of respondents who reported being deplatformed from a financial technology reported that it impacted their ability to do movement work or community organizing.
  7. Sex workers are experiencing catfishing and content theft at alarmingly high rates. 43.75% of sex workers and 46.43% of sex working AOPs report having had their images or content used for a fake account that they did not run or that provided false information (e.g. a catfishing account stealing their photos). AOPs who do not do sex work are significantly less likely to say the same (14.29%).
  8. Sex workers are barred from accessing the marketing tools non-sex working communities use to build their small businesses. Whether you’re looking at social media platforms, website hosting, or financial technologies, the world of small business limits—if not excludes—sex workers at every stage. Sex workers continue to build thriving small businesses despite this barrage of marketing barriers. With so many sex workers funding their movement work through sex work, this has a compounding negative effect on community.

Content Moderation’s Impact on Sex Workers

Increasingly, content moderation, censorship, and shadowbanning facilitate sex worker erasure and oppress the sex working community.

Sex workers are disproportionately losing access to social media platforms, having bank accounts seized, being banned from major payment processors, and being used as test subjects for facial recognition databases. These are forms of structural violence in the digital world, impacting a population already vulnerable to state and platform policing.

Those who have done sex work are far and away the most severely impacted segment we studied. Of those who have done sex work (non-sex worker data in brackets for comparison):

  • 97.95% are familiar with the term shadowbanning (vs. 87.67%).
  • 69.57% report they have been shadowbanned (vs. 34.88%).
  • 63.31% report their content has been suppressed in the timeline (vs. 54.90%).
  • 67.14% have had a post removed from social media (vs. 46%).
  • 41.84% said they have received a warning that their social media account is close to deletion (vs. 17.65%).
  • 58.57% said they have found their username does not show up in searches (vs. 22%).
  • 72.86% said they have experienced social media platforms suppressing their friends’ content from their timelines (vs. 62.75%).
  • 41.01% said they have been deplatformed or kicked off of a social media account (vs. 21.57%).
  • 45.45% have had their images or content used for a fake account that they did not run or that provided false information (e.g. catfishing) (vs. 14.29%).
  • 33.33% said when their images were used for catfishing, they were simultaneously shadowbanned so the fake account showed up first (vs. 0%).
  • 54.76% avoid specific words to avoid being shadowbanned (vs. 23.81%).
  • 77.34% avoid posting content for fear of being kicked off, shadowbanned, or facing legal action (vs. 44.19%).
  • 67.69% have had content that does not violate a sensitive media policy marked as sensitive media on their profile (vs. 41.86%).
  • 74.19% are noticing trends in the suppression of information on social media (vs. 66.67%).

The Compounding Impact on Sex Working Activists

While trading sex was the most indicative factor we studied, we consistently found that people who identified as both a sex worker and an AOP experienced the negative impacts of platform policing both more intensely and more frequently. 

Compared to only sex workers, sex working AOPs are significantly more likely to say they:

  • use more than one social media account (93.48%); 
  • noticed a difference in the visibility of their content, posts, or profiles since the end of May 2020 (41.86%); and
  • have ever had an issue using financial technologies to community organize or share money with community (30.12%).

Compared to only sex workers and only AOPs, sex working AOPs are significantly more likely to say they have: 

  • received a warning that their social media account is close to deletion (48.91%); and
  • lost access to a financial technology (e.g. PayPal, Venmo, or Square Cash) (50%).

This research merely scratches the surface of how the identities of sex workers and AOPs intersect to create more severe content moderation for people who hold both identities.

Hacking//Hustling’s Recommendations

With the EARN IT Act, a number of anti-encryption legislative efforts, and the defunding of open-source technologies on the horizon, these findings highlight the harms to sex workers and AOPs who are trying to use online platforms to make a living, share resources, and organize.

We’re calling for social media and financial technology platforms to:

  1. Make internal content moderation practices public.
  2. Give users more choice in what they see.
  3. Hire sex workers to conduct competency trainings for staff.
  4. Open your wallet and share this report with colleagues.

We’re calling for policymakers to:

  1. Listen to sex workers when they warn about the potential impact of policies.
  2. Stop the state from determining what safety means for communities by stoking fear and spreading misinformation.
  3. Challenge the framing that sex workers and survivors are two discrete communities. 
  4. Fight against legislation that increases liability for platforms and does nothing to stop violence.
  5. Advocate for the decriminalization, decarceration, and destigmatization of sex work.

To read the full, 80+ page report with more detailed findings, please click here (PDF).


Danielle Blunt is one of the co-founders of Hacking//Hustling, a collective of sex workers and accomplices working at the intersection of tech and social justice to interrupt state surveillance and violence facilitated by technology. A professional NYC-based Femdom and Dominatrix, Blunt researches sex work and equitable access to technology from a public health perspective. She enjoys redistributing money from institutions, watching her community thrive, and making men cry.