Deplatforming Sex - Panel Video & Recap

PayPal and Venmo routinely kick sex workers off their services with little due process. Visa and Mastercard recently stopped processing payments on Pornhub. Many banks include morality clauses that allow them to freeze and terminate sex workers' accounts at will. Among its many policies relating to nudity and sex, Facebook banned the eggplant and peach emoji when used alongside any statement about being "horny." When Instagram updated its terms of use at the end of 2020, it effectively banned OnlyFans content creators from using the app and made it harder for sex educators to share content. Many sexuality professionals and businesses, especially sex workers, fear that platforms will shadowban or delete their accounts, making it harder for them to market themselves and make a living.

Why is this happening? And what can be done about it? On June 17th, 2021, we hosted Deplatforming Sex: A Panel Discussion on the history and current impacts of this phenomenon.

Becca Motola-Barnes, President of the San Francisco Sex Positive Democrats, moderated. Our panelists were:

  • Clare Bayley, creator of the Nectarine Project and a TAPP Fellow at Harvard Kennedy School
  • Daly Barnett, technologist at the Electronic Frontier Foundation and a member of the sex work advocacy collective Hacking//Hustling
  • Ryan Olson, co-founder of Fetishmen Networks and Real Kink, and an adult industry developer who specializes in payment processing and compliance within the adult sector

You can watch the video here.

Becca kicked off the discussion by asking the panelists to define deplatforming. Daly explained that deplatforming often refers to the platform or application that a user engages with, but there is also platform censorship, which targets the infrastructure of the internet, such as banks, payment processors, and credit card companies. In addition, there is content moderation: actions a specific platform takes to enforce its terms of service or community guidelines and limit its legal exposure. She noted that there aren't discrete boundaries between these.

Clare acknowledged that it’s complicated. “There’s so many different places at which a decision or action can be taken, and there are so many different people involved in this system,” she said.


Panelists discussed how content moderation leads to sexuality professionals and businesses being deplatformed. Clare noted that there is a difference between policy and implementation. What happens on the platforms, she said, often doesn’t match their policy. For example, many platforms might ban content involving sex but will have exceptions for educational content. But sometimes platforms will delete even educational content.

Becca noted one of the ways in which this policy affects us at SFSPDC. Because we have "sex" in our name, Facebook limits the number of people we can invite to our events. She asked, “Where do we draw the line? How do we educate people when just having 'sex' in your name gets you knocked out?”

Daly talked about the law that has had the biggest impact on platforms' policies and content moderation: SESTA/FOSTA. SESTA/FOSTA carved an exception out of Section 230, stripping platforms of the immunity they previously had for material that could aid human trafficking. In response, tech companies have been sweeping in how they deploy content moderation. "Because it's so hazy," she said of SESTA/FOSTA, "no one knows what that means or how it could be –– these platforms, in order to dodge liability, even the largest ones that could afford a slew of lawsuits, even though there haven't been any yet, at least under FOSTA, they just hit anything that could be maybe swept under that umbrella."

Daly explained that the impact of this is far reaching. “…any sort of sex content, any kind of sex education, any people they have deemed might or might not be a sex worker, or involved in the sex industry on either side of the counter, and any profiles anywhere are all best just to get rid of than to deal with the legal nightmares,” she said.

Clare said that SESTA/FOSTA caused many sites to shut down, including Craigslist Personals. Ryan added that it has even led to the banning of specific words, such as "hole" on the dating site Doublelist.


Part of this problem, the panelists discussed, is the way social media platforms moderate content. Ryan explained that platforms used to rely on automated moderation; when that proved problematic, they turned to outsourced human moderators. This leads to subjective policy enforcement. "How do you fight something that is subjective?" Ryan asked.

The panelists were clear that these policies are tied in with racism, sexism, homophobia, fatphobia, and transphobia. Ryan stressed that “We have to start calling it discrimination…We have to be treated equally.”

Clare agreed, noting that it's "hard to call it that because it's not written in the policy that way, but from looking at the banned content it's clear that the further you are from a thin, straight, white, young woman, the more likely you are to have your content taken down." She explained how this works: "Some of it is bias in the content moderation, and some of it's just more likely to be reported in harassment campaigns."

When Becca asked whether there is empirical evidence that platforms discriminate, Clare said that it's difficult to track because removed content can't be studied after the fact, but there is a large body of research on bias in AI. Daly gave some examples, including Timnit Gebru's firing from Google.

Daly reiterated that "Any sort of enforcement of these things are going to reflect biases of its creators or the society in which it's built up." She added that it is a "vastly intersectional network of bigotry that is coded into the platforms that we're engaging in."


The panelists discussed the impact of morality policing within the banking industry. Companies and professionals must deal with banks, investors, and payment processors (e.g., Stripe, Square, Venmo), and all of these have rules about what businesses and professionals are allowed to sell and how they are allowed to sell it.

Ryan explained, saying, “I buy a book and it’s got the F word on it. This company can go ahead, they can sell that book a million times, put their money into a Bank of America, Chase Bank, Wells Fargo account, and they don’t think twice about it. But the moment somebody wants to create porn, or they want to do something that is not Christianlike, now there’s all these policies, and everyone’s bouncing it to the other person. If it’s not Visa saying we don’t want someone to wear diapers, it’s the payment processor saying they don’t want to be attached to this, or their bank that does the payment processing on their behalf saying they don’t want to be attached to this.”

Ryan noted that these rules differ depending on where you live and where your business is located. If your website's business is located in North America, credit card companies, payment processors, and banks might prohibit certain activities from being performed or even mentioned (e.g., watersports). In Europe, he explained, that very same activity might be allowed. Companies don't want to anger their European customers, but in the United States, pressure from religious groups pushes them to block more content.

As panelists pointed out, these rules often don’t even correspond with what is legal in the United States. For example, it is legal for sex toy stores to operate, but payment processors, banks, and credit card companies might choose not to work with these businesses or might stipulate how they can advertise. Or it is legal to tip a stripper for dances, but you can’t tip them over Venmo because Venmo won’t allow it.

“At the end of the day, why are we again still being so subjective, if these people are working legally? Why are they not being treated the same as someone like myself, who is a developer?” Ryan asked.

When talking about how these infrastructural players (banks, payment processors, credit card companies) have positioned themselves as chokepoints in the system, Daly said, "They should not be the moral arbiters of what is and is not allowed."


When talking about what can be done to fight this, the panelists agreed that using USPS as a bank or making Facebook into a utility regulated by the government wouldn’t help. They had other suggestions.

Ryan called for transparency in policy so that users can choose to not use the platform if they don’t agree with the policy. He also argued that “platforms need to start using their money and they need to start seeing the money value of not having discrimination.”

Daly agreed with Ryan, asserting that there shouldn't be any uniform law governing how the Internet should be. She, too, stressed transparency: "Whatever they do, I want it to be transparent, so that users have a choice." She wants regulation that fosters tech competition, along with transparent, consent-based policies that shape how people engage with platforms and empower the platforms people actually want. This, she believes, would lead to a more democratic tech world.

Clare emphasized the importance of sex-positivity. "The win would be to convince people that sex is important and that exposure to sexual content and sexual information is important and a good thing to protect," she said. This does not mean that children should be shown porn, she stressed. She believes that consent is important above all else and that there are ways companies can ensure consent. Some social media companies, she said, already use features such as opt-in filters and prompts confirming that the person viewing the website is 18, and she wants more companies to adopt those tactics. "Any attempt to separate sex from the rest of life just harms us all," she concluded.


Becca summed up the panel, noting some of the themes she heard: a push for more transparency, advocacy for antitrust law to create more competition in the marketplace, and perhaps, someday, an online bill of rights that includes sex and sex positivity as part of human expression.

Clare encouraged those in the tech field to continue to be personally sex positive and to speak out about these issues. You can find her at the Nectarine Project and on Instagram or Twitter.

Ryan stressed the importance of more sex-positivity and encouraged people to be loud and proud, and to advocate to themselves, friends, family, and coworkers. You can find him on Twitter and on Instagram.

Daly encouraged people to fight for their own digital privacy and security, and for their community's. She encouraged people to visit the Electronic Frontier Foundation to learn more about how they defend digital civil liberties, and Hacking//Hustling to learn more about how they advocate for sex workers in the tech space.

Kelsey Britt published this page in SFSPD Blog 2021-07-14 12:07:29 -0700