Reason
Politics
Elizabeth Nolan Brown

PROTECT Act Could Require Removal of All Existing Porn Online

Is Congress really trying to outlaw all sex work? That's what some people fear the Preventing Rampant Online Technological Exploitation and Criminal Trafficking (PROTECT) Act would mean.

The bill defines "coerced consent" to include consent obtained by leveraging "economic circumstances"—which sure sounds like a good starting point for declaring all sex work "coercive" and all consent to it invalid. (Under that definition, in fact, most jobs could be considered nonconsensual.)

Looking at the bill as a whole, I don't think this is its intent, nor is it likely to be enforced that way. It's mainly about targeting tech platforms and people who post porn online that they don't have a right to post.

But should the PROTECT Act become law, its definition of consent could be used in other measures that do seek to target sex work broadly. And even without banning sex work, it could still wreak major havoc on sex workers, tech companies, and free speech and internet freedom more widely.

There are myriad ways it would do this. Let's start by looking at how it could make all existing online porn against the law.

How the PROTECT Act Would Make All Existing Online Porn Illegal 

The PROTECT Act doesn't directly declare all existing web porn illegal. Its sponsor—Sen. Mike Lee (R–Utah)—at least seems to know that the First Amendment wouldn't allow that. Nonetheless, under the PROTECT Act, platforms that failed to take down existing porn (defined broadly to include "any intimate visual depiction" or any "visual depiction of actual or feigned sexually explicit activity") would open themselves up to major fines and lawsuits.

In order to stay on the right side of PROTECT Act requirements, tech companies would have to collect statements of consent from anyone depicted in intimate or sexually explicit content. These statements would have to be submitted on yet-to-be-developed forms created or approved by the U.S. Attorney General.

And the law would "apply to any pornographic image uploaded to a covered platform before, on, or after that effective date" (emphasis mine).

Since forms that don't yet exist obviously haven't accompanied any existing image, every existing pornographic image (or image that could potentially be classified as "intimate") would be a liability for tech companies.

How the PROTECT Act Would Chill Legal Speech

Let's back up for a moment and look at what the PROTECT Act purports to do and how it would go about this. According to Lee's office, it is aimed at addressing "online sexual exploitation" and "responds to a disturbing trend wherein survivors of sexual abuse are repeatedly victimized through the widespread distribution of non-consensual images of themselves on social media platform[s]."

Taking or sharing intimate images of someone without their consent is wrong, of course. Presumably most people would like to stop this and think there should be consequences for those who knowingly and maliciously do so.

But Lee's plan reaches much further than this, targeting companies that serve as conduits for any sort of intimate imagery. The PROTECT Act would subject them to so much bureaucracy and liability that they may reasonably decide to ban any imagery with racy undertones or too much flesh showing.

This would seriously chill sexual expression online—not just for sex workers, but for anyone who wants to share a slightly risque image of themselves, for those whose art or activism includes any erotic imagery, and so on. Whether or not the government intends to go after such material, the mere fact that it could will incentivize online platforms to crack down on anything that a person or algorithm might construe at a glance as a violation: everything from a photo of a mother breastfeeding to a painting that includes nudity.

And it's not just at the content moderation end that this would chill speech. The PROTECT Act could also make users hesitant to upload erotic content, since they would have to attach their real identities to it and submit a bunch of paperwork to do so.

How the PROTECT Act Would Invade Privacy 

Under the PROTECT Act, all sorts of sex workers—people who appear in professional porn videos produced by others, people who create and post their own content, pinup models, strippers and escorts who post sexy images online to advertise offline services, etc.—would have to turn over proof of their real identities to any platform where they posted content. Sex workers and amateur porn producers would have their real identities tied to any online account where they post.

This would leave them vulnerable to hackers, snoops, stalkers, and anyone in the government who wanted to know who they were.

And it doesn't stop at sex workers (these things never do) or amateur porn producers. The PROTECT Act's broad definition of porn could encompass boudoir photos, partial nudity in an artwork or performance, perhaps even someone wearing a revealing bathing suit in a vacation pic.

To show just how ridiculous this could get, consider that the bill defines pornography to include any images where a person is identifiable and "the naked genitals, anus, pubic area, or post-pubescent female nipple of the individual depicted are visible."

If your friend's nipple is visible through her t-shirt in a group shot, you may have to get a consent form from her before posting it and to show your driver's license and hers when you do. Or just be prepared to be banned from posting that picture entirely, if the platform decides it's too risky to allow any nipples at all.

Here's What the PROTECT Act Says 

Think I'm exaggerating? Let's look directly at the PROTECT Act's text.

First, it prohibits any "interactive computer service" from allowing intimate images or "sexually explicit" depictions to be posted without verifying the age and identity of the person posting them.

Second, it requires platforms to verify the age and identity of anyone pictured, using government-issued identification documents.

Third, it requires platforms to ascertain that any person depicted has "provided explicit written evidence of consent for each sex act in which the individual engaged during the creation of the pornographic image; and…explicit written consent for the distribution" of the image. To verify consent, companies would have to collect "a consent form created or approved by the Attorney General" that includes the real name, date of birth, and signature of anyone depicted, as well as statements specifying "the geographic area and medium…for which the individual provides consent to distribution," the duration of that consent to distribute, a list of the specific sex acts that the person agreed to engage in, and "a statement that explains coerced consent and that the individual has the right to withdraw the individual's consent at any time."
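
To make the record-keeping burden concrete, here's a minimal sketch of the per-image paperwork a platform might have to collect and store under these requirements. All names and types here are my own invention; the bill only lists what the Attorney-General-approved form must contain, and no such form yet exists.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical model of the consent paperwork the bill describes.
# Field names are illustrative, not taken from any real form.
@dataclass
class ConsentForm:
    legal_name: str                       # real name of the person depicted
    date_of_birth: date                   # verified against government-issued ID
    signature: str                        # signed form (e-signature, scan, etc.)
    distribution_geography: str           # geographic area consented to
    distribution_medium: str              # medium (site, app, etc.) consented to
    consent_duration_days: Optional[int]  # None = consent granted in perpetuity
    consented_sex_acts: list[str] = field(default_factory=list)
    coercion_notice_acknowledged: bool = False  # statement explaining coerced
                                                # consent and the right to withdraw

@dataclass
class UploadRecord:
    image_id: str
    uploader_identity_verified: bool      # age + ID of the person posting
    depicted: list[ConsentForm] = field(default_factory=list)  # one per person shown
```

Note that this is one record per image or video, per person depicted, for every upload to every platform.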

Platforms would also have to create a process for people to request removal of pornographic images, prominently display this process, and remove images within 72 hours of an eligible party requesting they be taken down.

The penalties for failure to follow these requirements would be quite harsh for people posting or hosting content.

Someone who uploaded an intimate depiction of someone "with knowledge of or reckless disregard for (1) the lack of consent of the individual to the publication; and (2) the reasonable expectation of the individual that the depiction would not be published" could be guilty of a federal crime punishable by fines and up to five years in prison. They could also be sued by "any person aggrieved by the violation" and face damages including $10,000 per image per day.

Platforms that failed to verify the ages and identities of people posting pornographic images could face civil penalties of up to $10,000 per day per image, levied by the attorney general. Failure to verify the identities, ages, and consent status of anyone in a pornographic image could open companies up to civil lawsuits and huge payouts for damages. Tech companies could also face fines and lawsuits for failing to create a process for removal, to prominently display this process, or to designate an employee to field requests. And of course, failure to remove requested images would open a company up to civil lawsuits, as would failure to block re-uploads of an offending image or any "altered or edited" version of it.
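
To get a sense of scale, here's some back-of-the-envelope arithmetic using the bill's $10,000-per-image-per-day civil penalty. The backlog size and duration are invented for illustration:

```python
# Hypothetical liability exposure under the bill's civil penalties.
# The image count and day count are made up for illustration only.
penalty_per_image_per_day = 10_000   # dollars, per the bill

unverified_images = 1_000            # a tiny backlog for a large platform
days_out_of_compliance = 30

exposure = penalty_per_image_per_day * unverified_images * days_out_of_compliance
print(f"${exposure:,}")              # $300,000,000
```

Even a sliver of a major platform's archive, left unverified for a month, yields nine-figure exposure.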

Amazingly, the bill states that "nothing in this section shall be construed to affect section 230 of the Communications Act." Section 230 protects digital platforms and other third parties online from some liability for the speech of people who use their tools or services, and yet this whole bill is based on punishing platforms for things that users post. It just tries to hide that by putting insane regulatory requirements on these platforms and then saying the problem isn't that they allowed user speech, it's that they failed to secure the proper paperwork for that speech.

An Insanely Unworkable Standard

Under the PROTECT Act, companies would have to start moderating to meet the sensibilities of a Puritan or else subject themselves to an array of time-consuming, technologically challenging, and often impossible feats of bureaucratic compliance.

The bill mandates bunches of paperwork for tech platforms to collect, store, and manage. It doesn't just require a one-time age verification or a one-time collection of general consent forms—no, it requires these for every separate sexual image or video posted.

Then it requires viewing the content in its entirety to make sure it matches the specific consent areas listed. (Is a blow job listed on that form? What about bondage?)

Then it requires keeping track of variable consent revocation dates—a person could consent to have the video posted in perpetuity, for five years, or for some completely random number of days—and removing content on this schedule.
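
As a sketch of what that scheduling burden might look like in practice, a platform would effectively need a per-image expiry job along these lines. The storage model and all names here are assumptions for illustration, not anything the bill specifies:

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical ledger: image ID -> (upload date, days of consent).
# None means consent was granted in perpetuity. A real system would keep
# this in a database alongside the consent forms themselves.
consent_ledger: dict[str, tuple[date, Optional[int]]] = {
    "img-001": (date(2023, 6, 1), 365 * 5),  # consented for five years
    "img-002": (date(2023, 6, 1), None),     # consented in perpetuity
    "img-003": (date(2023, 6, 1), 17),       # some arbitrary number of days
}

def images_due_for_removal(today: date) -> list[str]:
    """Return IDs of images whose distribution consent has lapsed."""
    return [
        image_id
        for image_id, (uploaded, duration) in consent_ledger.items()
        if duration is not None and today > uploaded + timedelta(days=duration)
    ]

print(images_due_for_removal(date(2023, 7, 1)))  # ['img-003']
```

And because the bill also gives everyone depicted the right to withdraw consent at any time, the expiry date for any image could change without warning; the schedule is a moving target, not a fixed calendar.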

This is, of course, all after the company ascertains that a depiction is pornographic. That first step alone would be a monumental task for platforms with large amounts of user-uploaded content, requiring them to screen all images before they go up or patrol constantly for posted images that might need to be taken down.

And when companies received takedown requests, they would have just 72 hours to determine if the person making it really was someone with a valid case as opposed to, say, someone with a personal vendetta against the person depicted or some anti-porn zealot trying to cleanse the internet. It would be understandable if companies in this situation chose to err on the side of taking down any flagged content.

The PROTECT Act would also mean a lot of paperwork for people posting content. Sure, professional porn companies already document a lot of this stuff. But now we're talking anyone who appears nude on OnlyFans having to submit this paperwork with every single piece of content uploaded.

And in all cases, we're left with this broad and vague definition of consent as a guiding principle. The bill states that consent "does not include coerced consent" and defines "coerced consent" to include not just any consent obtained through "fraud, duress, misrepresentation, undue influence, or nondisclosure" or consent from someone who "lacks capacity" (i.e., a minor) but also consent obtained "through exploiting or leveraging the person's immigration status; pregnancy; disability; addiction; juvenile status; or economic circumstances."

With such broad parameters of coercion, all you may have to say is "I only did this because I was poor" or "I only did this because I was addicted to drugs" and your consent could be ruled invalid—entitling you to collect tens of thousands of dollars from anyone who distributed the content or a tech platform that didn't remove it quickly enough. Even if the tech company or porn distributor or individual uploader ultimately prevailed in such lawsuits, that would only come after suffering the time and expense of fending the suits off.

For someone like Lee—who has proposed multiple measures to crack down on online sexual content—the unworkability of all of this might look like a feature, not a bug. It would be reasonable for a tech company looking at these risks to conclude that allowing any sort of sexy imagery is not worth it and/or that taking down any image upon any request is a good idea.

A measure like the PROTECT Act might help stop the spread of nonconsensual porn on mainstream, U.S.-based platforms (though such images could still spread freely through private communication channels and underground platforms). But it would do this at the cost of a ton of protected speech and consensual creations.

Today's Image

Performance art or pornography? (Bushwick/2013) (ENB/Reason)

