Reason
Elizabeth Nolan Brown

The TAKE IT DOWN Act's Good Intentions Don't Make Up for Its Bad Policy

Who could possibly oppose legislation to get tough on AI-generated revenge porn? For one, Kentucky Republican Rep. Thomas Massie, one of two nays in Monday's House vote on the TAKE IT DOWN Act. For another, a whole bunch of civil liberties advocates, including folks with groups like the American Civil Liberties Union, the Electronic Frontier Foundation, and The Future of Free Speech.

That's because no matter how worthy the intentions behind the TAKE IT DOWN Act may be, the way it's written poses major threats to protected expression and online privacy. It could also give politicians another tool with which to pressure technology companies into doing their bidding.

None of the measure's critics are defending "revenge porn," or what the bill calls "nonconsensual intimate visual depictions." Rather, they worry that the measure would be "ripe for abuse, with unintended consequences," as Massie put it.

Alas, the TAKE IT DOWN Act (S.146), sponsored by Sen. Ted Cruz (R–Texas), has now passed the Senate and the House. Next stop: President Donald Trump, who has been supportive of the bill.

What the TAKE IT DOWN Act Says

The measure would make it a federal crime to publish "any intimate visual depiction of an identifiable individual" online if the image was generated by a computer or artificial intelligence and was "indistinguishable from an authentic visual depiction of that individual," unless the depicted individual consented to its publication or "voluntarily exposed" such an image in a "public or commercial setting" themselves.

So, no Photoshopping a celebrity's head onto someone else's racy image and posting it to some online forum. No asking Grok to imagine your ex in a compromising situation with J.D. Vance or a pizza delivery man or a Smurf, and then messaging that image to friends. And so on.

The measure would also ban publishing "an intimate visual depiction of an identifiable individual" online unless the depicted individual "voluntarily exposed" the image "in a public or commercial setting" or otherwise had no expectation of privacy. In this case, the crime is sharing real images of someone who didn't want them shared.

Notably, the bill contains an exception for real or AI-generated images shared by law enforcement agencies or other government actors doing it as part of "investigative, protective, or intelligence activity." (Wouldn't want to jeopardize any of those catfishing sex stings, would we?)

For everyone else, violating the terms of the TAKE IT DOWN Act could mean up to two years in prison if the depicted individual was an adult and up to three years in prison if the depicted individual was a minor.

Threatening Free Speech and Encryption

Already, there's some danger here of roping in people who share parodies and other protected speech.

But perhaps a bigger problem is the way the new measure would be enforced against tech platforms.

The bill would require online platforms to establish a notice-and-removal regime similar to those used for copyright infringement (a notoriously easy-to-abuse system). Platforms would be required to remove reported images within 48 hours of receiving a request and "make reasonable efforts to remove any known identical copies of such depiction." The quick turnaround required, and the liability imposed if a platform fails to comply, would incentivize companies to simply take down any reported images, even those that aren't breaking the law. That makes the system ripe for exploitation by people who want legal images removed.

"Services will rely on automated filters, which are infamously blunt tools," warned Electronic Frontier Foundation Activism Director Jason Kelley. "They frequently flag legal content, from fair-use commentary to news reporting."

The law would also incentivize greater monitoring of speech, "including speech that is presently encrypted," noted Kelley. "The law thus presents a huge threat to security and privacy online."

And the agency tasked with ensuring tech-company compliance would be the Federal Trade Commission (FTC), a body of political appointees that can be incredibly influenced by the whims of whoever is in power. That makes the measure ripe for use against politically disfavored tech companies and easily wielded as a jawboning tool to get tech platforms to do an administration's bidding.

That also makes it easily susceptible to corrupt uses, such as removing images embarrassing to politicians. ("I'm going to use that bill for myself, too, if you don't mind," Trump told Congress in March. "Because nobody gets treated worse than I do online.")

TAKE IT DOWN's Many Critics

The bill has bipartisan support in Congress, as bills aimed at giving the government more control over online spaces are wont to (see: FOSTA). But it has been roundly criticized by groups concerned with free speech and other civil liberties.

"The TAKE IT DOWN Act responds to real harms, but in the hands of a government increasingly willing to regulate speech, its broad provisions provide a powerful new tool for censoring lawful online expression, monitoring private communications, and undermining due process," said Ashkhen Kazaryan, senior legal fellow at The Future of Free Speech.

The TAKE IT DOWN Act "creates unacceptable risks to users' fundamental privacy rights and cybersecurity by undermining encryption," a coalition of civil liberties and cybersecurity groups and experts wrote in a letter earlier this month. "Although the Act appropriately excludes some online services — including '[providers] of broadband internet access service' and '[electronic] mail' — from the definition of 'covered platform,' the Act does not exclude private messaging services, private electronic storage services, or other services that use encryption to secure users' data," states the letter, signed by the American Civil Liberties Union, the Internet Society, and New America's Open Technology Institute, among many others.

The notice-and-takedown scheme "would result in the removal of not just nonconsensual intimate imagery but also speech that is neither illegal nor actually [nonconsensual distribution of intimate imagery]," a group of civil liberties organizations—including the Center for Democracy & Technology, Fight for the Future, the Freedom of the Press Foundation, TechFreedom, and the Woodhull Freedom Foundation—wrote to senators in February. "This mechanism is likely unconstitutional and will undoubtedly have a censorious impact on users' free expression. While the criminal provisions of the bill include appropriate exceptions for consensual commercial pornography and matters of public concern, those exceptions are not included in the bill's takedown system."

"The bill is so bad that even the Cyber Civil Rights Initiative, whose entire existence is based on representing the interests of victims of [non-consensual intimate imagery] and passing bills similar to the Take It Down Act, has come out with a statement saying that, while it supports laws to address such imagery, it cannot support this bill due to its many, many inherent problems," notes Mike Masnick at Techdirt. "The bill's vague standards combined with harsh criminal penalties create a perfect storm for censorship and abuse."

"While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy," said Kelley. "Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse."


Follow-Up: Cambridge Sex Workers Weren't Locked Inside

A few weeks ago, this newsletter covered a case against a Cambridge, Massachusetts, sex business. Though the case wasn't as steeped in human trafficking fantasies as many sex work busts are, a Homeland Security agent did claim that a manager locking the door from the outside "utilized this tactic so that the commercial sex providers felt that they had to stay in the unit to perform sex acts for cash on behalf of the prostitution network." That claim was subsequently used by some media outlets to suggest that workers were coerced, and the bit about locking the door from the outside would later be repeated by federal prosecutors.

But "an employee of the Atmark, the building where the door-locking took place, said Thursday that all its apartment doors can be unlocked from the inside, and that renters are not allowed to replace locks—currently high-tech devices controlled by smartphone—with their own fixtures," reports Cambridge Day.

Cambridge Day "has confirmed what we suspected—the apartment in the Cambridge brothel case could be unlocked from the inside, debunking the government affidavit's claim that the women were locked inside," the Boston Sex Workers and Allies Collective (BSWAC) posted on BlueSky.

More Sex & Tech News

• Statistics professor Aaron Brown dismantles an influential study linking legalized prostitution to increases in human trafficking. "The study, published in 2013 in the journal World Development, has been used to stop legalization initiatives around the world and to justify harsh new laws that turn customers of voluntary sex work into criminals, often in the name of stopping human trafficking," Brown points out. "Unfortunately, the authors of the study used a flawed economic model and abysmal data to reach their conclusion. When crucial information was missing, they guessed and filled it in. Then, when the analysis didn't yield what seemed to be the authors' desired finding, they threw out the data. There is no evidence that legalizing prostitution increases human trafficking."

• Asian massage parlor panic will not die. Again and again, media outlets are willing to lap up groups warning that immigrant massage workers are sex slaves, despite the fact that virtually every "human trafficking" bust at a massage parlor winds up with the workers themselves getting charged with prostitution or unlicensed massage and little else.

• Trump is repeating Joe Biden's AI mistakes.

• The Wall Street Journal provoked Meta's AI chatbots into some sexy talk and then freaked out about it. "The use-case of this product in the way described is so manufactured that it's not just fringe, it's hypothetical," a Meta spokesman told the Journal in response. "Nevertheless, we've now taken additional measures to help ensure other individuals who want to spend hours manipulating our products into extreme use cases will have an even more difficult time of it."

Today's Image

Phoenix | 2018 (ENB/Reason)

The post The TAKE IT DOWN Act's Good Intentions Don't Make Up for Its Bad Policy appeared first on Reason.com.
