October 07, 2022
The PROTECT Act: Overbroad and Vague
In a press release issued last week, Sen. Mike Lee (R-UT) announced that he has introduced the Preventing Rampant Online Technological Exploitation and Criminal Trafficking ("PROTECT") Act of 2022, saying that adult websites "need to do more to prevent the exploitation that is occurring on their platforms and allow individuals to remove images shared without their consent."

Ostensibly, the bill is designed to prevent the distribution of child sexual abuse material ("CSAM") on adult websites, particularly those that accept uploads of user-generated content ("UGC"), but by its own terms, the PROTECT Act's scope appears to be much broader than that.

One of several pieces of legislation that have sported the "PROTECT" acronym over the years, Lee's new proposal begins with a long set of findings that reference reports by the New York Times, the National Center for Missing and Exploited Children (NCMEC) and other sources. These findings emphasize data points that appear to be tied to the criminal prosecution of individuals associated with GirlsDoPorn.com (the names of this company and related individuals are not included), as well as claims found in civil lawsuits targeting Pornhub and other sites that feature UGC. (The bill omits reference to some of NCMEC's less convenient reporting, including the fact that a much higher percentage of CSAM online is found on mainstream platforms like Facebook than on adult websites of any kind.)

Several of the bill's key definitions are quite expansive, including a definition of "covered platform" that encompasses "an interactive computer service that hosts or makes available to the general public pornographic images." A subsequent definition provides that "pornographic image" means "any visual depiction of actual or feigned sexually explicit conduct; or any intimate visual depiction."

The definition of "intimate visual depiction" is broader still, covering "any visual depiction... of an individual who is reasonably identifiable from the visual depiction itself or information displayed in connection with the visual depiction, including through... facial recognition; an identifying marking on the individual, including a birthmark or piercing; an identifying feature of the background of the visual depiction; voice matching; or written confirmation from an individual who is responsible, in whole or in part, for the creation or development of the visual depiction; and in which... the individual depicted is engaging in sexually explicit conduct; or the naked genitals, anus, pubic area, or post-pubescent female nipple of the individual depicted are visible."

As for what the bill would require operators of "covered platforms" to do, it would impose exacting age- and identity-verification responsibilities of the sort that have been rejected by U.S. courts when scrutinizing previous legislation. Under the bill, a "covered platform operator may not upload or allow a user to upload a pornographic image to the covered platform unless the operator has verified... the identity of the user... and that the user is not less than 18 years old." (Internal citations omitted.)
To comply with this requirement, platform operators "shall verify the identity and age of a user by... requiring use of an adult access code or adult personal identification number... accepting a digital certificate that verifies age... or using any other reasonable measure of age verification that the Attorney General has determined to be feasible with available technology."

The bill would also require platform operators to "obtain verified consent forms from individuals uploading content and those appearing in uploaded content" and "mandate that websites quickly remove images upon receiving notice they were uploaded without consent," to put it in the language of Lee's press release.

Attorney Larry Walters notes that "many of the obligations the bill seeks to impose are already followed by adult platforms in accordance with the Updated MasterCard Guidelines effective in October, 2021."

"Mainstream platforms that allow some adult content will be more severely impacted than adult platforms since they have generally not been required to adopt these same standards since they do not rely on credit card processing to sell subscriptions," Walters told YNOT.

From a constitutional perspective, Walters said the bill "suffers from many of the same infirmities that resulted in Section 2257 to be found largely unconstitutional in the Free Speech Coalition litigation."

"Treating 'pornography' different from other forms of protected speech is a content-based distinction and will require that the government demonstrate a compelling interest which has been addressed through the least restrictive means," Walters explained. "This test has resulted in invalidation of most content-based restrictions on speech and can only be met in unique circumstances. The law also appears to be overbroad and vague, given the imprecise definitions of the content and platforms subject to the restrictions and obligations."

The bill's vagueness creates other potential constitutional complications as well. "A separate question arises whether some or all of these new requirements apply to websites that do not allow user uploads, or even to hosts that are removed from the operation of a platform," Walters said. "The imprecise language used in the Bill could be constitutionally problematic from a First and Fifth Amendment perspective."

Lee's bill has not yet been heard by any Senate committee or subcommittee and is bound to undergo changes in the process of being brought to the floor for a vote, if indeed it advances that far. YNOT will continue to track the bill and provide updates on any substantive developments in the weeks and months ahead.