October 20, 2023
Google Publishes “Legislative Framework” for Protecting Kids Online
Earlier this week, Kent Walker, the President of Global Affairs for Google and parent company Alphabet, published a post sharing a document the company called a “Legislative Framework to Protect Children and Teens Online.” Writing that the company agrees with “public health and mental health experts that technology companies have a responsibility to design and build better online experiences,” Walker said the company is “encouraged by the increased global interest in ensuring that online services address risks to children and teens.”

As a frequent target of claims that its websites are harmful to children, Alphabet unsurprisingly wants to weigh in proactively on how platforms like YouTube and Google ought to be regulated. It should also come as no surprise that when it comes to the more intrusive forms of age verification being contemplated by legislatures across the country (and around the world), Alphabet doesn’t believe those methods should be applied to its products.

“(A)s policymakers contemplate these issues, they should carefully consider the broader impacts of these bills and avoid side effects like blocking access to critical services, requiring people (including adults) to submit unnecessary identification or sensitive personal information, or treating an older teen the same as a younger child,” Walker wrote, noting that “child safety and privacy groups have similarly stressed the importance of getting this right.”

In the Framework document itself, Alphabet got more specific.
“A good understanding of user age can help online services offer age-appropriate experiences,” Alphabet acknowledged, adding the cautionary note that “any method to determine the age of users across services comes with tradeoffs, such as intruding on privacy interests, requiring more data collection and use, or restricting adult users’ access to important information and services.”

“Where required, age assurance – which can range from declaration to inference and verification – should be risk-based, preserving users’ access to information and services, and respecting their privacy,” the company added. “Where legislation mandates age assurance, it should do so through a workable, interoperable standard that preserves the potential for anonymous or pseudonymous experiences. It should avoid requiring collection or processing of additional personal information, treating all users like children, or impinging on the ability of adults to access information.”

Respecting privacy, preserving the potential for anonymous or pseudonymous experiences, and avoiding impinging on the ability of adults to access information – so far, this sounds like a wish list for what the adult industry and representative organizations like the Free Speech Coalition would like to see in a legislative framework, too. With the very next line, however, Alphabet made clear its concern about impinging on the ability of adults to access information doesn’t extend to adult entertainment.

“More data-intrusive methods (such as verification with ‘hard identifiers’ like government IDs) should be limited to high-risk services (e.g., alcohol, gambling, or pornography) or age correction,” the company wrote.

A seasoned Google user might ask: Does this mean Alphabet thinks users should be required to present ID to use Google’s image search to find sexually explicit images and videos? Or does the search giant want a free pass on this front?
Unfortunately, Alphabet doesn’t address that question in its Framework. (Funny, that.)

To be fair, the Framework does offer a lot of sensible advice, some of which would be useful in an adult entertainment context as well. “Protections aimed at addressing risks of exposure to harmful content may not be as relevant for products that do not allow users to upload and share content,” Alphabet noted. “In order to create a safer online environment, well-crafted legislation should be tailored to the potential underlying risk to children and teens posed by a particular service. Educational technology services, for example, should be considered separately given their unique context, roles, and responsibilities.”

You can read Alphabet’s Legislative Framework to Protect Children and Teens Online here.