
AI voice like Scarlett Johansson’s is ‘high-profile example of the growing need for transparency’

Artificial intelligence (© Zobacz więcej – stock.adobe.com)

U.S. Rep. Don Beyer of Virginia today urged House consideration of his legislation, the AI Foundation Model Transparency Act, following a statement by actress Scarlett Johansson.

The actress is raising questions about whether OpenAI used her voice without authorization in the development of a new artificial intelligence (AI) product.

Beyer serves on the House leadership-appointed bipartisan Task Force on Artificial Intelligence, is vice chair of the Artificial Intelligence Caucus, and is a leading legislator on AI in the U.S. House.

The AI Foundation Model Transparency Act, introduced by Beyer and AI Caucus Chair Anna Eshoo last year, would prompt the establishment of transparency standards for information that high-impact foundation models must provide to the FTC and to the public, including how those AI models are trained and information about the source of data used. The bill has been endorsed by organizations representing and supporting creators and creative industries including SAG-AFTRA, the Authors Guild, and Universal Music.

“Anyone who believes their voice is used without their permission would ask the same questions Scarlett Johansson is asking now. The AI Foundation Model Transparency Act would ensure that those questions are answered,” Beyer said.

He added that Johansson’s is not the first such case and will not be the last, but it is a high-profile example of the growing need for transparency in AI.

“Congress can help solve this problem by requiring creators of AI foundation models to share key information with regulators and the public, which is exactly what my bill would do,” Beyer said. He said he will continue to encourage his colleagues to pass AI transparency legislation.

Johansson’s statement alleged that the voice of a recently unveiled OpenAI personal assistant named “Sky” was so similar to her own that “her closest friends… could not tell the difference.” She went on to urge “the passage of appropriate legislation” to address the issue.

OpenAI claimed in response that “Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice.”

The AI Foundation Model Transparency Act, if enacted, would ensure that information about the data sources in such cases is made public to establish the truth.

The AI Foundation Model Transparency Act would:

  • Direct the FTC, in consultation with NIST, the Copyright Office, and OSTP, to set transparency standards for foundation model deployers, by asking them to make certain information publicly available to consumers;
  • Direct companies to provide consumers and the FTC with information on the model’s training data, model training mechanisms, and whether user data is collected in inference; and
  • Protect small deployers and researchers, while seeking responsible transparency practices from our highest-impact foundation models.

The bill would also help copyright owners protect their work, addressing widespread concerns from businesses and individuals about AI, by giving users more information to help them determine whether their copyrighted material was included in an AI foundation model’s training data.

Rebecca Barnabi

Rebecca J. Barnabi is the national editor of Augusta Free Press. A graduate of the University of Mary Washington, she began her journalism career at The Fredericksburg Free-Lance Star. In 2013, she was awarded first place for feature writing in the Maryland, Delaware, District of Columbia Awards Program, and was honored by the Virginia School Boards Association’s 2019 Media Honor Roll Program for her coverage of Waynesboro Schools. Her background in newspapers includes writing about features, local government, education and the arts.