California wants to prevent an AI “catastrophe”

Courtesy of Midjourney

The Golden State may be close to passing AI safety regulation — and Silicon Valley isn’t pleased.

The proposed AI safety bill, SB 1047, also known as the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, aims to establish “common sense safety standards” for powerful AI models.

The bill would require companies developing high-powered AI models to implement safety measures, conduct rigorous testing, and provide assurances against "critical harms," such as the use of models to carry out mass-casualty events or cyberattacks causing at least $500 million in damages. The California attorney general could take civil action against violators, though the rules would apply only to models that cost at least $100 million to train and exceed a certain computing threshold.

A group of prominent academics, including AI pioneers Geoffrey Hinton and Yoshua Bengio, published a letter last week to California's political leaders supporting the bill. "There are fewer regulations on AI systems that could pose catastrophic risks than on sandwich shops or hairdressers," they wrote, arguing that regulation is necessary not only to rein in the potential harms of AI but also to restore public confidence in the emerging technology.

Critics, including many in Silicon Valley, argue the bill is overly vague and could stifle innovation. In June, the influential startup incubator Y Combinator wrote a public letter outlining its concerns. It said that liability should lie with those who abuse AI tools, not with their developers; that the threshold for inclusion under the law is arbitrary; and that a requirement that developers include a "kill switch" allowing them to turn off a model would amount to a "de facto ban on open-source AI development."

Steven Tiell, a nonresident senior fellow with the Atlantic Council's GeoTech Center, thinks the bill is “a good start” but points to “some pitfalls.” He appreciates that it only applies to the largest models but has concerns about the bill’s approach to “full shutdown” capabilities – aka the kill switch.

“The way SB 1047 talks about the ability for a ‘full shutdown’ of a model – and derivative models – seems to assume foundation models would have some ability to control derivative models,” Tiell says. He warned this could “materially impact the commercial viability of foundation models across wide swaths of the industry.”

Hayley Tsukayama, associate director of legislative activism at the Electronic Frontier Foundation, acknowledges the tech industry’s concerns. “AI is changing rapidly, so it’s hard to know whether — even with the flexibility in the bill — the regulation it’s proposing will age well with the industry,” she says.

“The whole idea of open-source is that you’re making a tool for people to use as they see fit,” she says, emphasizing the burden on open-source developers. “And it’s both harder to make that assurance and also less likely that you’ll be able to deal with penalties in the bill because open-source projects are often less funded and less able to spend money on compliance.”

State Sen. Scott Wiener, the bill’s sponsor, told Bloomberg he’s heard industry criticisms and made adjustments to its language to clarify that open-source developers aren’t entirely liable for all the ways their models are adapted, but he stood by the bill’s intentions. “I’m a strong supporter of AI. I’m a strong supporter of open source. I’m not looking in any way to impede that innovation,” Wiener said. “But I think it’s important, as these developments happen, for people to be mindful of safety.” Spokespeople for Wiener did not respond to GZERO’s request for comment.

In the past few months, Utah and Colorado have passed their own AI laws, but both focus on consumer protection rather than liability for catastrophic results of the technology. California, home to many of the biggest companies in AI, has broader ambitions. But while California has led the nation on data privacy, acting ahead of the federal government, it may need industry support to get its AI bill through the legislature and signed into law. California's Senate passed the bill last month, and the Assembly is set to vote on it before the end of August.

California Gov. Gavin Newsom hasn't signaled whether he'll sign the bill should it pass both houses of the legislature, but in May he publicly warned against over-regulating AI and ceding America's advantage to rival nations: "If we over-regulate, if we overindulge, if we chase the shiny object, we could put ourselves in a perilous position."
