California presses ahead with state AI regulation framework in U.S.

Source: Xinhua

Editor: huaxia

2026-04-12 14:17:45

by Wen Tsui

SACRAMENTO, California, April 11 (Xinhua) -- The U.S. state of California has been pressing ahead with its own artificial intelligence (AI) regulation framework amid unfavorable federal policies.

In January, the state enacted two AI safety laws. In March, the administration of President Donald Trump unveiled a national policy framework to limit what states can do in regulating AI.

Following the White House move, California Governor Gavin Newsom, a Democrat, has extended the state's approach by directing agencies to apply state standards to government contracts as well.

According to the Governor's Office, Newsom's executive order on March 30 instructed the state's chief information security officer to independently review new federal supply-chain-risk designations and, where deemed improper, allow state agencies to continue procuring from the affected company.

California has been building its own AI regulation framework as the federal government moves in the opposite direction to seek uniform and "minimally burdensome" national standards.

The Trump administration's efforts include an AI action plan released in July 2025 to remove what it describes as "red tape and onerous regulation" on AI development, an executive order in December 2025 establishing a legal task force to challenge state AI laws in court, and the legislative framework published in March calling on U.S. Congress to preempt state AI rules, which the administration says "impose undue burdens."

"While others in Washington are designing policy and creating contracts in the shadow of misuse, we're focused on doing this the right way," Newsom said in an official statement accompanying his March 30 order, which was titled "As Trump rolls back protections, Governor Newsom signs first-of-its-kind executive order to strengthen AI protections and responsible use."

At a press conference on Tuesday on the sidelines of the April 6-9 HumanX AI conference in San Francisco, Nand Mulchandani, a former U.S. government official and visiting fellow at Stanford University's Hoover Institution, said that the dispute over AI governance reflects a structural feature of the U.S. system.

"Federal regulation is the lowest common denominator policy, but it's uniform, and it's predictable," he said.

"The negative is that individual states don't get to have their own unique culture and points of view represented. But that's how the country is structured," he said.

Mulchandani declined to comment on whether California or the federal government has the better approach, calling it "a personal choice question."

California's two AI laws, both of which took effect on Jan. 1, 2026, require large AI model developers to publish safety frameworks, report critical incidents to state authorities, and disclose the data sources used to train their systems.

Multinational law firm Morrison Foerster noted in a February analysis that the California laws closely resemble transparency requirements under the European Union's AI Act, allowing companies to satisfy both regimes through a single compliance program.

The Government AI Coalition, a network created by the City of San Jose in 2023, received city council approval this week to become an independent nonprofit, media outlet StateScoop reported. The California-based coalition represents more than 900 public agencies and 3,000 government members and produces procurement templates and policy guidance for government AI adoption.

Speaking at the HumanX conference, which focuses on the real-world impact of AI on business, leadership and society, Nighat Dad, founder of the Digital Rights Foundation, noted that the fragmentation of AI governance worldwide is leaving users in developing countries without recourse.

"When it comes to harms and weaponization of these tools, they are the ones who bear the brunt of it and still have no say around this," she said.