
Spencer Cox urges states to set AI safety rules, pushes energy protections
Cox presses the case for states' role in AI governance
At a Washington governors' forum, Spencer Cox argued states should keep the ability to act where public safety, child welfare and local deployments are at stake, urging governors to adopt rules limiting harmful AI uses in schools and communities. He framed state action as complementary to federal efforts, not purely oppositional, and said some choices—like classroom-facing limits and deployment controls—are best decided locally because implementation and enforcement are place-specific.
On infrastructure, Cox distinguished support for adding compute capacity from granting special subsidies to major providers, and highlighted recent state energy changes intended to shield ratepayers from electricity price increases tied to high-density data centers and AI compute loads. He recommended policies that steer where compute clusters are sited and how costs are allocated so residential customers do not absorb the marginal price impacts of industrial-scale workloads.
Cox’s comments arrived as the White House sharpened its national posture: senior administration deliberations produced a narrower executive approach that preserves several state carve-outs, notably for minors and for data-center and procurement rules, while seeking to reduce conflicting requirements across jurisdictions. That compromise, reached in the vice president’s office, has not ended the debate; it shifts the fight toward congressional bargaining, agency rulemaking and likely litigation over preemption boundaries.
Industry players continue to press for uniform national rules and infrastructure investments, even as some governors warn that broad federal preemption would undercut local innovation and protections. Venture and corporate donors have amplified their political activity, building on roughly $125 million in industry super-PAC fundraising reported in 2025, to influence how federal statutes and certification regimes are written and to push for shared compute, portability and auditability standards that lower compliance burdens for large, cross-state deployments.
The practical consequences for companies and public-sector CIOs are immediate: expect a mix of state-level guardrails focused on content, child protections and deployment controls alongside federal standards that carve out enumerated state lanes. Firms will need technical mitigations—geo-fencing, differentiated product tiers, verifiable logging and portability features—to navigate overlapping rules and reduce stacked enforcement risk.
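For illustration only, here is a minimal sketch of two of those mitigations: gating a product tier by the user's jurisdiction (geo-fencing in miniature) and recording each decision in a tamper-evident, hash-chained log. The policy table, tier names and function names are hypothetical, not drawn from any vendor's actual stack or any statute.

```python
# Hypothetical sketch: jurisdiction-based feature gating plus a verifiable
# (hash-chained) decision log. Names and tiers are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical policy table mapping jurisdictions to product tiers.
JURISDICTION_TIERS = {
    "UT": "minor_safe",   # e.g., state child-protection rules apply
    "CA": "standard",
    "DEFAULT": "standard",
}

def resolve_tier(jurisdiction: str) -> str:
    """Pick the product tier for a jurisdiction (geo-fencing in miniature)."""
    return JURISDICTION_TIERS.get(jurisdiction, JURISDICTION_TIERS["DEFAULT"])

def log_decision(prev_hash: str, record: dict) -> tuple[str, dict]:
    """Append a hash-chained log entry so gating decisions are tamper-evident."""
    entry = {**record, "ts": datetime.now(timezone.utc).isoformat(), "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    return digest, entry

if __name__ == "__main__":
    tier = resolve_tier("UT")
    head, entry = log_decision("genesis", {"jurisdiction": "UT", "tier": tier})
    print(tier, head[:12], entry["ts"])
```

In practice the policy table would be maintained per statute and per state rulemaking, and the log would feed whatever auditability or certification regime emerges from the federal process described above; the sketch only shows the shape of the plumbing.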
Legal experts see a high probability of preemption litigation if Congress or the White House attempts broad exclusions of state authority; courts will be asked to define which regulatory lanes belong to states and which belong to Washington. Economically, energy and permitting rules at the state level may influence where data centers and compute clusters locate, shifting investment patterns and local economic impacts.
For governors, the situation reframes AI governance as a multilevel coordination challenge: state experimentation can produce rapid, targeted protections—particularly for minors—but it also raises compliance complexity for national products. Cox urged pragmatic coordination among states and with federal partners to reduce needless friction while preserving the ability to act on local harms.

