The latest trend in AI regulation is to bypass lawmakers. Both the White House and California are using procurement to regulate the AI market, but their divergent approaches could add costs and create conflicts for businesses.
The U.S. General Services Administration, which manages government purchasing, released AI procurement rules March 6 that take a “no dogmas” approach: systems must remain neutral and avoid ideological positions, including those related to diversity, equity and inclusion.
In contrast, California, under a March 30 executive order from Gov. Gavin Newsom, requires vendors to demonstrate safeguards against harmful bias and protections for civil rights.
In Lieu of Regulations
Meanwhile, the AI industry is spending millions of dollars to lobby state and federal lawmakers to limit regulations. OpenAI, earlier this month, released a policy paper calling for a democratic process to shape AI. But procurement isn't democratic. Vendors are being told that to win government contracts, they must meet certain rules.
“Money drives behavior,” said Ojas Rege, senior vice president and general manager of privacy and data governance at OneTrust, which develops an AI governance platform.
The proposed GSA rules on AI were called “landmark” in an analysis by law firm Holland & Knight, which said they “are among the most prescriptive seen in federal contracting.”
The nine-page GSA proposal requires, among other things, the use of AI systems developed and produced in the U.S. It prevents vendors from using government data to train or improve AI models for other customers. California has not yet issued specific data governance rules.
The GSA proposal doesn't define numeric thresholds for bias, making it difficult to measure, said Patrick Sullivan, vice president of strategy and innovation at A-Lign, a cybersecurity assurance firm.
The proposal requires bias monitoring and government-run benchmarking but doesn't specify which metrics to use. In contrast, the Equal Employment Opportunity Commission uses the four-fifths, or 80%, rule. If a selection rate for any race, sex or ethnic group is less than 80% of the rate for the group with the highest selection rate, it may indicate disparate impact.
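The four-fifths rule is simple arithmetic to automate, which illustrates the kind of concrete threshold the GSA proposal lacks. A minimal sketch, using hypothetical selection data rather than figures from any actual case:

```python
def four_fifths_check(selection_rates):
    """Flag groups whose selection rate falls below 80% of the
    highest group's rate -- the EEOC's four-fifths heuristic."""
    top = max(selection_rates.values())
    return {group: rate / top < 0.8 for group, rate in selection_rates.items()}

# Hypothetical hiring data: selected / applicants, per group.
rates = {"group_a": 60 / 100, "group_b": 45 / 100}
flags = four_fifths_check(rates)
# group_b's ratio is 0.45 / 0.60 = 0.75, below the 0.8 threshold,
# so it would be flagged as potential disparate impact.
```

A flagged result is an indicator that warrants review, not proof of discrimination; the EEOC guidelines treat it as a rule of thumb.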
“We really don't have anything to measure other than someone's subjective judgment,” Sullivan said.
Federal Executive Order
In December, President Donald Trump issued an executive order directing federal agencies to challenge state AI laws. “There must be only one rulebook if we are going to continue to lead in AI,” Trump wrote on Truth Social before releasing the order. But the order isn't deterring states, which continue to introduce AI-related bills.
Lawmakers in Nebraska and Oklahoma, for instance, are considering limits on electronic shelf labels because of concerns they could be used to charge different customers different prices for the same item. Many states are considering bills aimed at protecting minors from chatbots and at mandating human review of high-risk systems, such as those used in healthcare. AI’s infrastructure needs are also drawing pushback from state lawmakers worried about the strain on the grid and the environment. Maine recently approved a temporary ban on new large data centers.
Even if Trump succeeds in limiting new AI state laws, procurement rules present a different problem for the administration.
“The federal government cannot tell a state how they're going to spend their money,” said Reiko Feaver, a partner at CM Law in Atlanta.
The federal government's power to preempt state rules also has limits. Non-discrimination rules “can't be preempted by the federal government, even if they were to try,” Feaver said.
Rege, of OneTrust, said the approaches organizations must take internally to manage AI risk do not change because of these executive orders, but the financial stakes are higher because procurement is involved. The executive orders “could have a real monetary impact on that organization's ability to sell,” he said.
The White House's approach to AI procurement has been evolving. Last July, Trump issued a procurement-focused executive order directing agencies to ensure AI systems used by the government are free from ideological bias.
The White House followed that up with a memo to agencies calling for these goals to be spelled out in procurement contracts and for modifications to existing contracts. The GSA proposal last month set rules around these goals.
State Executive Order
California Gov. Gavin Newsom responded to the federal procurement proposal with an executive order accusing the federal government of dismantling contracting standards in moves that “remove basic protections for Americans.”
California’s procurement rules may be limited. Newsom’s executive order gives California’s Department of General Services and Department of Technology four months to come up with contracting guidelines. But a future governor can rescind executive orders, and Newsom, limited by law to two terms, leaves office in January 2027.
The use of procurement to drive market change is not new. For example, “Medicare effectively is the regulator of healthcare in the United States,” noted James Hennessy, a partner at law firm Reed Smith.
“That's the kind of power the federal government could wield in making these decisions,” Hennessy said. But California complicates that, he added.
California, with its powerful economy, wields economic heft “to actually counter the federal government's weight in setting procurement-based standards,” Hennessy said.
Avani Desai, CEO of Schellman, an IT compliance and cybersecurity auditing firm, said vendors can't build separate AI systems for each regulator.
“I don't think you solve this by giving users the on/off switch for ethics. That actually is going to create more risk,” said Desai. What can change is how the results are framed to show that the system is explainable and fair, she said.
AI systems must be well governed, with built-in safeguards for bias testing, explainability and monitoring, Desai said. There is also a need for strong governance layers, as well as a flexible evidence and reporting system, so that the system can be explained differently to different buyers.
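One way to read Desai's point about a flexible evidence and reporting layer: keep a single canonical audit record for a system and render it through buyer-specific templates, rather than building separate systems per regulator. A minimal sketch, with entirely hypothetical field names and values:

```python
# Hypothetical canonical audit record for one AI system.
EVIDENCE = {
    "model": "resume-screener-v2",   # hypothetical system name
    "selection_rate_ratio": 0.84,    # lowest group's rate vs. highest group's
    "explainability_method": "SHAP",
    "last_reviewed": "2026-03-01",
}

# Buyer-specific framings of the same underlying evidence.
TEMPLATES = {
    # A federal buyer focused on neutrality and explainability.
    "federal": "Model {model}: outputs reviewed for ideological neutrality; "
               "explainability via {explainability_method}.",
    # A California buyer focused on bias safeguards.
    "california": "Model {model}: bias safeguard ratio {selection_rate_ratio} "
                  "(four-fifths threshold: 0.80); last reviewed {last_reviewed}.",
}

def render(buyer):
    # str.format ignores unused keys, so one record serves every template.
    return TEMPLATES[buyer].format(**EVIDENCE)
```

The safeguards themselves stay fixed in the underlying system; only the reporting changes, which matches Desai's warning against giving users an on/off switch for ethics.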
Calvin Cooper, co‑founder of NeuroMetric AI, a vendor of an AI optimization platform, said that in the absence of federal regulations, AI oversight is falling to the states.
But Cooper argued that states act as incubators of rules, something he sees as a “feature, not a bug, of our system.” Procurement is one way to test different approaches, he said.