Quotation compilations are filled with jabs at lawmakers, as deep thinkers complain about the cravenness, venality and opportunism of politicians. Journalist H.L. Mencken complained that a “good politician is quite as unthinkable as an honest burglar.” Napoleon Bonaparte supposedly quipped that “in politics stupidity is not a handicap.” I’ve known good, honest and smart politicians, but my main beef is their overall lack of humility.
Not so much personal humility, but a sense of limits on what government can accomplish. California is notoriously absurd on this front, as our top politicians routinely make grandiose pronouncements. Their latest ban will change the trajectory of Earth’s climate patterns! They will stand up to greed and Other Evil Forces! Every one of them aspires to sound like John F. Kennedy.
Sure, governments can occasionally accomplish something worthwhile, but the ones that make the most elaborate promises seem least able to deliver basic services. My local public utility promises only to keep the lights on and succeeds at the task virtually every day. By contrast, the state vows to end poverty, but can’t manage to distribute unemployment benefits without sending billions to fraudsters.
It’s with that backdrop that I present the latest hubris: Senate Bill 1047, which sits on the governor’s desk. It’s the Legislature’s “first-in-the-nation,” “groundbreaking,” “landmark” effort to take control of Artificial Intelligence before, as in the movie “Terminator,” AI gains self-awareness. I’ll always remember gubernatorial candidate Arnold Schwarzenegger’s visit to The Orange County Register while on break from filming the 2003 sequel, but Hollywood usually is no model for Sacramento.
Per a Senate analysis, the bill “requires developers of powerful artificial intelligence models and those providing the computing power to train such models to put appropriate safeguards and policies into place to prevent critical harms” and “establishes a state entity to oversee the development of these models.”
Once, when testifying about a bill in another state that imposed taxes and regulations on vaping devices, I watched as lawmakers passed around sample devices and examined them with apparent bewilderment. They had little understanding of how these relatively simple devices operated. How many California lawmakers truly understand the nature of AI models, which are among the most complex (and rapidly developing) technologies in existence?
Do you suppose lawmakers will protect us from unforeseen “critical harms” from an almost unknowably complex technology in ways we have yet to fathom? If you believe that, then you might have too much faith in government, and too little insight into the clunky, retrograde way in which it almost always operates. It sometimes is efficient at twisting new regulatory tools to abuse our rights, but rarely in service of our protection.
Some tech groups (including my employer, the R Street Institute) sent a letter to Gavin Newsom urging a veto. “SB 1047 is designed to limit the potential for ‘critical harm’ which includes ‘the creation or use of a chemical, biological, radiological or nuclear weapon in a manner that results in mass casualties,'” it argued. “These harms are theoretical. There are no real-world examples of third parties misusing foundation models to cause mass casualty events.”
Yet California lawmakers believe they have the savvy to stop some fictional catastrophe they’ve seen in a dystopian movie by imposing regulations that will, say, require a “kill switch” (like an easy button!). They will create another bureaucracy, where regulators will presumably understand this technology at the level of its designers. If they were that skilled, they’d be start-up billionaires living in Mountain View, rather than state workers living in Orangevale.
While the benefits of such far-reaching and vague regulation are hard to imagine, the downsides are pretty clear, especially in a state so dependent on its tech industry. California has for several years been losing tech companies and jobs from the San Francisco Bay Area, but AI is a growing hot spot. Is it wise to chase this industry away? It’s not as if AI designers can’t easily build their businesses in other communities (Austin or Seattle) with a large tech-worker cohort.
To their credit, lawmakers amended the bill to remove some troubling provisions that could subject AI firms to attorney-general lawsuits and even potential criminal charges, but it still will leave the industry dazed, confused and subject to incalculable penalties. This is a start-up-heavy industry, yet these provisions will place particular burdens on start-ups that lack the compliance and legal resources to navigate the state-imposed thicket.
“The entire framework is built on the premise that these advanced models will pose a threat, which is an assumption that is highly contested,” wrote the American Enterprise Institute’s Will Rinehart. “And to top it off, if these AI models are truly dangerous … then California shouldn’t be regulating them anyway; it should be the purview of the federal government.” That analysis makes sense, but who believes Newsom has the humility to heed it?
This column was first published in The Orange County Register.