A16z Super PAC Targets NY Lawmaker Over AI Safety Bill


According to TechCrunch, a super PAC called Leading the Future, with over $100 million in backing from Andreessen Horowitz and OpenAI President Greg Brockman, has chosen New York Assembly member Alex Bores as its first political target. The PAC formed in August specifically to oppose AI regulation and is now targeting Bores over his sponsorship of New York's bipartisan RAISE Act. That legislation would require large AI labs to implement safety plans, disclose safety incidents, and face civil penalties of up to $30 million for violations. Bores, who's running for Congress in New York's 12th District, says his constituents are increasingly concerned about AI risks, from data centers to mental health impacts. The super PAC's leaders told Politico they're launching a multimillion-dollar effort to sink Bores's campaign.


Silicon Valley vs State Regulation

Here's the thing that makes this fight particularly interesting: it's a classic power struggle between state-level innovation and federal preemption. Bores makes a compelling point about states acting as "policy laboratories" when Washington moves slowly. But Silicon Valley hates this approach – the industry wants one national framework it can influence centrally rather than fighting 50 different battles.

The RAISE Act itself is actually pretty moderate when you look at what got negotiated out. Bores consulted with OpenAI and Anthropic during drafting and removed provisions, like third-party safety audits, that the industry refused to accept. So we're talking about a bill that already represents significant compromise, yet it still draws this level of opposition. That tells you something about how allergic some tech leaders are to any regulation at all.

The Preemption Play

What's really concerning is the broader strategy here. We've already seen attempts to slip federal preemption clauses into budget bills – language that would block states from passing any AI regulations whatsoever. Senator Ted Cruz and others are pushing to resurrect this approach. But here's the problem: Congress hasn't passed any meaningful AI regulation of its own, even as it tries to prevent states from acting. That creates a regulatory vacuum where nobody is addressing real public concerns.

Bores nailed it when he said, "The question should be, has Congress solved the problem? If Congress solves the problem, then it can tell the states to get out of the way." Until then, states have every right to protect their citizens from potential harms. And let's be honest – when technology is deployed in ways that could impact public safety or critical infrastructure, having some basic safeguards isn't anti-innovation. It's responsible governance.

Trust as Competitive Advantage

Bores makes another crucial point that the industry seems to be missing: "The AI that wins is going to be the AI that is trustworthy." He's absolutely right. Look at any technology that's become mainstream – from automobiles to pharmaceuticals – public trust built through reasonable regulation actually enables adoption rather than hindering it.

The super PAC's argument that this kind of legislation threatens American competitiveness feels increasingly hollow. China isn't winning the AI race because the U.S. has safety standards – it's competing through massive state investment and a different value system. Meanwhile, companies that build trustworthy systems often become market leaders in their sectors precisely because customers know they can rely on them.

So what we're really watching here is a fundamental debate about the role of government in emerging technology. Should we wait for perfect federal legislation that might never come? Or let states experiment with solutions to immediate problems? Given the pace of AI development, the answer seems pretty clear to me.
