
California’s first partner pushes to regulate AI as Trump threatens to block states from doing so

California First Partner Jennifer Siebel Newsom recently convened a meeting that might rank among the top sweat-inducing nightmare scenarios for Silicon Valley’s tech bros — a group of the Golden State’s smartest, most powerful women brainstorming ways to regulate artificial intelligence.

Regulation is the last thing this particular California-dominated industry wants, and it’s spent a lot of cash at both the state and federal capitols to avoid it — including funding President Trump’s new ballroom. Regulation by a bunch of ladies, many of them mothers, for whom profit runs a distant second to our kids’ safety?

I’ll let you figure out how popular that is likely to be with the Elon Musks, Peter Thiels and Mark Zuckerbergs of the world.

But as Siebel Newsom said, “If a platform reaches a child, it carries a responsibility to protect that child. Period. Our children’s safety can never be second to the bottom line.”

Agreed.

Siebel Newsom’s push for California to do more to regulate AI comes at the same time that Trump is threatening to stop states from overseeing the technology — and is ramping up a national effort that will open America’s coffers to AI moguls for decades to come.

Right now, the U.S. is facing its own nightmare scenario: the most powerful and world-changing technology we have seen in our lifetimes being developed and unleashed under almost no rules or restraints other than those chosen by the men who seek personal benefit from the outcome.

To put it simply, the plan right now seems to be that these tech barons will change the world as they see fit to make money for themselves, and we as taxpayers will pay them to do it.

“When decisions are mainly driven by power and profit instead of care and responsibility, we completely lose our way, and given the current alignment between tech titans and the federal administration, I believe we have lost our way,” Siebel Newsom said.

To recap what the way has been so far, Trump recently tried to sneak a 10-year ban on the ability of states to oversee the industry into his ridiculously named “Big Beautiful Bill,” but it was pulled out by a bipartisan group in the Senate — an early indicator of how inflammatory this issue is.

Faced with that unexpected blockade, Trump has threatened to sign a mysterious executive order crippling states’ ability to regulate AI and attempting to withhold funds from those that try.

Simultaneously, the most craven and cowardly among Republican congresspeople have suggested adding a 10-year ban to the upcoming defense policy bill that will almost certainly pass. Of course, Congress has also declined to move forward on any meaningful federal regulations itself, while technology CEOs including Trump frenemy Musk, Apple’s Tim Cook, Meta’s Zuckerberg and many others chum it up at fancy events inside the White House.

Which may be why this week, Trump announced the “Genesis Mission,” an executive order that seemingly will take the unimaginable vastness of government research efforts across disciplines and dump them into some kind of AI model that will “revolutionize the way scientific research is conducted.”

While I am sure that nothing could possibly go wrong in that scenario, that’s not actually the part that is immediately alarming. This is: The project will be overseen by Trump science and technology policy advisor Michael Kratsios, who holds no science or engineering degrees but was formerly a top executive for Thiel and head of an AI company that works on warfare-related projects with the Pentagon.

Kratsios is considered one of the main reasons Trump has embraced the tech bros with such adoration in his second term. Genesis will almost certainly mean huge government contracts for these private-sector “partners,” fueling the AI boom (or bubble) with taxpayer dollars.

Siebel Newsom’s message in the face of all this is that we are not helpless — and California, as the home of many of these companies and the world’s fourth-largest economy in its own right, should have a say in how this technology advances, and make sure it does so in a way that benefits and protects us all.

“California is uniquely positioned to lead the effort in showing innovation and responsibility and how they can go hand in hand,” she said. “I’ve always believed that stronger guardrails are actually good for business over the long term. Safer tech means better outcomes for consumers and greater consumer trust and loyalty.”

But the pressure to cave under the might of these companies is intense, as Siebel Newsom’s husband knows.

Gov. Gavin Newsom has spent the last few years trying to thread the needle on state legislation that offers some sort of oversight while allowing for the innovation that rightly keeps California and the United States competitive on the global front. The tech industry has spent millions in lobbying, legal fights and pressure campaigns to water down even the most benign of efforts, even threatening to leave the state if rules are enacted.

This year, the industry unsuccessfully tried to stop Senate Bill 53, landmark legislation signed by Newsom. It’s a basic transparency measure on “frontier” AI models that requires companies to have safety and security protocols and report known “catastrophic” risks, such as when these models show tendencies toward behavior that could kill more than 50 people — which they have, believe it or not.

But the industry was able to stop other efforts. Newsom vetoed both Senate Bill 7, which would have required employers to notify workers when using AI in hiring and promotions; and Assembly Bill 1064, which would have barred companion chatbot operators from making these AI systems available to minors unless they could prove the bots wouldn’t do things like encourage kids to self-harm, which, again, these chatbots have done.

Still, California (along with New York and a few other states) has pushed forward, and speaking at Siebel Newsom’s event, the governor said that last session, “we took a number of at-bats at this and we made tremendous progress.”

He promised more.

“We have agency. We can shape the future,” he said. “We have a unique responsibility as it relates to these tools of technology, because, well, this is the center of that universe.”

If Newsom does keep pushing forward, it will be in no small part because of Siebel Newsom, and women like her, who keep the counter-pressure on.

In fact, it was another powerful mom, First Lady Melania Trump, who forced the federal government into a tiny bit of action this year when she championed the “Take It Down Act,” which requires tech companies to quickly remove nonconsensual explicit images. I sincerely doubt her husband would have signed that particular bill without her urging.

So, if we are lucky, the efforts of women like Siebel Newsom may turn out to be the bit of powerful sanity needed to put a check on the world-domination fantasies of the broligarchy.

Because tech bros are not yet all-powerful, despite their best efforts, and certainly not yet immune to the power of moms.


Rapper RBX sues Spotify, accuses Drake of benefiting from fraudulent music streams

Rapper RBX has sued Spotify, alleging that the Swedish audio company has failed to stop the artificial inflation of music streams for artists like Drake and is hurting the revenue other rights holders receive through the platform.

RBX, whose real name is Eric Dwayne Collins, is seeking class-action status as well as damages and restitution from Spotify. RBX, like other rights holders, receives payment based on how often his music is streamed on Spotify, according to the lawsuit, filed in U.S. District Court in L.A. on Sunday.

Spotify pays rights holders a percentage of revenue based on the total streams attributed to them compared with total volume of streams for all songs, the lawsuit said.

The Long Beach-based rapper said that rights holders are losing money on Spotify because streams of some artists are being artificially inflated through bots powered by automated software, even though the use of such bots is prohibited on the platform, according to the lawsuit.
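To see why inflated streams would dilute everyone else’s checks under that kind of pro-rata split, here is a minimal sketch using made-up numbers; it is a deliberately simplified model for illustration, not Spotify’s actual royalty accounting.

```python
# Simplified sketch of a pro-rata streaming payout pool (illustrative only).

def prorata_payouts(revenue_pool: float, streams_by_artist: dict[str, int]) -> dict[str, float]:
    """Split a revenue pool among rights holders in proportion to their stream counts."""
    total_streams = sum(streams_by_artist.values())
    return {
        artist: revenue_pool * count / total_streams
        for artist, count in streams_by_artist.items()
    }

# Hypothetical numbers: a $1,000 pool shared by two rights holders.
honest = prorata_payouts(1_000, {"Artist A": 50_000, "Artist B": 50_000})

# Add 100,000 bot streams for Artist B: Artist A's cut of the same pool shrinks,
# even though Artist A's real listenership hasn't changed.
inflated = prorata_payouts(1_000, {"Artist A": 50_000, "Artist B": 150_000})

print(honest)    # {'Artist A': 500.0, 'Artist B': 500.0}
print(inflated)  # {'Artist A': 250.0, 'Artist B': 750.0}
```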

For example, the lawsuit notes that over a four-day period in 2024 there were at least 250,000 streams of Drake’s song “No Face” that appeared to originate in Turkey, but “were falsely geomapped through the coordinated use of VPNs to the United Kingdom in [an] attempt to obscure their origins.”

Spotify knew or should have known “with reasonable diligence, that fraudulent activities were occurring on its platform,” states the lawsuit, describing the streamer’s policies to root out fraud as “window dressing.”

Spotify declined to comment on the pending litigation but said it “in no way benefits from the industry-wide challenge of artificial streaming.”

“We heavily invest in always-improving, best-in-class systems to combat it and safeguard artist payouts with strong protections like removing fake streams, withholding royalties, and charging penalties,” Spotify said in a statement.

Last year, a U.S. producer was accused of stealing $10 million from streaming services, and Spotify said it was able to limit the theft on its platform to $60,000, touting that as evidence that its systems are working.

The platform is also making efforts to push back against AI-generated music that is made without artists’ permission. In September, Spotify announced it had removed more than 75 million AI-generated “spammy” music tracks from its platform over the last 12 months.

A representative for Drake did not immediately return a request for comment.

RBX is known for his work on Dr. Dre’s 1992 album “The Chronic” and Snoop Dogg’s 1993 album “Doggystyle.” He has released multiple solo albums and has contributed to records including Eminem’s “The Marshall Mathers LP” and Kris Kross’ “Da Bomb.” RBX is Snoop Dogg’s cousin.

Artificial intelligence continues to change the way that the entertainment industry operates, affecting everything from film and TV production to music. In the music industry, companies have sued AI startups, accusing the businesses of taking copyrighted music to train AI models.

At the same time, some music artists have embraced AI, using the technology to test bold ideas in music videos and in their songs.
