Meta boosts Texas AI data center spend to $10B
IMPERIAL — Whenever the weather changes suddenly, or the skyline becomes shrouded in a windy haze, Fernanda Camarillo braces herself for an asthma attack.
Her condition has become more manageable, but the 27-year-old said it’s still scary when her chest tightens and she starts to wheeze. It was one of her first thoughts when she heard about plans to develop a massive data center next to her home in Imperial County, a farming community near the border of Mexico that struggles with poor air quality.
“A lot of people in the county are asthmatic,” she said, explaining that she worries the new center would add more pollution. “I’ve been anxious — so many of us are voicing our concerns.”
Data centers have existed for decades but are rapidly changing and expanding due to the worldwide boom in artificial intelligence. States and communities nationwide have started pushing back, citing concerns that the projects could strain power grids, increase utility bills and have negative health and environmental impacts.
In California, state legislators are debating how to protect residents and natural resources without creating so much red tape that developers go elsewhere, taking their jobs and taxable earnings with them.
No Data Center signs are posted in the front yard of a home that is right behind the proposed site.
“We can be supportive of innovation and a technology that is needed but also protect our communities and our health and our environment,” said state Sen. Steve Padilla (D-San Diego). “We can do both at the same time.”
The California Legislature is considering bills to prohibit the projects from being exempted from the state’s stringent environmental law and to impose new tariffs on major new energy users that strain power supplies. Lawmakers also have proposed restrictions on new data centers, requiring companies to provide verifiable estimates of expected water and energy usage before they can be granted a business permit.
Imperial resident Fernanda Camarillo, who is an asthmatic, holds some of her medications.
Members of Congress also have expressed concerns. Rep. Ro Khanna, speaking at a town hall about AI last month at Stanford University, said legislators must ensure data centers serve the communities that power them.
“We live in a new gilded age,” said Khanna (D-Fremont). “What kind of future are we going to build?”
::
Eric Masanet, a professor at UC Santa Barbara specializing in sustainability science for emerging technologies, described the facilities as the “brains” of the internet. The sprawling centers are filled with banks of specialized computers that process online shopping orders, stream movies, host websites, encode Zoom and other videoconferencing apps, store data and serve as switching stations for the digital world that’s now woven into daily life.
Data centers, particularly those that power AI, use significant amounts of water and energy. The facilities accounted for about 4.4% of the nation’s total electricity consumption in 2023, up from 1.9% in 2018, according to a report provided to Congress from the Lawrence Berkeley National Laboratory. The researchers projected that figure will reach 6.7% to 12% by 2028.
Many companies, including big tech giants like Meta, Google and Amazon, are making major investments in AI.
“We are building a lot more data centers faster than we ever did — and a new AI data center is 10 to 20, maybe 30 times, the size of the largest data centers we had before,” Masanet said.
The proposed site of the 950,000-square-foot data center is on a dusty parcel that is next to the Victoria Ranch housing community and adjacent to farmland in Imperial, Calif.
It’s unclear how many data centers are in the state. A California Energy Commission spokesperson told the Los Angeles Times it does not track this information. Data Center Map, a nongovernmental website that tracks data centers across the world, lists 289 facilities in California, with more than 4,000 nationwide.
The federal government has, so far, largely left it to states or localities to regulate data centers.
The facilities can generate significant revenue for local governments due to sales and property taxes.
But some new proposals are sparking a backlash. More than 200 community and environmental organizations, including a dozen from California, sent an open letter to Congress in December calling for a national moratorium on new data centers.
Robert Gould, a pathologist with San Francisco Bay Physicians for Social Responsibility, one of the organizations that signed the letter, explained that data centers are causing a shift away from renewable energy and back toward fossil fuels because the facilities need a reliable and constant stream of power.
Cornell University researchers last year estimated that AI growth could add 24 to 44 million metric tons of carbon dioxide to the atmosphere annually by 2030, unless steps are taken to change course.
Gould said fossil fuel emissions are associated with various cancers, an increase in hospitalizations for older adults due to respiratory conditions, and asthma attacks or stunted lung growth in children. Particulate matter from fossil fuel emissions is also linked to cardiovascular events and negative effects on maternal fetal health.
Gould’s organization has noticed an alarming trend.
“These are generally placed in communities that are the least able to defend themselves,” he said.
Farmworkers toil in the noon heat to pick vegetables in Imperial. Agriculture is an important part of the Imperial Valley economy.
::
The debate over data centers is heating up in the Imperial Valley, a rural desert region in southeastern California where a proposed center faces fierce opposition from residents.
The county in 2025 granted the project an exemption from the California Environmental Quality Act, known as CEQA. The landmark 56-year-old state law has been credited with helping to preserve California’s natural beauty and protecting communities from the hazardous impacts of construction projects — but also blamed for stymieing construction.
Imperial Valley Computer Manufacturing, a California-based limited liability company that started two years ago, plans to develop a 950,000-square-foot facility in the county that’s designed for advanced artificial intelligence and machine learning operations. The company says it will use reclaimed wastewater and EPA-certified natural gas generators, and create 2,500 to 3,500 construction jobs and 100 to 200 permanent positions.
“We are committed to Imperial County and to creating lasting economic opportunity,” the company website states. “The project will generate $28.75 million in annual property tax revenue for local schools, fire departments, libraries, and essential services.”
The Imperial County Board of Supervisors is moving toward finalizing the proposal.
Farmland spreads out in front of the Imperial Valley Fair near a proposed data center in Imperial.
Sebastian Rucci, an attorney and chief executive officer of Imperial Valley Computer Manufacturing, said he commissioned multiple studies assessing the proposed center’s potential effect on issues like traffic or the environment that found no or minimal harms. He threatened to pull his proposal if a CEQA review was required.
“CEQA leaves you in an unknown territory — some of the environmental groups have used it for extortion, they sue, they have no basis for the suit but they delay you, and then they can squeeze money out of you for settling the lawsuit,” said Rucci.
The exemption, however, has alarmed residents, who have spoken up at county board meetings and launched a community organization, Not in My Backyard Imperial, to protest the data center and demand a CEQA review.
“It feels like it’s us against the county,” said Camarillo, adding that many feel the board has dismissed their questions and concerns.
None of the members of the Imperial County Board of Supervisors responded to requests for comment.
Resident Fernanda Camarillo’s home is right behind the proposed site of the data center in Imperial.
The center would be a neighbor to Camarillo’s house in Victoria Ranch, a family-friendly area with beige stucco homes topped with terracotta tile roofs. She worries about noise, pollution and spiking utility bills. Power companies that have to upgrade grids to meet data centers’ energy demands sometimes seek to recoup that cost by raising rates for all consumers.
Camarillo, a substitute teacher, is also scared for her students. The air quality in Imperial Valley is already so poor that schools use a system of color-coded flags to signal whether it’s safe for children to go outside during gym or recess, she said.
“I think they see [the valley] as easy pickings because we are a low-income community and we have such a large population of Latinos here,” Camarillo said.
A quick drive around the neighborhood shows others share her concerns. Signs protesting the data center pop up throughout the community, displayed on front lawns or nestled into rocky garden beds.
Victoria Ranch was quiet and peaceful on a sunny Sunday in late February. Francisco Leal, a resident and lead organizer for NIMBY Imperial, said that’s a major part of its appeal.
The colorful dusk sky hovers over a Little League baseball game at Freddie White Park in Imperial. The debate over data centers is heating up in the Imperial Valley, a rural desert region in southeastern California.
Leal wants answers about everything from potential health hazards and impacts on the local water supply to whether the fire department is equipped to handle a large-scale electrical blaze. But without a CEQA review, he says residents are left to trust assurances from the developer or privately hired consultants.
Leal plans to sell his property if the project goes forward, but the thought makes him emotional.
“It’s not just a house; it’s a home,” he said. “This is the only home my kids have ever known and all of our family memories are here.”
Gina Snow, another resident, isn’t necessarily against bringing a data center to the county. But she wants the proposal to undergo a CEQA review.
“Clearly we understand that there is economic development and the potential for that to be positive for the county, but at what cost?” she said.
Daniela Flores, executive director of Imperial Valley Equity and Justice, a nonprofit that works for social and environmental equality, stands on the site of the proposed data center.
::
Daniela Flores, executive director of Imperial Valley Equity and Justice, a nonprofit that works for social and environmental equality, said the community has good reason to be wary. Various industries have come into the region over the years and made grand promises that never panned out.
“We became a sacrifice zone,” she said, adding industries use the area’s resources while ultimately doing little to permanently improve the lives of most residents.
Flores said the community continues to struggle with a range of problems, including poor air quality, high poverty rates, weak worker protections and crumbling infrastructure. She believes a data center could add new and potentially dangerous challenges.
The valley has long, brutal summers with temperatures that swell to 120 degrees. If the data center strains the grid and causes a lengthy blackout, or low-income residents have their power shut off because they can’t afford the rising bills, Flores fears the situation could quickly turn deadly.
The city of Imperial also has concerns. The city has filed a lawsuit calling on the county to halt the project, arguing it should not have received a CEQA exemption.
The controversy has drawn attention from Padilla, whose district includes Imperial Valley. Padilla has echoed residents’ calls for more transparency from the county and introduced Senate Bill 887, which would ban data centers from receiving exemptions from CEQA.
“I am not anti-data center or anti-artificial intelligence,” Padilla said. But, he added, we need to “find a way to do this right and make sure there is adequate review and understanding.”
A dusty haze settles over the city of Imperial at dusk near the site of a proposed data center.
Another measure from Padilla, Senate Bill 886, would direct the Public Utilities Commission to create an electrical corporation tariff to cover the cost of data center-related grid upgrades.
Other related legislation this year includes Assembly Bill 2619 from Assemblymember Diane Papan (D-San Mateo) that would require data center owners to provide an estimate about expected water usage and sources before applying for a business license, and Assembly Bill 1577, by Assemblymember Rebecca Bauer-Kahan (D-Orinda), which would require data center owners to submit monthly information to a state commission about water and fuel consumption and energy efficiency.
While lawmakers weigh new policies at the statehouse, Camarillo said she hopes the priority will be protecting communities.
“Innovation is important, but innovation for the sake of innovation has never really been something that hasn’t had negative impacts,” she said. “Think about human lives.”
WASHINGTON — The Trump administration likes to promote its immigration enforcement agenda through numbers, with ambitious goals to deport 1 million people, report zero releases at the U.S.-Mexico border and arrest thousands of alleged gang members.
For all the boasting, the administration has been releasing less reliable, carefully vetted data than its predecessors on a signature policy that has become one of the most contentious of Trump’s second term.
The gap in information and a loss of figures from an office that has tracked immigration data back to the 1800s have left researchers, advocates, lawyers and journalists without important statistics to hold the Republican administration to account.
“They aren’t publishing the data,” said Mike Howell, who heads the conservative Oversight Project, an advocacy group pushing for more deportations. Instead, Howell said, the Department of Homeland Security has put out numbers in news releases “that purport to be statistics with no statistical backup and the numbers have jumped all over the place.”
With mass deportations a priority, new restrictions and increased enforcement have led to a surge in immigration arrests, detentions and deportations.
But finding the metrics that once measured those changes can be hard. It is an extension of earlier administration moves to limit the flow of government information by scrubbing or removing federal datasets or by the firing last year of the top official overseeing jobs data.
The Office of Homeland Security Statistics is responsible for publishing figures from Homeland Security agencies, including removals and the nationalities of those deported, to provide a comprehensive picture of immigration trends at the border and inside the United States.
Originally known as the Office of Immigration Statistics, it has tracked such data since 1872. In its current form, created under the Biden administration, it also started publishing monthly reports that allowed researchers to track developments almost in real time.
But key enforcement metrics on its website have not been updated since early last year. A note on the page where the monthly reports were posted says the data “is delayed while it is under review.”
“It’s the most timely data. It’s the most reliable data,” Austin Kocher, a research professor at Syracuse University who closely follows immigration data trends, said about the monthly reports. “It has the most omniscient view of immigration enforcement across the entire agency.”
An interactive dashboard launched by U.S. Immigration and Customs Enforcement in December 2023 once let users examine whom the agency was arresting, their nationalities, criminal histories and removal numbers. ICE called it a “new era in transparency.”
Though intended for quarterly updates, the latest data is from January 2025. The agency’s annual report, typically released in December, had not been published as of mid-March.
Other agencies also publish data that touches on immigration, and parts of it do continue to roll out, such as U.S. Customs and Border Protection statistics detailing border encounters or data from the Department of Justice’s immigration courts.
But experts say other data has slowed.
The State Department’s most recent visa issuance data is from August. Key statistics from U.S. Citizenship and Immigration Services have not been updated since October.
The now-missing data had helped researchers study the effects of different policies. Lawyers could cite the figures to support their litigation. Journalists saw in them a powerful tool to hold the government to account on public claims or to report on important trends.
“We’re all a little bit in the dark about exactly how immigration enforcement is operating at a time when it’s taking new and unprecedented forms,” said Julia Gelatt, associate director of the U.S. Immigration Policy Program at the Migration Policy Institute.
DHS did not respond to detailed questions about why it was no longer releasing specific data.
“This is the most transparent Administration in history, we release new data multiple times a week and upon reporter request,” the department said in a statement.
Figures the administration has released are inconsistent and unverifiable.
In a Jan. 20 news release, DHS said it had deported more than 675,000 people since Trump returned to the White House. A day later, in a second release, the department put the figure at 622,000. In congressional testimony March 4, Homeland Security Secretary Kristi Noem said the figure was 700,000.
But ICE, an agency within DHS, also releases figures on how many people it has removed from the country, part of a large data release mandated by Congress. An Associated Press analysis of the figures put that number at roughly 400,000 over Trump’s first year.
DHS has said 2.2 million people who were in the U.S. illegally have gone home on their own, but the department has given no explanation for the count. Experts have questioned the source of that figure, saying this was not something that DHS historically has tracked.
The department did not respond to questions about where that data came from.
With key sources of data halted, researchers, advocates and others have had to rely on information the administration is obliged to report or that has come to light through legal action.
The publication of ICE detention figures — how many people are detained, for how long and whether they have committed a crime — is required by Congress and generally occurs every two weeks. But the release has faced some delays, and the data is overwritten with each new publication, complicating the work of people who need access to past figures.
The University of California, Berkeley’s Deportation Data Project, a research initiative, successfully sued through the Freedom of Information Act to access data about ICE arrests including nationalities, conviction status and whether arrests occurred at jails or in the community.
Graeme Blair, co-director of the project, said every administration has struggled with transparency in immigration enforcement, and given the Trump administration’s ambitious enforcement goals, the team wanted to secure and verify information that the government might not publicly release.
“Given the scale of what they were talking about doing, it seemed really important to be able to understand, to be able to double check those numbers,” he said.
But there are limitations, he said. The data obtained through the lawsuit only runs through Oct. 15. It does not cover recent operations such as the Minneapolis enforcement surge, when federal immigration officers fatally shot two protesters, leading to widespread demonstrations and scrutiny of enforcement tactics.
The absence of data is one of the few issues that has drawn bipartisan criticism.
“We deserve to know the numbers, just like we deserve to know who’s in our country and who needs to leave,” Howell said.
Santana writes for the Associated Press.
PHOENIX — The Republican leader of Arizona’s state Senate said Monday that he has handed over records related to the 2020 presidential election to the FBI in the latest sign that the Trump administration is acting on the president’s long-standing falsehoods about a race he lost to Democrat Joe Biden.
Senate President Warren Petersen said in a social media post that he complied “late last week” with a federal grand jury subpoena for records related to a controversial audit of the election in Maricopa County that had been ordered by legislative Republicans.
“The FBI has the records,” Petersen said.
He did not immediately respond to requests for additional comment, and a spokesperson for Senate Republicans said in an email that Petersen “does not have anything to add outside of his X post at this time.” The FBI office in Phoenix did not immediately respond to a request for comment.
It marks the second time this year that the FBI has obtained records related to the 2020 election from the most populous county in a presidential battleground state, both of which Trump lost as he sought reelection. In January, the FBI seized ballots and other records from Georgia’s Fulton County, which includes Atlanta, after the Justice Department sought a search warrant from a judge. The search warrant affidavit showed that the request relied on years-old claims, many of which had been thoroughly investigated and found to have no connection to widespread fraud.
Arizona Atty. Gen. Kris Mayes, a Democrat, issued a scathing statement in response to Petersen’s post, noting that multiple audits, independent investigations and legal challenges related to the 2020 presidential election found no evidence of widespread fraud that could have affected the outcome.
“Warren Petersen knows all of this. He has known it for years. He spread false stories of election fraud in 2020, and he remains an unrepentant election denier,” Mayes said. “What the Trump administration appears to be pursuing now is not a legitimate law enforcement inquiry. It is the weaponization of federal law enforcement in service of crackpots and lies.”
A firm hired by Republican lawmakers spent six months in 2021 searching for evidence of fraud in the previous year’s presidential election, a process experts said was marred by bias and a flawed methodology. It explored outlandish conspiracy theories, such as dedicating time to checking for bamboo fibers on ballots to see if they were secretly shipped in from Asia.
The audit ended without producing proof to support former President Trump’s false claims of a stolen election — and in fact found that Biden received 360 more votes than stated in the certified results for Maricopa County, which includes Phoenix.
The firm, Cyber Ninjas, also acknowledged that there were “no substantial differences” between its hand count of the ballots and the official count.
Previous reviews of the 2.1 million ballots by nonpartisan professionals who followed state law found no significant problem with the 2020 election in Maricopa County, which was run by Republicans then and now. Biden won the county by 45,000 votes and went on to win Arizona by 10,500 votes.
Federal officials took different routes to obtain election records in the two states. The Georgia case involved a judicially approved search warrant that required the FBI to articulate grounds that probable cause exists to believe a crime was committed. In Arizona, the FBI relied on subpoenas, a law enforcement maneuver that does not require judicial sign-off or prosecutors’ assertion there’s probable cause of a crime.
The investigations into the 2020 election come as the Justice Department has clashed with a number of states, including some controlled by Republicans, over access to detailed voter data that include names, dates of birth, addresses and partial Social Security numbers. Election officials have expressed concerns that providing the information would violate both state and federal data privacy laws, and that it could be used to remove people from state voter rolls.
Arizona is among the states the Justice Department has sued to obtain the voter information. Secretary of State Adrian Fontes, a Democrat, suggested that at least some Maricopa County voter files could be among the records Petersen gave the FBI. In a statement Monday, Fontes said his office was considering legal options “to secure personal voter information in the 2020 data that was shared.”
Calli Jones, a spokesperson for the secretary of state, said the office is assessing what was released to the FBI.
“This could be an end run by the Department of Justice to obtain unredacted voter files,” she said.
Kelety writes for the Associated Press. AP writer Eric Tucker in Washington contributed to this report.