
As Trump pushes deportations, immigration data becomes harder to find

The Trump administration likes to promote its immigration enforcement agenda through numbers, with ambitious goals to deport 1 million people, report zero releases at the U.S.-Mexico border and arrest thousands of alleged gang members.

For all the boasting, the administration has released far less of the reliable, carefully vetted data that its predecessors published on a signature policy that has become one of the most contentious of Trump’s second term.

The information gap, and the loss of figures from an office that has tracked immigration data dating back to the 1800s, have left researchers, advocates, lawyers and journalists without important statistics needed to hold the Republican administration to account.

“They aren’t publishing the data,” said Mike Howell, who heads the conservative Oversight Project, an advocacy group pushing for more deportations. Instead, Howell said, the Department of Homeland Security has put out numbers in news releases “that purport to be statistics with no statistical backup and the numbers have jumped all over the place.”

With mass deportations a priority, new restrictions and increased enforcement have led to a surge in immigration arrests, detentions and deportations.

But finding the metrics that once measured those changes can be hard. It extends earlier administration moves to limit the flow of government information, including scrubbing or removing federal datasets and firing the top official overseeing jobs data last year.

Important data is no longer publicly available

The Office of Homeland Security Statistics is responsible for publishing figures from Homeland Security agencies, including removals and the nationalities of those deported, to provide a comprehensive picture of immigration trends at the border and inside the United States.

Originally known as the Office of Immigration Statistics, it has tracked such data since 1872. In its current form, created under the Biden administration, it also began publishing monthly reports that allowed researchers to track developments almost in real time.

But key enforcement metrics on its website have not been updated since early last year. A note on the page that once hosted the monthly reports says the data “is delayed while it is under review.”

“It’s the most timely data. It’s the most reliable data,” Austin Kocher, a research professor at Syracuse University who closely follows immigration data trends, said of the monthly reports. “It has the most omniscient view of immigration enforcement across the entire agency.”

An interactive dashboard launched by U.S. Immigration and Customs Enforcement in December 2023 once let users examine whom the agency was arresting, their nationalities, criminal histories and removal numbers. ICE called it a “new era in transparency.”

Though the dashboard was intended to be updated quarterly, its latest data is from January 2025. The agency’s annual report, typically released in December, had not been published as of mid-March.

Other agencies also publish data that touches on immigration, and parts of it do continue to roll out, such as U.S. Customs and Border Protection statistics detailing border encounters or data from the Department of Justice’s immigration courts.

But experts say other data has slowed.

The State Department’s most recent visa issuance data is from August. Key statistics from U.S. Citizenship and Immigration Services have not been updated since October.

The now-missing data had helped researchers study the effects of different policies. Lawyers could cite the figures to support their litigation. Journalists saw in them a powerful tool to hold the government to account on public claims or to report on important trends.

“We’re all a little bit in the dark about exactly how immigration enforcement is operating at a time when it’s taking new and unprecedented forms,” said Julia Gelatt, associate director of the U.S. Immigration Policy Program at the Migration Policy Institute.

DHS did not respond to detailed questions about why it was no longer releasing specific data.

“This is the most transparent Administration in history, we release new data multiple times a week and upon reporter request,” the department said in a statement.

Researchers contend with a patchwork of numbers

Figures the administration has released are inconsistent and unverifiable.

In a Jan. 20 news release, DHS said it had deported more than 675,000 people since Trump returned to the White House. A day later, in a second release, the department put the figure at 622,000. In congressional testimony March 4, Homeland Security Secretary Kristi Noem said the figure was 700,000.

But ICE, an agency within DHS, also releases figures on how many people it has removed from the country, part of a large data release mandated by Congress. An Associated Press analysis of the figures put that number at roughly 400,000 over Trump’s first year.

DHS has said 2.2 million people who were in the U.S. illegally have gone home on their own, but the department has given no explanation for the count. Experts have questioned the source of that figure, saying this was not something that DHS historically has tracked.

The department did not respond to questions about where that data came from.

With key sources of data halted, researchers, advocates and others have had to rely on information the administration is obliged to report or that has come to light through legal action.

ICE detention figures — how many people are detained, for how long and whether they have committed a crime — are required by Congress and are generally released every two weeks. But the releases have faced some delays, and each new publication overwrites the previous data, complicating the work of people who need continuing access to it.

The University of California, Berkeley’s Deportation Data Project, a research initiative, successfully sued through the Freedom of Information Act to access data about ICE arrests including nationalities, conviction status and whether arrests occurred at jails or in the community.

Graeme Blair, co-director of the project, said every administration has struggled with transparency in immigration enforcement, and given the Trump administration’s ambitious enforcement goals, the team wanted to secure and verify information that the government might not publicly release.

“Given the scale of what they were talking about doing, it seemed really important to be able to understand, to be able to double check those numbers,” he said.

But there are limitations, he said. The data obtained through the lawsuit only runs through Oct. 15. It does not cover recent operations such as the Minneapolis enforcement surge, when federal immigration officers fatally shot two protesters, leading to widespread demonstrations and scrutiny of enforcement tactics.

The absence of data is one of the few issues that has drawn bipartisan criticism.

“We deserve to know the numbers, just like we deserve to know who’s in our country and who needs to leave,” Howell said.

Santana writes for the Associated Press.


White House widens probe of 2020 election as it gets data from Arizona

The Republican leader of Arizona’s state Senate said Monday that he has handed over records related to the 2020 presidential election to the FBI in the latest sign that the Trump administration is acting on the president’s long-standing falsehoods about a race he lost to Democrat Joe Biden.

Senate President Warren Petersen said in a social media post that he complied “late last week” with a federal grand jury subpoena for records related to a controversial audit of the election in Maricopa County that had been ordered by legislative Republicans.

“The FBI has the records,” Petersen said.

He did not immediately respond to requests for additional comment, and a spokesperson for Senate Republicans said in an email that Petersen “does not have anything to add outside of his X post at this time.” The FBI office in Phoenix did not immediately respond to a request for comment.

It marks the second time this year that the FBI has obtained records related to the 2020 election from the most populous county in a presidential battleground state, both of which Trump lost as he sought reelection. In January, the FBI seized ballots and other records from Georgia’s Fulton County, which includes Atlanta, after the Justice Department sought a search warrant from a judge. The search warrant affidavit showed that the request relied on years-old claims, many of which had been thoroughly investigated and found to have no connection to widespread fraud.

Arizona Atty. Gen. Kris Mayes, a Democrat, issued a scathing statement in response to Petersen’s post, noting that multiple audits, independent investigations and legal challenges related to the 2020 presidential election found no evidence of widespread fraud that could have affected the outcome.

“Warren Petersen knows all of this. He has known it for years. He spread false stories of election fraud in 2020, and he remains an unrepentant election denier,” Mayes said. “What the Trump administration appears to be pursuing now is not a legitimate law enforcement inquiry. It is the weaponization of federal law enforcement in service of crackpots and lies.”

A firm hired by Republican lawmakers spent six months in 2021 searching for evidence of fraud in the previous year’s presidential election, a process experts said was marred by bias and a flawed methodology. It explored outlandish conspiracy theories, such as dedicating time to checking for bamboo fibers on ballots to see if they were secretly shipped in from Asia.

The audit ended without producing proof to support former President Trump’s false claims of a stolen election — and in fact found that Biden received 360 more votes than stated in the certified results for Maricopa County, which includes Phoenix.

The firm, Cyber Ninjas, also acknowledged that there were “no substantial differences” between its hand count of the ballots and the official count.

Previous reviews of the 2.1 million ballots by nonpartisan professionals who followed state law found no significant problem with the 2020 election in Maricopa County, which was run by Republicans then and now. Biden won the county by 45,000 votes and went on to win Arizona by 10,500 votes.

Federal officials took different routes to obtain election records in the two states. The Georgia case involved a judicially approved search warrant that required the FBI to articulate grounds that probable cause exists to believe a crime was committed. In Arizona, the FBI relied on subpoenas, a law enforcement maneuver that does not require judicial sign-off or prosecutors’ assertion there’s probable cause of a crime.

The investigations into the 2020 election come as the Justice Department has clashed with a number of states, including some controlled by Republicans, over access to detailed voter data that include names, dates of birth, addresses and partial Social Security numbers. Election officials have expressed concerns that providing the information would violate both state and federal data privacy laws, and that it could be used to remove people from state voter rolls.

Arizona is among the states the Justice Department has sued to obtain the voter information. Secretary of State Adrian Fontes, a Democrat, suggested that at least some Maricopa County voter files could be among the records Petersen gave the FBI. In a statement Monday, Fontes said his office was considering legal options “to secure personal voter information in the 2020 data that was shared.”

Calli Jones, a spokesperson for the secretary of state, said the office is assessing what was released to the FBI.

“This could be an end run by the Department of Justice to obtain unredacted voter files,” she said.

Kelety writes for the Associated Press. AP writer Eric Tucker in Washington contributed to this report.


SK Broadband turns to AI data centers as pay TV loses subscribers

A graphic shows SK Broadband’s declining pay TV subscribers and rising AI data center revenue, alongside an overview of the planned Ulsan AI data center equipped with about 60,000 GPUs and a first phase of 40 megawatts scheduled for 2027. Graphic by Asia Today and translated by UPI

March 5 (Asia Today) — South Korea’s pay television industry is struggling to maintain growth as streaming services reshape the media landscape, pushing operators to seek new revenue sources such as artificial intelligence data centers.

SK Broadband, one of the country’s largest pay TV providers, lost about 150,000 subscribers last year as consumers increasingly shift to over-the-top streaming platforms.

The company has responded by pursuing a two-track strategy of industry cooperation and expansion into new technology businesses.

Earlier this year SK Broadband joined rivals KT and LG Uplus to establish a 40 billion won ($30 million) IPTV strategy fund as part of the government’s K-content media investment initiative.

The fund will support production of film and television content while helping secure programming for pay TV platforms and boost video-on-demand sales, a key revenue source for operators.

SK Broadband generated 4.53 trillion won ($3.4 billion) in total revenue last year, with pay TV accounting for more than 40% of the total.

However, the industry has been hit by accelerating “cord cutting,” a trend in which viewers cancel traditional television services in favor of online streaming platforms.

The company reported 9.45 million pay TV subscribers last year, including 6.72 million IPTV users and 2.73 million cable TV subscribers. That represented a decline of roughly 158,000 customers from the previous year.

Pay TV revenue also fell by about 15 billion won ($11 million).

To strengthen cooperation within the industry, the three telecom companies also plan to launch VOD gift certificates that can be used across platforms regardless of service provider.

The initiative is intended to improve consumer access to pay TV services and expand distribution channels to corporate clients.

SK Broadband has also integrated the artificial intelligence agent A.dot into its IPTV platform B tv to provide personalized content recommendations. The company said the service has recorded more than 100 million uses.

At the same time, SK Broadband is expanding its business-to-business services through data centers.

The company operates nine data centers nationwide and generated more than 1.4 trillion won ($1.05 billion) in related B2B revenue last year.

It is also building a large-scale AI data center in Ulsan with its parent company SK Telecom. The first phase is expected to begin operations next year.

By 2030 the company expects AI data centers alone to generate about 1 trillion won ($750 million) in annual revenue.

Last year revenue from AI data center operations rose 35% to 519.9 billion won ($390 million).

SK Broadband and SK Telecom have pledged to invest 3.4 trillion won ($2.55 billion) in AI data centers through 2028.

Industry officials say the company’s push into higher-margin technology businesses could help offset declining pay TV subscriptions.

— Reported by Asia Today; translated by UPI

© Asia Today. Unauthorized reproduction or redistribution prohibited.

Original Korean report: https://www.asiatoday.co.kr/kn/view.php?key=20260305010001413


South Korea uses tech, data to modernize reservist training

Reservists participate in simulated firing training during the first reserve forces exercise of the year at the Army’s 51st Infantry Division science-based reservist training center in Pyeongtaek, South Korea, on March 3, 2026. Photo by Yonhap News Agency

March 4 (Asia Today) — South Korea’s Army has begun its 2026 reservist training program using advanced simulation and data systems designed to improve combat realism and tactical efficiency.

At a science-based reservist training center in Seoul’s Seocho district on Wednesday, reservists trained with the Multiple Integrated Laser Engagement System, known as MILES, which uses laser signals and sensors to determine hits during simulated combat.

When a reservist was struck during a mock urban battle exercise, the equipment immediately sounded an alert indicating the participant had been “killed,” demonstrating the system’s ability to provide instant and objective combat assessments.

The Army said the system replaces earlier exercises that relied heavily on instructor judgment. Instead, the equipment records hits and performance data in real time, allowing trainees to review their results and identify areas for improvement.

Officials say the new approach encourages a more participatory training model in which reservists track their own performance and refine their skills based on data.

Indoor firing range reduces noise complaints

The facility also includes an indoor shooting range equipped with advanced soundproofing designed to address long-standing noise complaints from nearby residents.

Army officials said the range is quiet enough that it is difficult to detect gunfire outside the building.

Transparent ballistic acrylic panels at each firing lane and automated fire-control systems were installed to improve safety. Reservists monitor their shooting results in real time on digital displays during training.

The Army said the technology helps transform the facility from a traditional military site into security infrastructure that can coexist more easily with surrounding communities.

VR simulations recreate urban battlefields

Reservists also trained in virtual reality simulations using a three-screen system that recreates realistic urban environments.

The scenarios include detailed digital models of locations such as Seocho Station and the COEX underground shopping mall in Seoul’s Gangnam district.

Participants wearing helmets equipped with spatial-recognition technology practiced navigating the terrain and conducting simulated urban combat operations without the constraints of real-world training space.

Smart systems streamline training process

An information and communications technology management system links multiple stages of the training process.

Reservists register by scanning identification cards when they arrive, after which smartwatches and kiosks connect them to a network that manages equipment distribution, firing exercises, tactical drills, evaluation and discharge.

The Army plans to expand the system to additional training facilities and introduce more simulation-based exercises.

Officials said 29 science-based reservist training centers have been built or are nearing completion nationwide, including one in Busan scheduled to be finished later this month.

New facilities are also planned this year in Mokpo, Daejeon, Chilgok, Yeongcheon and Andong. The Army ultimately plans to operate about 40 such centers nationwide.

Col. Park Hyun-gyu, head of the Army’s reservist training policy division, said the program aims to improve readiness while making training more efficient.

“The science-based system enhances training results while minimizing inconvenience for participants,” Park said. “It will strengthen the combat readiness of our reservists while creating a training environment that can coexist with local communities.”

— Reported by Asia Today; translated by UPI

© Asia Today. Unauthorized reproduction or redistribution prohibited.

Original Korean report: https://www.asiatoday.co.kr/kn/view.php?key=20260304010001086


Japan’s Digital Infrastructure and the Growing Demand for Unlimited Mobile Data Among International Visitors

Japan is one of the few places where physical infrastructure and digital innovation are closely intertwined.

The country treats technology as an instrument of national competitiveness. Over the last few years, this approach has transformed Japan’s digital infrastructure, exceeding the expectations not only of citizens but also of international travellers.

5G Expansion and Digital Urban Infrastructure

5G connectivity relevant to tourism has accelerated, reflecting broader structural changes in Japan’s telecom market, and the major carriers’ nationwide 5G coverage has expanded rapidly.

Until recently, 5G in Japan was tied mainly to industrial policy goals such as automation, smart manufacturing and AI deployment. In tourism, the impact is proving no less significant.

In major urban centers such as Tokyo, Osaka and Fukuoka, high-speed connectivity has become integral to smart traffic systems, real-time navigation services and digital payment platforms. Foreign visitors therefore arrive in an environment where stable data access is expected as a given.

Japan’s digital infrastructure is reliable, fast and efficient, qualities that underpin the country’s broader economic model. That same reliability raises visitors’ expectations, and limited data packages frustrate them because they contrast so sharply with the high-tech urban ecosystem.

Digital tourism in Japan can be smooth and easy thanks to AI-driven translation, booking services and transportation networks.

With sufficient data, daily tasks such as streaming, video conferencing and cloud-based document access are easy to carry out. Even short-term visitors need large amounts of data and a stable connection to run several devices at once.

Providers such as Mobal have become part of this broader ecosystem, supporting international mobility and user convenience. Japan’s strategy of revitalizing inbound tourism is linked to regional economic development, particularly in areas outside Tokyo, where high-speed connectivity is vitally needed.

Remote Work, International Mobility and Data Demands

The latest trend is that Japan is attracting not only tourists but also people pursuing remote work opportunities across Asia. Many now hold hybrid or fully remote jobs that let them work and travel at the same time, so they need good connections not only for travel logistics but also for joining conferences, working with large files and accessing secure company systems. Public Wi-Fi is not enough, and the need for fast, reliable internet only increases.

In this context, as demand grows, many international visitors look for unlimited-data eSIM plans in Japan that match their usage patterns. One example is Mobal’s Japan eSIM unlimited data plan, which is designed specifically for short-term stays, typically ranging from 3 to about 31 days, with unrestricted data usage suited to tourists and business travelers.

eSIM technology fits Japan’s push toward digital transformation. For travelers, an eSIM is the simplest way to stay connected, and it can be arranged before arrival.

Policy, Regulation and Mobile Accessibility for Foreign Visitors

Japan’s telecom system balances competition with strict oversight. The market is tightly regulated through rules on SIM registration, consumer protection and network licensing, so foreign visitors can find it difficult to obtain a local SIM card.

At the same time, easy mobile access is clearly needed for a positive experience of Japan in both business and leisure travel. Mobal provides a stable connection within this regulated system, with services adapted to legal requirements and travellers’ needs; the focus is not on promotion but on smooth service and security compliance.

As Japan expands its 5G networks and develops smart city technologies, regulations keep changing, covering aspects such as cybersecurity and digital identity. These updates are meant to keep mobile networks reliable and easy for foreign visitors to access.

The Future of Digital Access in Japan

In sum, Japan’s digital strategy centers on deeper use of AI technology and faster network standards. Cities run on data and smart systems, and mobile internet has become a necessity because it lets people access transport, arrange shopping and carry out their daily tasks.

Most international visitors’ expectations of data usage in Japan are now quite similar to those of local residents: fast data is a must. Demand for unlimited-data eSIM plans in Japan is growing steadily, not because of a passing trend but because travel and digital infrastructure have become closely connected. Companies that provide data to travelers operate at the intersection of regulation, technology and global travel, and their role should not be underestimated, since connectivity underpins tourism, business and the workforce. For a country known for its technological leadership in smart cities, making reliable digital systems accessible to all categories of visitors is essential to sustaining Japan’s reputation.

As for the latest trends, the line between physical and digital infrastructure will slowly blur as 5G networks expand. The main challenge at this stage is making sure networks keep up with changing travel patterns. Seamless mobile access for short-term visitors is therefore not a temporary trend but a reflection of long-term changes in Japan’s digital economy.


Trump Administration Pushes Diplomats to Fight Data Sovereignty Laws

The Trump administration has directed U.S. diplomats to actively oppose foreign laws that restrict how American tech companies handle citizens’ data abroad. An internal State Department cable, dated February 18 and signed by Secretary of State Marco Rubio, described such measures as threats to artificial intelligence services, global data flows, and civil liberties.

Experts say the move signals a return to a more confrontational approach after previous efforts focused on building goodwill with European customers. The administration warned that data sovereignty rules could increase costs, introduce cybersecurity risks, and expand government control in ways that enable censorship.

Data Sovereignty in Focus

Data sovereignty or localization initiatives have accelerated, especially in Europe, amid ongoing tensions over U.S. trade policies and concerns about privacy and surveillance. European regulators, wary of American tech giants, have tightened rules on how data is stored and shared. The EU’s General Data Protection Regulation (GDPR) remains the most prominent example, restricting cross-border data transfers and imposing stiff fines on companies that fail to comply.

The State Department cable cited GDPR as “unnecessarily burdensome” and highlighted China’s restrictive data policies as an example of how technology rules can expand geopolitical influence. Beijing, it noted, bundles infrastructure projects with policies that provide access to international data for surveillance and strategic leverage.

Diplomatic Action Plan

The cable, labeled as an “action request,” instructed diplomats to track proposals that could limit cross-border data flows and to counter regulations deemed excessive. Talking points included promotion of the Global Cross-Border Privacy Rules Forum, a multinational initiative launched in 2022 by the United States, Mexico, Canada, Australia, and Japan to support free flow of data while ensuring privacy protections.

This directive follows a pattern of U.S. opposition to European digital regulation. Last year, diplomats were ordered to challenge the EU’s Digital Services Act, aimed at making the internet safer by forcing social media firms to remove illegal content. The U.S. is also reportedly planning an online portal to help users bypass content moderation, including restrictions on material flagged as hate speech or terrorist propaganda.

Analysis: A More Assertive U.S. Digital Strategy

The cable reflects a strategic shift toward actively protecting the interests of U.S. tech companies globally. While previous administrations attempted to engage Europe diplomatically, the current approach pressures foreign governments to loosen privacy and data storage regulations that could hinder U.S. business.

By framing data sovereignty laws as a threat to AI development, cybersecurity, and civil liberties, the administration is positioning the free flow of data as a cornerstone of U.S. economic and technological influence. At the same time, rising competition from China in digital infrastructure and AI adds urgency, highlighting the geopolitical stakes of controlling international data flows.

The broader implication is a growing clash between national data policies and global digital commerce. As countries enact stricter rules to protect citizens’ data, U.S. tech firms and policymakers are increasingly asserting that global interoperability and AI innovation must take priority, signaling potential tensions in transatlantic and international digital governance for years to come.

With information from Reuters.


Trump’s plan for rising energy costs: Pump oil, make data centers pay

Energy affordability was in the spotlight during President Trump’s lengthy and at times rambling State of the Union address Tuesday evening as the president promised to bring down electricity prices in an effort to assuage voter concerns about rising costs.

The president announced a new “ratepayer protection pledge” to shield residents from higher electricity costs in areas where energy-thirsty artificial intelligence data centers are being built. Trump said major tech companies will “have the obligation to provide for their own power needs” under the plan, though the details of what the pledge actually entails remain vague.

“We have an old grid — it could never handle the kind of numbers, the amount of electricity that’s needed, so I am telling them they can build their own plant,” the president said. “They’re going to produce their own electricity … while at the same time, lowering prices of electricity for you.”

The announcement comes as polling shows Americans are dissatisfied with the economy and concerned about the cost of living. Experts on both sides of the political spectrum have said the energy affordability issue could translate to poor outcomes for Republicans in the midterm elections this November, as it did in a few key races in New Jersey, Virginia and Georgia last year.

While Trump has focused on ramping up domestic production of oil, gas and coal, residential electric bills have been soaring — jumping from an average of 15.9 cents per kilowatt-hour in January 2025 to 17.2 cents at the end of December, according to the U.S. Energy Information Administration.

One year into his second term as president, Trump has vastly changed the federal landscape when it comes to energy and the environment, reversing many of the Biden administration’s efforts to prioritize electrification initiatives and investments in renewable energy via the Inflation Reduction Act and Bipartisan Infrastructure Law.

Among several changes, Trump’s administration has slashed funding for solar programs, ended federal tax credits for electric vehicles and canceled grants for offshore wind power — even going so far as to try to halt some such projects that were nearing completion along the East Coast.

Trump has also championed fossil fuel production and on Tuesday doubled down on his “drill baby drill” agenda, touting lower gasoline prices, increased production of American oil and new imports of oil from Venezuela.

Many of the president’s efforts are designed to loosen Biden-era regulations that he has said were burdensome, ideologically motivated and expensive for taxpayers.

Trump has taken direct aim at California, which has long been a leader on the environment. Last year, the president moved to block California’s long-held authority to set stricter tailpipe emission standards than the federal government — an ability that helped the state address historical air quality issues and also underpinned its ambitious ban on the sale of new gas-powered cars in 2035.

Trump also slashed $1.2 billion in federal funding for California’s effort to develop clean hydrogen energy while leaving intact funding for similar projects in states that voted for him. In November, his administration announced that it will open the Pacific Coast to oil drilling for the first time in nearly four decades, a move the state vowed to fight.

But perhaps no issue has come across voters’ kitchen tables more than energy affordability.

So far this term, Trump has canceled or delayed enough projects to power more than 14 million homes, according to a tracker from the nonprofit Climate Power. The group’s senior advisor, Jesse Lee, described the president’s data center announcement as a “toothless, empty promise based on backroom deals with his own billionaire donors.”

“Making it worse, Trump is continuing to block clean-energy production across the board — the only sources that can keep up with demand, ensure utility bills don’t keep skyrocketing, and prevent massive new amounts of pollution,” Lee said in a statement.

Earlier this month, Trump’s Environmental Protection Agency repealed the endangerment finding, the U.S. government’s 2009 affirmation that greenhouse gases are harmful to human health and the environment, in what officials described as the single largest act of deregulation in U.S. history. The finding formed the foundation for much of U.S. climate policy. The EPA also loosened guidelines around emissions from coal power plants, including mercury and other dangerous pollutants.

The president’s environmental record so far is “written in rollbacks that put the interests of some corporate polluters above the health of everyday Americans,” read a statement from Marc Boom, senior director of the Environmental Protection Network, a group composed of more than 750 former EPA staff members and appointees.

Further, Trump has worked to undermine climate science in general, often describing global warming as a “hoax” or a “scam.” During his first year in office, he fired hundreds of scientists working to prepare the National Climate Assessment, laid off staffers at the National Oceanic and Atmospheric Administration and dismantled the National Center for Atmospheric Research, one of the world’s leading climate and weather research institutions, among many other efforts.

In all, the administration has taken or proposed more than 430 actions that threaten the environment, public health and the ability to confront climate change, according to a tracker from the nonprofit Natural Resources Defense Council.

The opposition’s choice of rebuttal speaker shows how seriously it is taking energy affordability: Virginia Gov. Abigail Spanberger focused heavily on the issue during her campaign against Republican Lt. Gov. Winsome Earle-Sears last year, including vows to expand solar energy projects and technologies such as fusion, geothermal and hydrogen. Virginia is home to more than a third of all data centers worldwide.


Gaza death toll exceeds 75,000 as independent data verify loss | Israel-Palestine conflict

The true human cost of Israel’s genocidal war on the Gaza Strip has far exceeded previous official estimates, with independent research published in the world’s leading medical journals verifying more than 75,000 “violent deaths” by early 2025.

The findings, emerging from a landmark series of scientific papers, suggest that administrative records from the Gaza Ministry of Health (MoH) represent a conservative “floor” rather than an overcount, and provide a rigorous bedrock to the scale of Palestinian loss.

The Gaza Mortality Survey (GMS), a population-representative household study published in The Lancet Global Health, estimated 75,200 “violent deaths” between October 7, 2023 and January 5, 2025. This figure represents approximately 3.4 percent of Gaza’s pre-conflict population of 2.2 million and indicates that the 49,090 “violent deaths” reported by the MoH for the same period understate the toll by about 34.7 percent.

The Gaza Health Ministry estimates that as of January 27 this year, at least 71,662 people have been killed since the start of the war. Of those, 488 people have been killed since the declaration of a ceasefire in the Gaza Strip on October 10, 2025.

Israel has consistently questioned the ministry’s figures, but an Israeli army official told journalists in the country in January that the army accepted that about 70,000 people had been killed in Gaza during the war.

Despite the higher figure, researchers noted that the demographic composition of casualties – where women, children, and the elderly comprise 56.2 percent of those killed – remains remarkably consistent with official Palestinian reporting.

[Interactive graphic: Gaza death toll exceeds 75,000, Lancet study (Al Jazeera)]

Scientific validation of the toll

The GMS, which interviewed 2,000 households representing 9,729 individuals, provides a rigorous empirical foundation for a death toll.

Michael Spagat, a professor of economics at Royal Holloway University of London and the study’s lead author, found that while MoH reporting remains reliable, it is inherently conservative due to the collapse of the very infrastructure required to document death.

Notably, this research advances upon findings published in The Lancet in January 2025, which used statistical “capture-recapture” modelling to estimate 64,260 deaths during the war’s first nine months.
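
In its simplest two-list form, capture-recapture estimation infers how many deaths were missed from how much two independently compiled lists overlap. The sketch below is a minimal illustration with hypothetical counts; it is not the Lancet study’s actual multi-list model or data:

```python
# Lincoln-Petersen capture-recapture sketch (two independent lists, hypothetical counts).
# The less two independent lists overlap, the more deaths both are likely to have missed.
list_a = 6000    # deaths recorded in list A (hypothetical)
list_b = 4000    # deaths recorded in list B (hypothetical)
overlap = 2400   # deaths appearing on both lists (hypothetical)

# Classic Lincoln-Petersen estimator: N ≈ (n_A * n_B) / m
estimated_total = list_a * list_b / overlap
print(f"Estimated total deaths: {estimated_total:,.0f}")  # 10,000 in this toy example
```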

While that earlier study relied on probability to flag undercounts, this report shifts from mathematical estimation to empirical verification through direct household interviews. It extends the timeline through January 2025, confirming a violent toll exceeding 75,000 and quantifying, for the first time, the burden of “non-violent excess mortality”.

According to a separate commentary in the same publication, the systematic destruction of hospitals and administrative centres has created a “central paradox” where the more devastating the harm to the health system, the more difficult it becomes to analyse the total death toll.

Verification is further hindered by thousands of bodies still buried under rubble or mutilated beyond recognition. Beyond direct violence, the survey estimated 16,300 “non-violent deaths”, including 8,540 “excess” deaths caused directly by the deterioration of living conditions and the blockade-induced collapse of the medical sector.

Researchers highlighted that the MoH figures appear to be conservative and reliable, dispelling misinformation campaigns aimed at discrediting Palestinian casualty data. “The validation of MoH reporting through multiple independent methodologies supports the reliability of its administrative casualty recording systems even under extreme conditions,” the study concluded.

A decade of reconstructive backlogs

While the death toll continues to mount, survivors face an unprecedented burden of complex injury that Gaza’s decimated healthcare system is no longer equipped to manage. A predictive, multi-source model published in eClinicalMedicine quantified 116,020 cumulative injuries as of April 30, 2025.

The study, led by researchers from Duke University and Gaza’s al-Shifa Hospital, estimated that between 29,000 and 46,000 of these injuries require complex reconstructive surgery. More than 80 percent of these injuries resulted from explosions, primarily air attacks and shelling in densely populated urban zones.

The scale of the backlog is staggering. Ash Patel, a surgeon and co-author of the study, noted that even if surgical capacity were miraculously restored to pre-war levels, it would take approximately another decade to work through the estimated backlog of predicted reconstructive cases. Before the escalation, Gaza had only eight board-certified plastic and reconstructive surgeons for a population exceeding 2.2 million people.

The collapse of the health system

The disparity between reconstructive need and capacity is exacerbated by what researchers describe as the “systematic destruction” of medical infrastructure. By May 2025, only 12 of Gaza’s 36 hospitals remained capable of providing care beyond basic emergency triage, with approximately 2,000 hospital beds available for the entire population, down from more than 3,000 beds before the war.

“There is little to no reconstructive surgery capacity left within Gaza,” the research concluded, warning that specialised expertise like microsurgery is almost absent. The clinical challenge is further compounded by Israel’s use of incendiary weapons, which produce severe burns alongside blast-related fractures.

The long-term effect of these injuries is often irreversible. Without prompt medical treatment, patients face high risks of wound infection, sepsis, and permanent disability. The data indicate that tens of thousands of Palestinians will remain with surgically addressable disabilities for life unless there is a huge international increase in reconstructive capacity and aid.

[Interactive graphic: Two years of Gaza, hospitals destroyed and damaged]

The ‘grey zone’ of mortality

Writing in The Lancet Global Health, authors Belal Aldabbour and Bilal Irfan observed a growing “grey zone” in mortality where the distinction between direct and indirect death becomes blurred. Patients who die of sepsis months after a blast, or from renal failure after a crushing injury because they cannot access clean water or surgery, occupy a space that risks understating the true lethality of military attacks.

Conditions have only deteriorated since the data collection periods. By late 2025, forced evacuations covered more than 80 percent of Gaza’s area, with northern Gaza and Rafah governorates facing full razing by Israeli forces. Famine was declared in northern Gaza in August 2025, further reducing the physiological reserve of injured survivors and complicating any surgical recovery.

This series of independent studies serves as an urgent call for accountability and an immediate cessation of hostilities. “The healthcare infrastructure in Gaza is being repeatedly decimated by attacks despite protection by international humanitarian law,” researchers stated. They underscored that the only way to prevent the reconstructive burden from growing further is an immediate end to attacks against civilians and vital infrastructure.


Analysis: Will Big Tech’s colossal AI spending crush Europe’s data sovereignty?

Several Big Tech companies have reported earnings in recent weeks and provided estimates for their spending in 2026, which leading analysts have supplemented with projections of their own.


The data point that seems to have caught Wall Street’s attention the most is the estimated capital expenditure (CapEx) for this year, which collectively represents an investment of over $700bn (€590bn) in AI infrastructure.

That is more than the entire 2025 nominal GDP of Sweden, one of Europe’s largest economies, according to IMF estimates.

Global chip sales are also projected to reach $1tn (€842bn) for the first time this year, according to the US Semiconductor Industry Association.

In addition, major banks and consulting firms, such as JPMorgan Chase and McKinsey, project that total AI CapEx will surpass $5tn (€4.2tn) by 2030, driven by “astronomical demand” for compute.

CapEx refers to funds a company spends to build, improve or maintain long-term assets like property, equipment and technology. These investments are meant to boost the firm’s capacity and efficiency over several years.

The expenditure is also not fully deducted in the same year. CapEx costs are capitalised on the balance sheet and gradually expensed through depreciation, representing a key indicator of how a company is investing in its future growth and operational strength.
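
As a minimal illustration of how capitalised spending flows into reported earnings over time, the sketch below applies straight-line depreciation to hypothetical figures; it is not any particular company’s actual accounting treatment:

```python
# Straight-line depreciation sketch: a hypothetical $60bn data-centre build,
# expensed evenly over an assumed five-year useful life.
capex = 60e9            # hypothetical up-front investment, in dollars
useful_life_years = 5   # assumed useful life of the assets

annual_depreciation = capex / useful_life_years
for year in range(1, useful_life_years + 1):
    book_value = capex - annual_depreciation * year
    print(f"Year {year}: depreciation expense ${annual_depreciation / 1e9:.0f}bn, "
          f"remaining book value ${book_value / 1e9:.0f}bn")
```

In other words, the cash leaves up front, but the hit to reported profit is spread over the years the assets are expected to remain useful.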

The leap this year confirms a definitive pivot that began in 2025, when Big Tech is estimated to have spent around $400bn (€337bn) on AI CapEx.

As Nvidia founder and CEO Jensen Huang has repeatedly stated, including at the World Economic Forum in Davos last month, we are witnessing “the largest infrastructure build-out in human history”.

Hyperscalers bet the house

At the top of the spending hierarchy for 2026 sits Amazon, which alone is guiding to invest a mammoth $200bn (€170bn).

To put the number into perspective, the company’s individual AI CapEx guidance for this year surpasses the combined nominal GDP of the three Baltic countries in 2025, according to IMF projections.

Alphabet, Google’s parent company, follows with $185bn (€155bn), while Microsoft and Meta are set to deploy $145bn (€122bn) and $135bn (€113bn) respectively.

Oracle also raised its 2026 CapEx to $50bn (€42.1bn), nearly $15bn (€12.6bn) above earlier estimates.

Additionally, Tesla projects that it will roughly double its spending to almost $20bn (€16.8bn), primarily to scale its robotaxi fleet and advance the development of the Optimus humanoid robot.

Another of Elon Musk’s companies, xAI, will also spend at least $30bn (€25.2bn) in 2026.

A new $20bn (€16.8bn) data centre named MACROHARDRR will be built in Mississippi, which Governor Tate Reeves stated is “the largest private sector investment in the state’s history”.

xAI will also expand the so-called Colossus, a cluster of data centres in Tennessee that has been described by Musk as the world’s largest AI supercomputer.

Furthermore, the company was acquired by SpaceX in an all-stock transaction at the start of this month.

The merger valued SpaceX at $1tn (€842bn) and xAI at $250bn (€210bn), creating an entity worth $1.25tn (€1.05tn), reputedly the largest private company by valuation in history.

There are also reports that SpaceX intends to go public sometime this year, with Morgan Stanley reportedly in talks to manage an offering that would now include exposure to xAI.

Elon Musk stated that the goal is to build an “integrated innovation engine” combining AI, rockets and satellite internet, with long-term plans that include space-based data centres powered by solar energy.

Conversely, Apple continues to lag in spending with “only” a projected $13bn (€10.9bn).

However, the company announced a multi-year partnership with Google last month to integrate Gemini AI models into the next generation of Apple Intelligence.

Specifically, the collaboration will focus on overhauling Siri and enhancing on-device AI features. One could therefore say that Apple is outsourcing much of the investment it needs to stay competitive in AI development.

As for Nvidia, it will report earnings and release projections on 25 February.

The company is primarily in the business of selling AI chips and is expected to capture the lion’s share of Big Tech’s spending, particularly for the build-out of data centres.

In last August’s earnings call, CEO Jensen Huang estimated a cost per gigawatt of data centre capacity between $50bn (€42.1bn) and $60bn (€50.5bn), with about $35bn (€29.5bn) of each investment going towards Nvidia hardware.
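
Taken at face value, those figures allow a rough back-of-envelope conversion from capital spending to data-centre capacity. The sketch below is purely illustrative and assumes, unrealistically, that an entire hypothetical CapEx budget goes to data centres:

```python
# Hypothetical back-of-envelope sketch using the quoted cost-per-gigawatt range.
cost_per_gw_low, cost_per_gw_high = 50e9, 60e9  # quoted range: $50bn-$60bn per gigawatt of capacity
nvidia_share_per_gw = 35e9                      # quoted: ~$35bn of each gigawatt goes to Nvidia hardware

capex_budget = 200e9  # hypothetical annual CapEx, assumed to go entirely to data centres

gw_low = capex_budget / cost_per_gw_high
gw_high = capex_budget / cost_per_gw_low
print(f"Implied capacity: {gw_low:.1f} to {gw_high:.1f} GW")
print(f"Implied Nvidia hardware spend: "
      f"${gw_low * nvidia_share_per_gw / 1e9:.0f}bn to ${gw_high * nvidia_share_per_gw / 1e9:.0f}bn")
```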

The great capital rotation

Wall Street has had mixed feelings about the enormous spending Big Tech companies have planned for 2026.

On the one hand, investors understand the necessity and urgency of developing a competitive edge in the artificial intelligence age.

On the other, the sheer scale of the spending has also spooked some shareholders. The market’s tolerance hinges on demonstrable ROI from this year onwards, as the investments are also increasingly financed with massive debt raises.

Morgan Stanley estimates that hyperscalers will borrow around $400bn (€337bn) in 2026, more than double the $165bn (€139bn) that was loaned out in 2025.

This surge could push the total issuance of high-grade US corporate bonds to a record $2.25tn (€1.9tn) this year.

Currently, projected AI revenue for 2026 is nowhere near matching the spending, and there are valid concerns: for instance, the possibility of hardware depreciating rapidly as newer chips arrive, and high operational costs such as energy usage.

It can be confidently stated that the numbers have a heavy reliance on future success.

As Google CEO Sundar Pichai acknowledged this month, there are “elements of irrationality in the current spending pace”.

Back in November, Alex Haissl, an analyst at Rothschild & Co, became a dissenting voice as he downgraded ratings for Amazon and Microsoft.

In a note to clients, the analyst wrote “investors are valuing Amazon and Microsoft’s CapEx plans as if cloud-1.0 economics still applied”, referring to the low-cost structure of cloud-based services that allowed Big Tech firms to scale in the last two decades.

However, the analyst added “there are a few problems that suggest the AI boom likely won’t play out in the same way, and it is probably far more costly than investors realise”.

This view is also shared by Michael Burry, best known as one of the first investors to predict and profit from the subprime mortgage crisis in 2008. Burry has argued that the current AI boom is a potential bubble, pointing to unsustainable CapEx.

Big Tech’s AI race is funded by a tremendous amount of leverage. Whether this strategy will pay off, and which companies will be the winners and the losers, only time will tell.

At the moment, Nvidia certainly seems to be a great beneficiary. Apple, meanwhile, is taking a distinct approach, increasing its reliance on third parties through the partnership with Google instead of massively scaling its own spending. It is a different trade-off.

Europe’s industrial deficit

Amid all this spending, urgent questions have also been raised about Europe’s ability to compete in a race that has become a battle of balance sheets.

For the European Union, the transatlantic contrast is sobering. While American firms are mobilising nearly €600bn in a single year, the EU’s coordinated efforts do not even match the financial firepower of the lowest spender among the US tech titans.

Brussels has attempted to rally with the AI Factories initiative, and the AI Continent Action Plan launched last April, which aim to mobilise public-private investments.

However, the numbers tell a stark story. Total European spending on sovereign cloud data infrastructure is forecast to reach just €10.6bn in 2026.

While this is a respectable 83% increase year-on-year, it remains a rounding error compared to the US AI build-out.

Last year, as those initiatives were being discussed, the CEO of the French unicorn Mistral AI, Arthur Mensch, stated that “US companies are building the equivalent of a new Apollo program every year”.

Mensch also added that “Europe is building excellent regulation with the AI Act, but you cannot regulate your way to computing supremacy”.

Mistral represents one of the few flickers of European resistance in the AI race. The French company is employing the same strategy as most of Big Tech and aggressively expanding its physical footprint.

In September 2025, Mistral AI raised a €1.7bn Series C at a valuation of almost €12bn, with the Dutch semiconductor giant ASML leading the round by investing €1.3bn on its own.

During the World Economic Forum in Davos last month, Mistral’s CEO confirmed a €1bn CapEx plan for 2026.

Just last week, the company also announced a major €1.2bn investment to build a data centre in Borlänge, Sweden.

In partnership with the Swedish operator EcoDataCenter, the facility will be designed to offer “sovereign compute” compliant with the EU’s strict data standards while leveraging Sweden’s abundant green energy.

Set to open in 2027, this data centre will provide the high-performance computing required to train and deploy Mistral’s next-generation AI models.

This is an important move for the company, as it is the first infrastructure project outside France, and it is also a core venture for European data sovereignty.

Meanwhile, US tech titans are attempting to placate European regulators by offering “sovereign-light” solutions. Several Big Tech projects have been rolled out for “localised cloud zones”, for example in Germany and Portugal, promising data residency.

However, critics argue these remain technically dependent on US parent companies, leaving the European industry vulnerable to the whims of the American economy and foreign policy.

As 2026 unfolds, the stakes are clear. The US is betting the house, and its credit rating, on AI dominance.

Europe, cautious and capital-constrained, is hoping that targeted investments and regulation will be enough to carve out a sovereign niche in a world increasingly run on American technology.
