TL;DR. OpenAI is scaling its Stargate infrastructure to power the AGI era — a massive concentration of compute on US soil. For European enterprises, this expansion redefines the terms of digital dependency: AI sovereignty is no longer just about models, but about the physical infrastructure running them.
What just happened
On 29 April 2026, OpenAI published a document titled Building the compute infrastructure for the Intelligence Age. The message is unambiguous: Stargate, the data center project announced earlier this year, is scaling up. According to the official announcement, OpenAI is adding new compute capacity to meet growing AI demand and to power AGI systems. All of this infrastructure is being deployed on US soil.
Why this matters for European businesses
Until now, European AI dependency was primarily a software issue — proprietary models, closed APIs. With Stargate, it becomes physical. When a Belgian or German company accesses OpenAI's AGI agents, it relies on servers located outside European jurisdiction, governed by US law, operated by an entity whose trajectory is now explicitly oriented toward AGI. The GDPR provides a layer of personal data protection, but does not address dependency on compute resources that remain outside European regulatory reach.
A parallel dynamic, often overlooked, is accelerating at the same time. According to an analysis published on the same day by Hugging Face, AI model evaluation is becoming a new computational bottleneck. In concrete terms: even measuring a model's performance now requires massive compute resources. The dependency thus extends from training to evaluation — two critical steps in the AI chain that largely escape European control.
Three opportunities for European and Belgian leaders
- Seize the open-model window. On 29 April 2026, IBM published the Granite 4.1 series — open models designed for deployment in sovereign environments. These offer a concrete alternative for use cases where compute traceability and data residency carry regulatory or competitive value.
- Revisit data residency clauses in AI cloud contracts. Stargate's scale-up strengthens the negotiating leverage of any buyer who can demonstrate a viable alternative — open-weight model, European hosting, or hybrid architecture. That renegotiation window narrows as dependency normalises.
- Include the physical layer in vendor risk audits. Audit committees assessing AI risk purely at the model or data layer are missing a critical dimension: the jurisdiction of the data centers, their geographic location, and the growing concentration among a handful of US actors.
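The physical-layer audit described above can start as a simple structured inventory. Below is a minimal sketch; the vendor names, field values, and the `exit_clause` field are all hypothetical placeholders, not real contract data:

```python
# Sketch of a vendor physical-layer inventory.
# All entries are illustrative; a real audit would populate this from
# contracts, DPAs, and vendor documentation.
from dataclasses import dataclass

@dataclass
class AIVendor:
    name: str
    datacenter_region: str  # where the workload physically runs
    jurisdiction: str       # law governing the operator
    exit_clause: bool       # contract allows migration of data/weights

VENDORS = [
    AIVendor("proprietary-llm-api", "US", "US", False),   # hypothetical
    AIVendor("eu-hosted-open-model", "EU", "EU", True),   # hypothetical
]

# The figure an audit committee actually needs: what share of AI vendors
# sits under a non-EU jurisdiction?
outside_eu = [v.name for v in VENDORS if v.jurisdiction != "EU"]
share = len(outside_eu) / len(VENDORS)
print(f"Non-EU jurisdiction share: {share:.0%} ({', '.join(outside_eu)})")
```

Even a two-column spreadsheet version of this inventory surfaces the blind spots; the point is to make jurisdiction a first-class field in vendor risk data, not a footnote in a contract annex.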
Three risks if Europe stays passive
- Infrastructural lock-in within two years. If AGI architectures become standardised on Stargate before Europe has credible alternatives, migration costs will become prohibitive for most organisations.
- Evaluation asymmetry. If the compute resources needed to evaluate AI models are themselves concentrated in the US and China — as the Hugging Face analysis suggests — European regulators may find themselves unable to independently certify or audit the systems they are mandated to govern.
- Competitive disadvantage in high-value segments. Sectors where speed of access to AGI agents will be decisive — finance, pharma, advanced logistics — will be structurally disadvantaged if their compute infrastructure is subject to regulatory delays or data transfer restrictions imposed from outside Europe.
A field observation
Large-scale AI data center construction is not a new phenomenon, but OpenAI's framing has shifted. The conversation is no longer about infrastructure for language models — it is about infrastructure for AGI. That semantic shift has practical consequences: it justifies massive investment and dedicated energy infrastructure, and above all a logic of concentration that leaves little room for regional players without comparable funding. Europe managed to create Mistral. It has not yet created a European equivalent of Stargate.
Three levers to activate this week
- Map the physical layer of your current AI vendors. For each active AI contract, identify the location of the data centers used, the applicable jurisdiction, and the data transfer clauses. This work takes one to two audit days and frequently reveals blind spots that legal teams have not yet addressed.
- Test a Granite 4.1 model on an internal use case. IBM has made the Granite 4.1 series publicly available. Benchmarking it against an existing document or analytics pipeline quantifies the performance gap versus a proprietary solution and grounds any diversification decision in real data.
- Put infrastructure resilience on the next board agenda. This is not a technical question — it is a strategic one. What percentage of the organisation's AI value chain depends on infrastructure outside GDPR reach and European sovereignty? That figure deserves to be known before concentration becomes irreversible.
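The benchmarking lever above can be prototyped before committing to any model download. Here is a minimal harness sketch: the case set and the stub model are hypothetical, and in practice the callable would wrap either a locally hosted open-weight checkpoint or the incumbent proprietary API behind the same interface:

```python
# Minimal benchmark harness (sketch): score any str -> str "model" on a
# fixed case set using exact-match accuracy. The stub below stands in for
# a real model call; all names and cases are illustrative.
from dataclasses import dataclass

@dataclass
class BenchResult:
    model_name: str
    accuracy: float  # fraction of exact-match answers

def run_benchmark(model_name, model, cases):
    hits = sum(
        1 for prompt, expected in cases
        if model(prompt).strip().lower() == expected.strip().lower()
    )
    return BenchResult(model_name, hits / len(cases))

# Tiny illustrative case set, e.g. drawn from an internal document pipeline.
CASES = [
    ("What currency does Belgium use?", "euro"),
    ("Is the GDPR an EU regulation? (yes/no)", "yes"),
]

def stub_open_model(prompt):
    # Stand-in for the real call (local checkpoint or API client).
    answers = {
        "What currency does Belgium use?": "Euro",
        "Is the GDPR an EU regulation? (yes/no)": "yes",
    }
    return answers.get(prompt, "")

result = run_benchmark("local-open-model", stub_open_model, CASES)
print(f"{result.model_name}: {result.accuracy:.0%}")
```

Running the same harness over an open-weight candidate and the incumbent proprietary endpoint produces directly comparable figures — which is what turns a diversification discussion into a data-driven decision rather than a matter of opinion.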
Where does your organisation stand?
The question raised by Stargate's expansion is not "should we use OpenAI's AI?" — it is "with what architecture, from which territory, and with what exit capacity?" The answer to that question determines tomorrow's room for manoeuvre.
If this analysis resonates with you, I publish a piece of this calibre every day on digital innovation and enterprise AI. 👉 Get the next one straight to your inbox — sign-up takes ten seconds, and each edition is read before 9 a.m. by leaders of European SMEs, mid-caps and public institutions.
This article is part of the Neurolinks AI & Automation blog.