As AI applications become increasingly widespread, the underlying data centre infrastructure is coming into sharper focus. Particularly where new AI workloads meet established IT and operational structures, technical, organizational, and economic tensions arise. In an eco interview, Thimo Groneberg, Chief Commercial Officer at eco member Polarise, shares insights into the current challenges of operating and scaling AI computing power, highlights the differences between new and existing-site approaches, and outlines which questions companies should ask themselves before putting AI into productive use.
Polarise is working on the infrastructure for AI applications. In your experience, where do the greatest tensions between new requirements and established operating models arise when integrating AI workloads into existing IT and data centre structures?
The challenges extend across all areas, because AI infrastructure is more than just screwing a few GPUs into an existing rack and somehow operating them. Retrofitting existing data centres during live operation for modern AI systems is akin to the proverbial “open-heart surgery” and is nearly impossible. That said, proven business models remain valid – not every new data centre has to be an “AI factory”, and it shouldn’t be. It is no coincidence that this term is steadily emerging as a new category within data centres. More on that later.
In many companies, the demand for AI computing power is growing very rapidly. In your experience, what is the most common reason for scaling failures in practice?
Clearly, it is the availability of infrastructure. That’s why we at Polarise place a special focus on speed as one of the three pillars of our identity, alongside sovereignty and sustainability. Companies in Germany and Europe are currently finding it particularly difficult to find the right infrastructure that meets their requirements in terms of data protection, scalability and performance. Of course, there are many announcements being made these days, but it is foreseeable that these projects will only reach maturity in a year or two. What do we do until then? That is the gap Polarise is closing – because when it comes to AI, Europe has no time to lose.
When building AI data centres, a distinction is often made between new builds and the further development of existing locations. What specific challenges does the use of existing data centre infrastructure for AI workloads entail?
A very particular challenge is the design of power supply and cooling systems. Both are highly dense in the AI environment and require appropriate implementation. However, even infrastructure that is considered sufficient today may already be “outdated” tomorrow with the announcement of the next chip generation. The rapid advances in AI chips and systems make forward-looking planning that is specifically tailored to those systems all the more essential. With our “AI Pods,” we have created a flexible and modular solution that can be rolled out extremely quickly even in the existing structures mentioned.
Many companies are still in the early stages of using AI. What questions should they ask themselves before putting AI workloads into productive use?
We believe it is crucial for companies to have identified a real use case and real added value, rather than doing “anything with AI” just for AI’s sake. The figures speak for themselves: 63% of all companies that have integrated AI into their processes report a direct improvement in their productivity. But this only holds when the groundwork has been done properly beforehand – a clear use case, a clear goal, a clear data pool. Both we at Polarise and our strong partners can help by asking the right questions to ultimately find exactly the right tool, which the customer can then operate on our sovereign platform.
Finally, a look ahead: Which development in the field of AI data centres do you consider most likely?
As already indicated, AI factories will form their own class within data centres in the future due to their special requirements profile. Our vision at Polarise is a decentralised, interconnected network of data centres that are locally integrated to strengthen local ecosystems and communities, while also providing space for AI startups, education and research, and entering into a meaningful circular economy with surrounding residents and companies. Only in this way will we find a model in Europe that is in harmony with the growing demand and the resources available to us.