eco
30.08.2024

heise Editor Susanne Nolte on Current Debates in the Server Industry, the Influence of AI and the S2N

Ms Nolte, could you give us an overview of the S2N and what topics are in focus? What audience is the event aimed at?

The two-day heise Conference S2N covers current topics in data centre IT in four parallel tracks: these range from migrating VMware servers to Proxmox and performance analysis for software-defined storage to managing EVPN fabrics. Practical topics such as ransomware incident response, optimising the performance of resource-hungry applications, accurate inventorying with Netbox as a single source of truth in the data centre, and secure data export are also covered.

The S2N, as a training and networking event, is aimed at data centre and IT managers, CTOs, IT administrators, system architects, and all IT experts dealing with data centre IT – whether in small or large server landscapes, storage systems, or networks.

How does this year’s S2N format differ from the previous storage2day?

The focus of storage2day was on storage and storage networks. Today, however, these can no longer be considered separately from server and network design or from the relevant system administration concepts. Developments such as software-defined infrastructure, hyper-converged infrastructure, and configuration automation are breaking down the traditional boundaries between servers, storage systems, and network technology. Applications such as AI training and inference also require a holistic view of data centre IT if they are to be implemented efficiently.

What current trends and innovations in storage technologies and network infrastructures will be discussed at S2N?

In networking, EVPN-VXLAN and the increase in east-west traffic are the key topics, while performance analysis and optimisation remain ever-relevant. In storage, software-defined approaches are becoming increasingly prominent, and ransomware unfortunately continues to be a pressing issue. Key topics here include air gaps, immutable storage, incident response, and cold starts.

How are new technologies such as NVMe, cloud storage and 5G influencing discussions at S2N?

New technologies are of course always a topic, as are insights into developments in research labs. New technologies being introduced into data centres also change the IT landscape. NVMe, for instance, allows for different server designs and new I/O behaviours, and even enables new network and storage designs with NVMe-over-Fabrics.

Cloud computing has significantly changed IT, particularly in terms of how applications or IT infrastructure are “consumed” today. For admins and IT decision-makers, however, there are many more considerations involved. In addition to questions such as “What am I using this for?”, there are always strategic decisions to be made concerning vendor lock-in, cost traps, and data protection: decisions between public, hybrid, private, or multi-cloud are fundamental choices with corresponding consequences. Then there are many technical questions to address, such as: “How can I manage my cloud and on-premises resources in a unified way?” I see 5G more in the edge area, where the number of small or edge data centres is also growing.

What challenges do you currently see in the industry and how can they be addressed?

First, new regulations such as NIS2 and CSRD are being introduced. These pose significant challenges for the IT and DC industries: on the one hand, Europe urgently needs to raise its cybersecurity standards. On the other hand, companies are now being held accountable for their production processes and supply chains and are required to take active measures in the event of deficiencies – whether related to sustainability or corporate social responsibility.

Calculating Scope 3 emissions for internal IT alone is anything but trivial. Obtaining such data for outsourced equipment, such as web servers, backup, and secondary systems, as well as for all cloud services used, is even more difficult. Many smaller companies are unaware that the CSR directive also applies to them. However, they too must provide their customers, who are held accountable, with ESG data for their services. The same applies to NIS2, as this regulation also indirectly affects many data centres and Internet service providers as part of the supply chain.

Companies should recognise and seize the opportunities that come with implementing the guidelines early and thoughtfully and with making a sincere commitment. The resulting competitive advantage should not be underestimated – not only for the companies directly affected by the regulations, but above all for IT and DC service providers.

The integration of AI into enterprise IT also presents entirely new challenges for infrastructure operators, as well as for the IT departments responsible for providing and managing AI resources. AI applications demand servers, storage, and networks in a different way than traditional business applications do; AI is closer to HPC (high-performance computing). This means GPU power instead of CPU power, high, parallelised data throughput with low latency, and, in the network, east-west traffic instead of north-south traffic. Using AI resources from the cloud, or moving them there, will certainly change the demands on Internet infrastructure, just as cloud computing has already done.

How will innovative technologies such as AI and quantum computing impact the sector? What is your personal outlook for the industry?

I think that quantum computing will complement binary computing in this century, but it certainly will not replace it. Currently, quantum computing is in the Noisy Intermediate-Scale Quantum (NISQ) era: there is still a lack of sufficient qubits and of resources for error correction. Even in the next phase, fault-tolerant quantum computing (FTQC), it will only be suitable for specific applications, such as calculations to optimise processes or combinatorial tasks, for example in the pharmaceutical industry or in materials development.

The first real quantum computing (QC) applications exploit the unpredictability of results, where error correction plays no role, such as generating genuinely unpredictable random numbers. This is precisely what binary computers, as deterministic systems, cannot do. They therefore often draw on human input, such as a user’s typing behaviour, whose timing is never truly regular at the millisecond level, to generate random numbers. For other tasks, however, binary systems perform far better than quantum systems.
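To illustrate the principle Ms Nolte describes, here is a minimal sketch – not from the interview, and not a production-grade random number generator – of how a deterministic system can harvest entropy from the irregular timing of human input. The function name and the hashing step are illustrative assumptions; real systems feed such timing jitter into an operating-system entropy pool instead.

```python
# Illustrative sketch only: harvest entropy from the irregular timing of
# human keystrokes, as deterministic binary systems do. Not a production RNG.
import hashlib
import time

def random_bytes_from_typing(rounds: int = 4) -> bytes:
    """Derive 32 pseudo-random bytes from inter-keypress timing jitter."""
    samples = []
    for i in range(rounds):
        start = time.perf_counter_ns()
        input(f"[{i + 1}/{rounds}] Press Enter at an irregular moment... ")
        # The low-order digits of the measured delay vary unpredictably
        # at the millisecond level and below.
        samples.append(time.perf_counter_ns() - start)
    raw = b"".join(s.to_bytes(8, "little") for s in samples)
    return hashlib.sha256(raw).digest()  # condense the jitter into 256 bits

if __name__ == "__main__":
    print(random_bytes_from_typing().hex())
```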

And the occasional fear that quantum computers will crack all digital keys in no time and leave us without suitable encryption is, in my view, completely unfounded. NIST started early and is well advanced with its selection process for quantum-safe encryption and authentication, which has been running for several years. Several strong methods have made it to the final round and are now being analysed further. A quantum computer capable of cracking even today’s common keys is not yet on the horizon.

All in all, I believe the industry will remain true to itself, with its hypes and slumps as well as its ability to respond to opportunities and demands and to adapt.

Thank you very much for the interview, Ms Nolte!

More information about this year’s S2N can be found here.
