Data centre experts predict 6x increase in DCI bandwidth in 5 years

18th March 2025
Harry Fowle
The rapid growth of AI workloads is driving a major transformation in data centre network infrastructure, with global data centre experts anticipating a six-fold increase in interconnect bandwidth needs over the next five years, according to a study commissioned by Ciena.

The survey, conducted in partnership with Censuswide, queried more than 1,300 data centre decision makers across 13 countries. More than half (53%) of respondents believe AI workloads will place the biggest demand on data centre interconnect (DCI) infrastructure over the next 2-3 years, surpassing cloud computing (51%) and big data analytics (44%).

To meet surging AI demands, 43% of new data centre facilities are expected to be dedicated to AI workloads. With AI model training and inference requiring unprecedented data movement, data centre experts predict a massive leap in bandwidth needs. When asked about the fibre optic capacity needed for DCI, 87% of participants said they will require 800 Gb/s or higher per wavelength.

"AI workloads are reshaping the entire data centre landscape, from infrastructure builds to bandwidth demand," said Jürgen Hatheier, Chief Technology Officer, International, Ciena. "Historically, network traffic has grown at a rate of 20-30% per year. AI is set to accelerate this growth significantly, meaning operators are rethinking their architectures and planning for how they can meet this demand sustainably.”

Creating more sustainable AI-driven networks

Survey respondents confirm there is a growing opportunity for pluggable optics to support bandwidth demands and address power and space challenges. According to the survey, 98% of data centre experts believe pluggable optics are important for reducing power consumption and the physical footprint of their network infrastructure.

Distributed computing

The survey found that, as requirements for AI compute continue to increase, the training of Large Language Models (LLMs) will become more distributed across different AI data centres. According to the survey, 81% of respondents believe LLM training will take place across some level of distributed data centre facilities, requiring DCI solutions to link them together. When asked about the key factors shaping where AI inference will be deployed, the respondents ranked the following priorities:

  • Maximising AI resource utilisation over time (63%)
  • Reducing latency by placing inference compute closer to users at the edge (56%)
  • Data sovereignty requirements (54%)
  • Offering strategic locations for key customers (54%)

Rather than deploying dark fibre, the majority (67%) of respondents expect to use Managed Optical Fiber Networks (MOFN), which utilise carrier-operated high-capacity networks for long-haul data centre connectivity.

"The AI revolution is not just about compute—it’s about connectivity," added Hatheier. "Without the right network foundation, AI’s full potential can’t be realised. Operators must ensure their DCI infrastructure is ready for a future where AI-driven traffic dominates."
