As the AI market matures, the focus for many businesses is likely to shift from training models to inference, the process of running those models so that they can benefit businesses and consumers.

And as inference grows in importance, Edge computing could come into its own, with infrastructure located close to end users potentially offering a low-latency way to deliver AI-powered services efficiently.

Building for this AI future at the Edge will require a high degree of flexibility, according to Luca Beltramino, chief data center officer at Italian media infrastructure provider Rai Way. Required specifications are likely to vary wildly from deployment to deployment, so operators will need to keep an open mind to arrive at the optimal solution, he says.

In this supplement, we feature more insights from Beltramino and Cambridge Management Consulting’s Duncan Clubb, who spoke to DCD about the requirements for building robust Edge infrastructure. Alongside AI, the duo cover optimal locations for Edge data centers, as well as security, data sovereignty, and sustainability. Elsewhere in the supplement:

  • Can liquid cooling become the great enabler for high-density compute deployments at the Edge?
  • How Viavi is revolutionizing network monitoring with AI-powered analytics, real-time insights, and automation to help businesses master their operations
  • A focus on Duos Edge, which aims to close the digital divide experienced by schools in rural locations across the US