December 22, 2024

Your AI, Your Rules: The Case for Private Infrastructure


As organizations rush to embrace GenAI, they must protect proprietary data and reduce regulatory risk.

According to a recently published analysis from IDC, sponsored by Equinix, private AI infrastructure is a critical part of a responsible AI strategy as organizations look to keep proprietary data safe, reduce regulatory risk and realize gains in performance and cost efficiency.

The rise of private AI is also driving a surge in data center growth, seen in initiatives such as Equinix's recent AUD240 million expansion of its major Sydney and Melbourne facilities.

Dave McCarthy, a research vice president for cloud and infrastructure services at IDC, says his firm projects that GenAI alone will add nearly USD10 trillion to global GDP over the next ten years. Adoption at that scale will require bulletproof solutions for privacy and compliance.

IDC estimates that around one-third of organizations are already investing heavily in GenAI, while another 32% are in the initial phases of testing models and use cases.

These investments will also require a commitment to IT infrastructure, and it is forecast that USD100 billion will be spent on AI-specific infrastructure by 2025.

“This new AI-ready infrastructure will consist of GPU-accelerated services, all-flash storage, and high-speed networks that can scale to meet the needs of data-dense environments,” McCarthy says.

Cloud deployment decisions

This leads to crucial decisions around cloud deployment and how much of the AI workload remains on-premises or in a colocation data center.

“Organizations are realizing they need a variety of infrastructure options to meet the performance, scale and security requirements for AI applications,” says McCarthy.

Private AI, he says, is an architectural approach that can balance the business gains from AI with the practical and legal requirements around privacy and compliance.

It is a concept “rooted in the overall construct of responsible AI” and, in most cases, complements an organization’s AI activity in the public cloud.


Training large language models on publicly available data in the cloud can save time and money, but there are concerns that proprietary corporate data could end up in these public models, which motivates technology leaders to keep it separate.

Another consideration is the power and cooling required for high-performance AI infrastructure, which typically exceeds the capacity of most enterprise data centers.

With the rise of green IT, there are concerns about the energy demands and carbon emissions created by AI and ML model training.

Payoffs with latency

This is where colocation, says McCarthy, finds its “sweet spot.”

“With sustainable data centers designed for dense AI infrastructure and high-speed interconnection to multiple public cloud providers, it creates a best-of-both-worlds scenario,” he says: AI is deployed in a service provider facility while the organization retains complete control over the infrastructure and data.

There are also latency payoffs in deploying private AI infrastructure adjacent to public cloud networks.

While public cloud resources are shared among multiple users, potentially leading to performance fluctuations, private infrastructure offers dedicated resources that ensure consistent, predictable performance and allow some hardware customization.

The adoption of GenAI, says McCarthy, will lead to digital infrastructure strategies that bring together both public and private resources to balance the needs of the business.

“Colocation and interconnection providers will be a key source for digital infrastructure with services designed with the high-density racks and specialized cooling required for AI workloads,” he says.

“This is accomplished with renewable energy that creates operational efficiencies and offsets carbon emissions.”

In conclusion, private AI will be an “essential element” of enterprise AI strategy, deployed alongside cloud-adjacent digital infrastructure providers with a global footprint and an ecosystem of partners capable of delivering AI-ready infrastructure.

In Australia, Equinix’s response has been to add 4,175 cabinets to two of its facilities, SY5 in Sydney’s Alexandria and ME2 at Fisherman’s Bend in Melbourne.

“Since private AI is emerging as the preferred model of adoption, access to high-performance digital infrastructure is essential,” said Guy Danskine, managing director of Equinix Australia.

“This provides organizations with low-latency connectivity to multiple cloud providers, as well as private infrastructure, enabling them to leverage best-of-breed AI models while maintaining localization and control of their data.”



