Intelligent CIO Middle East Issue 116 | Page 81

TECH TALK
Drivers and inhibitors
The benefits of building in-house AI data centres include customised infrastructure optimised for specific workloads, enhanced data security and compliance, long-term cost savings by avoiding recurring cloud fees, and greater control over performance and scalability. Enterprises can fine-tune hardware, cooling, and networking for maximum efficiency.
“The challenges involve high initial capital investment, ongoing maintenance, and complex infrastructure management. Enterprises should address power and cooling requirements, hire skilled personnel to manage operations, and keep up with rapid AI hardware advancements,” says ManageEngine’s Vijayarangakannan.
Enterprises that build their own data centres for AI workloads gain greater control over data, enhanced security, and regulatory compliance, aligning with federated governance models.
“They also benefit from cost efficiencies by reducing reliance on cloud providers and optimised performance tailored for AI workloads,” says Qlik’s Mehta.
However, these advantages come with challenges, including high initial investment and operational complexity, governance risks, and scalability limitations compared to cloud AI. Additionally, talent shortages in AI infrastructure management pose long-term sustainability concerns.
“Building an AI data centre is a strategic decision, not just an IT choice. Once a strategy is in place, cost and the ability to obtain GPUs are two major factors to consider,” says VAST Data’s Aziz.
On the plus side, enterprises gain full control over performance, security, and data governance, but the challenges are formidable. AI hardware evolves at such a pace that infrastructure investment and ROI become a moving target. Power and cooling become existential questions, particularly in regions where energy costs are high.
Designing and running an AI-optimised facility requires a blend of infrastructure, AI, and software expertise that few organisations have in-house. This is why many enterprises are leaning towards hybrid models, building what they must, but supplementing with cloud for peak demand or specialised AI services.
Challenges include high upfront costs, complex integration of liquid cooling systems, and significant power demands. Retrofitting existing facilities to support AI workloads requires substantial upgrades to hardware and building infrastructure, which can be complex and costly.
Balancing customisation with operational efficiency is crucial, and enterprises may consider hybrid or colocation strategies to mitigate risks. “Strategic planning, investment in skilled personnel, and adoption of advanced technologies are essential for scalable AI growth,” summarises Vertiv’s Paul.
Haider Aziz, Vice President META, VAST Data
Ian Paul, Hyperscale and Colocation Strategic Segments Director METCA, Vertiv