Dell Technologies Fuels Enterprise AI Innovation with Infrastructure, Solutions and Services


19 May 2025 – Dell Technologies (NYSE: DELL), the world’s No. 1 provider of AI infrastructure,1 announces Dell AI Factory advancements, including powerful and energy-efficient AI infrastructure, integrated partner ecosystem solutions and professional services to drive simpler and faster AI deployments.

Why it matters
AI is now essential for businesses, with 75% of organisations saying AI is key to their strategy2 and 65% successfully moving AI projects into production.3 However, challenges like data quality, security concerns and high costs can slow progress.

The Dell AI Factory approach can be up to 62% more cost effective for inferencing LLMs on-premises than the public cloud4 and helps organisations securely and easily deploy enterprise AI workloads at any scale. Dell offers the industry’s most comprehensive AI portfolio designed for deployments across client devices, data centres, edge locations and clouds.5 More than 3,000 global customers across industries are accelerating their AI initiatives with the Dell AI Factory.6

Dell infrastructure advancements help organisations deploy and manage AI at any scale

Dell introduces end-to-end AI infrastructure to support everything from edge inferencing on an AI PC to managing massive enterprise AI workloads in the data centre.

Dell Pro Max AI PC delivers industry’s first enterprise-grade discrete NPU in a mobile form factor7

The Dell Pro Max Plus laptop with the Qualcomm AI 100 PC Inference Card is the world’s first mobile workstation with an enterprise-grade discrete NPU.8 It offers fast, secure on-device inferencing at the edge for large AI models typically run in the cloud, such as today’s 109-billion-parameter models.

The Qualcomm AI 100 PC Inference Card features 32 AI cores and 64GB of memory, providing the power to meet the needs of AI engineers and data scientists deploying large models for edge inferencing.

Dell redefines AI cooling with innovations that reduce cooling energy costs by up to 60%9

The industry-first Dell PowerCool Enclosed Rear Door Heat Exchanger (eRDHx) is a Dell-engineered alternative to standard rear door heat exchangers. Designed to capture 100% of IT heat generated with its self-contained airflow system, the eRDHx could reduce cooling energy costs by up to 60%10 compared to currently available solutions.

With Dell’s factory integrated IR7000 racks equipped with future-ready eRDHx technology, organisations can:

  • Significantly cut costs and eliminate reliance on expensive chillers, because the eRDHx operates with water temperatures warmer than those of traditional solutions (between 32 and 36 degrees Celsius).
  • Maximise data centre capacity by deploying up to 16% more racks11 of dense compute, without increasing power consumption.
  • Enable air cooling capacity up to 80 kW per rack for dense AI and HPC deployments.12
  • Minimise risk with advanced leak detection, real-time thermal monitoring and unified management of all rack-level components with Dell Integrated Rack Controller software.

Dell PowerEdge servers with AMD accelerators maximise performance and efficiency

Dell PowerEdge XE9785 and XE9785L servers will support AMD MI350x accelerators with 288 GB of HBM3e memory per GPU and up to 35 times greater13 inferencing performance.14 Available in liquid-cooled and air-cooled configurations, the servers will reduce facility cooling energy costs.

Dell advancements power efficient and secure AI deployments and workflows

Because AI is only as powerful as the data that fuels it, organisations need a platform designed for performance and scalability. The Dell AI Data Platform updates improve access to high quality structured, semi-structured and unstructured data across the AI lifecycle.

  • Dell Project Lightning is the world’s fastest parallel file system, accelerating training time for large-scale and complex AI workflows. According to early benchmarking results, Project Lightning delivers up to two times greater throughput than competing parallel file systems.15
  • Dell Data Lakehouse enhancements simplify AI workflows and accelerate use cases – such as recommendation engines, semantic search and customer intent detection – by creating and querying AI-ready datasets.

“We’re excited to work with Dell to support our cutting-edge AI initiatives, and we expect Project Lightning to be a critical storage technology for our AI innovations,” said Dr. Paul Calleja, director, Cambridge Open Zettascale Lab and Research Computing Services, University of Cambridge.

With additional advancements, organisations can:

  • Lower power consumption, reduce latency and boost cost savings for high performance computing (HPC) and AI fabrics with Dell Linear Pluggable Optics.
  • Increase trust in the security of their AI infrastructure and solutions with Dell AI Security and Resilience Services, which provide full stack protection across AI infrastructure, data, applications and models.

Dell expands AI partner ecosystem with customisable AI solutions and applications

Dell is collaborating with AI ecosystem players to deliver tailored solutions that integrate simply and quickly into organisations’ existing IT environments. Organisations can:

  • Enable intelligent, autonomous workflows with a first-of-its-kind on-premises deployment of Cohere North, which integrates various data sources while ensuring control over operations.
  • Maintain data securely on-premises using Google Gemini models, available on Dell PowerEdge XE9680 and XE9780 servers.
  • Prototype and build agent-based enterprise AI applications with Dell AI Solutions with Llama, using Meta’s latest Llama Stack distribution and Llama 4 models.
  • Securely run scalable AI agents and enterprise search on-premises with Glean. Dell and Glean’s collaboration will deliver the first on-premises deployment architecture for Glean’s Work AI platform.
  • Build and deploy secure, customisable AI applications and knowledge management workflows with solutions jointly engineered by Dell and Mistral AI.
  • Simplify business-critical AI deployments while providing flexibility and security with Red Hat’s software stack available on the Dell AI Factory.

The Dell AI Factory also expands to include:

  • Advancements to the Dell AI Platform with AMD add 200G storage networking and an upgraded AMD ROCm open software stack, helping organisations simplify workflows, support LLMs and efficiently manage complex workloads. Dell and AMD are collaborating to provide Day 0 support and performance-optimised containers for AI models such as Llama 4.
  • The new Dell AI Platform with Intel helps enterprises deploy a full stack of high performance, scalable AI infrastructure with Intel® Gaudi® 3 AI accelerators.

Dell also announced advancements to the Dell AI Factory with NVIDIA and updates to Dell NativeEdge to support AI deployments and inferencing at the edge.

Perspectives
“It has been a non-stop year of innovating for enterprises, and we’re not slowing down. We have introduced more than 200 updates to the Dell AI Factory since last year,” said Jeff Clarke, chief operating officer, Dell Technologies. “Our latest AI advancements — from groundbreaking AI PCs to cutting-edge data centre solutions — are designed to help organisations of every size to seamlessly adopt AI, drive faster insights, improve efficiency and accelerate their results.”
