LAS VEGAS — The term “AI factory” surfaced frequently in keynote presentations, media briefings, and executive interviews at this year’s Consumer Electronics Show (CES), held last week.
However, the concept carries different meanings for different people.
“The term is used in multiple — and frequently enough imprecise — ways,” said Alex Cordovil, Dell’Oro Group research director. He found that most people use it to mean “specialized data center” in conversations with operators, hyperscalers, and infrastructure vendors.
That aligns with Nvidia CEO Jensen Huang’s definition. “I don’t call it a data center because a data center stores data,” Huang stated during his CES keynote.
“AI factories are fundamentally different from data centers,” Siemens CEO Roland Busch explained to reporters. “They consume more energy, require liquid cooling, and necessitate industrial-level controls, rather than standard building controllers.”
Booz Allen Hamilton CTO Bill Vass agrees. “An AI factory is a data center specifically designed for AI hardware,” he clarified. This might include specialized concrete for server weight, and employing AI, digital twins, and simulations to optimize power, cooling, and server placement.
Others use the term to describe something smaller than a data center – the servers, software, and systems needed to run AI.
For example, the AWS AI Factory combines hardware and software that runs on-premises, managed by AWS, and includes services like Bedrock, networking, storage, databases, and security.
Lenovo views AI factories as packaged servers designed for AI workloads. “We’re designing an architecture of a fixed number of racks, all working together as one,” said Scott Tease, vice president and general manager of AI and high-performance computing at Lenovo’s infrastructure solutions group.
These racks range from a single unit to hundreds, Tease told Computerworld.
Each rack is a bit larger than a refrigerator, fully assembled, and often preconfigured for specific customer use cases. “Once it arrives, our service personnel simply connect power and networking,” Tease said.
Still others focus on the software component of an AI factory.
Some experts use the phrase in the context of the AI technology stack. MIT Sloan Management Review’s Thomas H. Davenport and Randy Bean define it as “technology platforms, methods, data, and algorithms that accelerate AI system growth.”
“Building on an established foundation reduces cost and time to scale AI,” they wrote.
The European Commission has a broad definition, picturing ecosystems including computing hardware, talent, and data, culminating in AI models and applications.
The EC also discusses “AI Gigafactories” – large data centers dedicated to training advanced AI models. The EU plans to allocate €20 billion to create up to five AI gigafactories in 2025.
Deloitte defines an AI factory as a combination of hardware, software, and services enabling the entire AI lifecycle. “The core product is intelligence, measured by token throughput,” said Nicholas Merizzi, a principal at Deloitte Consulting and the firm’s AI infrastructure US offering lead.
Like a conventional factory converting raw materials into a final product, an AI factory transforms power, data, and large language models (LLMs) into valuable insights, he said.
With the term’s growing popularity, clarity is crucial. When a vendor promotes an “AI factory,” are they referencing a conceptual model, a physical building, or a collection of server racks? Avoiding such misunderstandings is vital to prevent costly investments in the wrong solution.
Editor’s note: Lenovo covered Maria Korolov’s travel and lodging for this year’s CES, but had no influence over the content of this story.