
Alibaba Cloud claims its modular datacenter architecture shrinks build times by 50 percent

Also reveals boosted utilization rates, upgraded IaaS and more – all in the name of AI apps


Alibaba Cloud has revealed a modular datacenter architecture it claims will help it satisfy demand for AI infrastructure by improving performance and shortening build times for new facilities.

Announced at its annual Apsara conference yesterday, the "CUBE DC 5.0" architecture was described as using "prefabricated modular designs" plus "advanced and proprietary technologies such as wind-liquid hybrid cooling system, all-direct current power distribution architecture and smart management system."

Alibaba Cloud hasn't explained those techs in depth, but claims the modular approach reduces deployment times by up to 50 percent compared to traditional datacenter building techniques.

The Register has asked Alibaba to explain the workings of a "wind-liquid hybrid cooling system." For what it's worth, machine-translating the term into Mandarin and feeding the result into search engines produced results describing cold plate cooling – a technique in which thin reservoirs of cooled liquid are placed on hardware, with heat removed by circulating the liquid and/or blowing air across the plates.

Whatever the term describes, Alibaba Cloud Intelligence CEO Eddie Wu told the conference his company "is investing heavily in building an AI infrastructure for the future."

"These enhancements are not just about keeping up with AI demands but about setting a global standard for efficiency and sustainability."

Other steps towards that goal include a scheduler said to better manage hardware resources so they achieve utilization rates of up to 90 percent.

Alibaba Cloud's IaaS offering, the Enterprise Elastic Compute Service (ECS), has reached its ninth generation. Conference attendees were told it is better equipped for AI applications, with recommendation engine speeds improved by 30 percent and database read/write queries per second up 17 percent.

Also at the conference, Alibaba Cloud announced an "Open Lake data utility" that integrates multiple big data engines so they can be used by generative AI applications. Another new offering, "DMS: OneMeta+OneOps," apparently combines and manages metadata from 40 different data sources.

It's 2024, so Alibaba Cloud also announced some AI news: the release of its Qwen 2.5 multimodal models, available in sizes from 0.5 billion to 72 billion parameters, supporting 29 languages and tuned for the needs of sectors including automotive and gaming. The new models are said to have "enhanced knowledge [and] stronger capabilities in math and coding."

A text-to-video AI model that works with both Chinese and English prompts, Tongyi Wanxiang, was also released.

"The new model is capable of generating high-quality videos in a wide variety of visual styles from realistic scenes to 3D animation," boasted Alibaba Cloud execs. ®
