AI networking standard

Major tech companies, including Meta, Microsoft, Advanced Micro Devices (AMD), and Broadcom, have announced the development of a new industry standard for networking in artificial intelligence (AI) data centers. The initiative is the latest to challenge the market leader, Nvidia.
The “Ultra Accelerator Link” is an attempt to establish an open standard for communication between AI accelerators, the specialized chips that process the vast amounts of data used in AI workloads.
Other members include Google, Cisco Systems, Hewlett Packard Enterprise, and Intel.
Why It Matters
Nvidia, the largest player in the AI chip market with around 80% market share, is not part of this group. Tech giants like Google and Meta are eager to reduce their dependence on Nvidia, whose networking business is an essential part of the package that enables its AI dominance.
Broadcom’s main rival in the networking and custom chip market, Marvell Technology, is also not part of the group.
Key Quote
“An industry specification becomes critical to standardize the interface for AI and Machine Learning, HPC (high-performance computing), and Cloud applications for the next generation of AI data centers and implementations,” the companies said in a statement.
Context
Tech companies are investing billions of dollars in the hardware required to support AI applications, boosting demand for AI data centers and the chips that run them.
The Ultra Accelerator Link group has designed specifications governing connections among different accelerators in a data center. The specifications will be available in the third quarter of 2024 to companies that join the Ultra Accelerator Link (UALink) Consortium.
Response
A spokesperson for Nvidia declined to comment. Marvell did not immediately respond to a request for comment.
Agência EON is closely following market trends in technology and will continue to bring you the latest news and innovations.