Highlights:

  • According to The Information’s source, the Microsoft custom network card resembles Nvidia’s ConnectX-7 network adapter series.
  • Microsoft recently unveiled a proprietary AI accelerator named the Maia 100, accompanied by an internally designed server rack and liquid cooling system.

According to a recent report by The Information, engineers at Microsoft Corp. are developing a specialized network device for the company's data centers.

The development effort is anticipated to take more than a year, a source cited by the publication said. Pradeep Sindhu, a co-founder and former chief executive of Juniper Networks Inc., is reportedly spearheading the initiative. Juniper is currently being acquired by Hewlett Packard Enterprise Co. in a USD 14 billion deal.

The servers within a data center are interconnected by a network, enabling seamless communication and data sharing. Data traffic doesn’t flow directly from a server to the data center network it’s connected to; instead, it passes through an intermediary called a network card or adapter. This specialized chip facilitates each packet’s routing to its intended destination and often handles additional tasks, such as optimizing connection speeds.
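The adapter's role described above can be pictured with a minimal Python sketch. Everything here is hypothetical and for illustration only; the class, packet format, and routing table are invented, not drawn from Microsoft's or Nvidia's designs:

```python
# Minimal illustrative model of a network adapter forwarding packets.
# All names and structures here are hypothetical.
from dataclasses import dataclass


@dataclass
class Packet:
    dst: str        # destination server address
    payload: bytes


class NetworkAdapter:
    """Sits between a server and the data center network, routing packets."""

    def __init__(self):
        self.routes = {}          # destination address -> outbound port

    def add_route(self, dst: str, port: int) -> None:
        self.routes[dst] = port

    def forward(self, packet: Packet) -> int:
        # Look up the destination and return the port the packet should
        # leave on; real adapters also handle tasks such as tuning
        # connection speeds, which this toy model omits.
        port = self.routes.get(packet.dst)
        if port is None:
            raise LookupError(f"no route to {packet.dst}")
        return port


adapter = NetworkAdapter()
adapter.add_route("server-b", port=1)
print(adapter.forward(Packet(dst="server-b", payload=b"hello")))  # 1
```

The point of the sketch is only the intermediary role: every packet consults the adapter before reaching the network.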

Microsoft reportedly intends to incorporate the custom network card into its artificial intelligence infrastructure, specifically to manage the network traffic of servers outfitted with Nvidia Corp. graphics cards. The company aims for the card to accelerate AI workloads while lowering hardware procurement expenses.

According to The Information’s source, the Microsoft custom network card resembles Nvidia’s ConnectX-7 network adapter series. The latter product can process up to 400 gigabits of traffic per second. It supports Ethernet and InfiniBand, the two communications standards on which most data center networks are based.

For effective coordination, the servers composing an AI cluster must exchange data stored in their memory pools. Requests for data sharing typically need to pass through a server's central processing unit (CPU). Nvidia's ConnectX-7 adapter incorporates RDMA (remote direct memory access) technology, which bypasses the CPU and notably accelerates data retrievals.
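The difference RDMA makes can be sketched conceptually: in the conventional path, the remote server's CPU services each data request, while an RDMA read pulls from a registered memory region directly. The Python below is purely illustrative, assuming invented classes and counters; real RDMA uses specialized hardware and verbs APIs, not Python objects:

```python
# Conceptual contrast between CPU-mediated reads and RDMA-style reads.
# Illustration of the idea only; not an actual RDMA API.

class RemoteServer:
    def __init__(self, memory: bytes):
        self.memory = memory       # stand-in for a registered memory region
        self.cpu_interrupts = 0    # how often the remote CPU got involved

    def cpu_read(self, offset: int, length: int) -> bytes:
        # Conventional path: the remote CPU handles every request,
        # stealing cycles from application work.
        self.cpu_interrupts += 1
        return self.memory[offset:offset + length]

    def rdma_read(self, offset: int, length: int) -> bytes:
        # RDMA path: the network card reads registered memory directly,
        # leaving the remote CPU free.
        return self.memory[offset:offset + length]


server = RemoteServer(b"model-weights-shard")
data = server.rdma_read(0, 5)
print(data, server.cpu_interrupts)  # b'model' 0
```

Both paths return the same bytes; the difference the sketch highlights is that the RDMA path leaves the remote CPU's involvement count at zero.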

Additionally, the device offers several other performance optimizations. Notably, it can execute cybersecurity tasks such as encrypting data traffic, work that would otherwise burden a server's CPU, thereby freeing CPU capacity for applications. ConnectX-7 also offloads certain computations associated with detecting data transmission errors.
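One common example of such error-detection work is a cyclic redundancy check: Ethernet frames carry a CRC-32 check value that the receiver recomputes to spot corruption. The sketch below, with a hypothetical `AdapterOffload` class, shows the kind of checksum computation a smart adapter performs in hardware so the CPU does not have to:

```python
# Illustrative sketch of error-detection offload. The AdapterOffload
# class is hypothetical; real adapters compute CRCs in silicon.
import zlib


class AdapterOffload:
    """Stands in for checksum work a smart NIC performs in hardware."""

    def transmit(self, payload: bytes) -> tuple[bytes, int]:
        # Attach a CRC-32 so the receiver can detect corruption in transit.
        return payload, zlib.crc32(payload)

    def receive(self, payload: bytes, crc: int) -> bytes:
        # Recompute the CRC-32 and compare; a mismatch means the data
        # was corrupted somewhere along the way.
        if zlib.crc32(payload) != crc:
            raise ValueError("transmission error detected")
        return payload


nic = AdapterOffload()
frame, crc = nic.transmit(b"telemetry")
print(nic.receive(frame, crc))  # b'telemetry'
```

In a server without this offload, the same checksum arithmetic would run on the CPU for every frame; moving it onto the adapter is what frees capacity for applications.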

Microsoft may aim to replicate those features with its custom network card. Sindhu, the executive reportedly leading the development of the adapter, joined the company in 2023 following its acquisition of a startup he founded called Fungible Inc. Fungible developed a chip that, akin to Nvidia’s ConnectX-7, can offload specific cybersecurity tasks and related computations from a server’s CPU to enhance application performance.

Microsoft’s custom network card would add to the extensive roster of data center components it internally develops. The company recently unveiled a proprietary AI accelerator named the Maia 100, accompanied by an internally designed server rack and liquid cooling system. Microsoft intends to deploy the accelerator in its data centers alongside a CPU named the Cobalt 100, developed by its engineers based on designs from Arm Holdings plc.