Silicon Valley startup d-Matrix has announced the shipment of its first AI chip, designed to change how AI systems serve user requests at scale. The Santa Clara-based startup, co-founded by Sid Sheth and Sudeep Bhoja, has secured over $160 million in funding, including backing from Microsoft's venture capital arm.
d-Matrix specializes in AI inference hardware and focuses on optimizing the process of managing user requests on trained AI systems, such as chatbots and video generators.
The company's chip is engineered to support a high volume of simultaneous user requests. This capability is particularly appealing for video generation applications, where multiple users may require unique, customized outputs concurrently. Super Micro Computer is expected to incorporate d-Matrix chips into its servers, which will be available for commercial use starting next year.
"We are getting a lot of interest in video use cases where we have customers coming and saying, 'Hey, look, we want to generate videos, and we want a collection of users, all interacting with their own respective video,'" said Sheth, d-Matrix's chief executive officer, according to Reuters.
Sheth brings over 20 years of expertise in semiconductor innovation, having held senior roles at companies like Inphi, Broadcom, and Intel. His strategic insights have driven significant growth in data center-focused businesses, such as Inphi's networking division, which he expanded into a $200 million enterprise before its $10 billion acquisition by Marvell Technology.
Chief Technology Officer Bhoja has extensive engineering experience, specializing in high-speed data interconnects and optical networking. His pioneering work at Inphi on digital signal processing technologies for data centers laid the groundwork for d-Matrix's focus on scalable, energy-efficient AI hardware.
While d-Matrix doesn't aim to replace industry leaders like NVIDIA, it seeks to complement their offerings by addressing the efficiency and cost challenges of AI inference. Its innovative "chiplet" architecture and in-memory computing techniques position the startup as a promising disruptor in the AI hardware space, aiming to redefine how data centers handle increasingly complex workloads.