d-Matrix is building one of the most disruptive computing platforms for AI in cloud and edge data centers. A pioneer of in-memory computing (IMC) in the data center, d-Matrix has attacked the physics of memory-compute integration using innovative circuit techniques and built a 40-100 TOPS/W datacenter inference engine, with a path to a further 1000x gain in compute efficiency.
d-Matrix was founded in May 2019 by dedicated entrepreneurs with a 20+ year history of building businesses that have shipped over 100M chips and generated over $1B in revenue. Our core team comes from companies like Inphi/Marvell, Broadcom, Intel, Nvidia, Qualcomm, and Apple, with backgrounds in silicon, systems, and software.
At d-Matrix we plan to usher in a new way of doing datacenter AI inference, using in-memory computing (IMC) techniques with chiplet-level scale-out interconnects. We are attacking four key pain points for datacenter customers: power efficiency, model growth, real-time performance, and TCO.
We take a holistic approach to co-designing hardware microarchitecture, circuits, algorithms, packaging, instruction set, and compiler in order to optimize data movement and enable general-purpose, programmable deep neural net acceleration.
Our first silicon and software stack, now fully functional, have validated our unique architectural approach and are showing very large improvements over incumbents with traditional digital architectures across all critical workloads for NLP, recommendation, and vision applications.
We have raised $15M from Nautilus Venture Partners, Entrada Ventures, Doorga Capital, Tsingyuan Ventures, TSVC, and other strategic partners.
We are always looking for A+ candidates who want to be part of a trailblazing, fast-growing startup that is out to change the rules in a large, exciting, once-in-a-lifetime opportunity. Reach out!