This module seamlessly integrates into systems with a PCIe Gen 3 M.2 slot
Founded by tech innovators from the University of Michigan, MemryX has introduced a new M.2 module priced at $149 [PDF]. The device is designed to accelerate AI inference in compact systems, making it well suited to edge computing environments that demand both efficiency and a small footprint.
The module packs four MemryX MX3 AI accelerator chips into the standard M.2 2280 form factor, so it installs in any system with a PCIe Gen 3 M.2 slot. Each chip delivers 6 TOPS (tera operations per second), for a combined 24 TOPS while drawing only 6 to 8 watts. It supports 4-bit, 8-bit, and 16-bit weight formats as well as BFloat16. The module requires no active cooling, relying on a passive heatsink to dissipate heat.
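A quick back-of-the-envelope check of those headline figures, treating the per-chip rating and the reported power envelope as given (the efficiency numbers below are simple derived ratios, not vendor claims):

```python
# Sanity check of the reported specs: four MX3 chips at 6 TOPS each,
# with the whole module drawing roughly 6 to 8 watts.
chips = 4
tops_per_chip = 6          # reported per-chip throughput
power_watts = (6.0, 8.0)   # reported power envelope for the module

total_tops = chips * tops_per_chip
print(f"Aggregate throughput: {total_tops} TOPS")          # 24 TOPS

for w in power_watts:
    print(f"Efficiency at {w:.0f} W: {total_tops / w:.1f} TOPS/W")
# ~4.0 TOPS/W at 6 W, ~3.0 TOPS/W at 8 W
```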
Phoronix has put the MemryX MX3 M.2 module through its paces, assessing both its performance and its ease of integration as an AI accelerator. Installed in a system with an M.2 PCIe Gen 3 slot running Ubuntu 24.04 LTS, the module worked out of the box with MemryX’s open-source drivers and development tools. With 24 TOPS across its four chips, it is well suited to a variety of inference tasks, particularly those using 8-bit weights.
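As a rough illustration of what that workflow might look like, here is a minimal sketch of compiling a model and running an inference through MemryX-style Python tooling. The `memryx` package and the `NeuralCompiler`/`SyncAccl` names and arguments are assumptions for illustration, not a verified API; consult the official SDK documentation for the actual interface.

```python
# Hypothetical sketch: compile an ONNX model and run it on the MX3 M.2 module.
# The "memryx" package and the NeuralCompiler/SyncAccl names are assumed here
# for illustration; check MemryX's SDK docs for the real entry points.
import numpy as np
from memryx import NeuralCompiler, SyncAccl  # assumed SDK imports

# Compile an ONNX model into an artifact targeting the four-chip module.
compiler = NeuralCompiler(models="mobilenet_v2.onnx", num_chips=4)
dfp = compiler.run()

# Load the compiled model onto the accelerator and run one inference.
accl = SyncAccl(dfp)
dummy_input = np.random.rand(1, 224, 224, 3).astype(np.float32)
outputs = accl.run(dummy_input)          # assumed to return output arrays
print([o.shape for o in outputs])
```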
The evaluation also highlighted the module’s broad software support, including compatibility with frameworks such as TensorFlow and ONNX, and its efficient handling of small to medium AI models. Each MX3 chip can store up to 10.5 million 8-bit parameters, so the full module can hold models of up to 42 million parameters. That ceiling stems from the lack of onboard DRAM, since weights must reside on-chip, but MemryX has announced plans to release a new PCIe card by 2025 that will incorporate additional MX3 AI chips.
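Because weights live on-chip, the practical question for any given network is whether its parameter count fits within that budget. A minimal sketch of the check, using the capacity figures above (the example model sizes are approximate, and scaling wider weights proportionally is a simplification):

```python
# Rough fit check against the on-chip weight capacity described above:
# ~10.5M 8-bit parameters per MX3 chip, four chips on the M.2 module.
PARAMS_PER_CHIP = 10_500_000
CHIPS = 4
MODULE_BUDGET = PARAMS_PER_CHIP * CHIPS   # 42M 8-bit parameters

def fits_on_module(num_params: int, bits_per_weight: int = 8) -> bool:
    """Return True if a model's weights fit in the module's on-chip storage.

    The quoted capacity is for 8-bit weights; other widths are scaled
    proportionally here, which is only an approximation.
    """
    effective = num_params * bits_per_weight / 8
    return effective <= MODULE_BUDGET

print(fits_on_module(3_500_000))    # MobileNetV2-class model (~3.5M): True
print(fits_on_module(25_600_000))   # ResNet-50-class model (~25.6M): True
print(fits_on_module(60_000_000))   # Larger model: False, exceeds 42M budget
```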
Offered at $149 USD, the MX3 M.2 module is positioned as a cost-effective way for developers and organizations to add AI processing to edge devices. MemryX has also announced that it will show the module at the Consumer Electronics Show (CES) 2025 in Las Vegas, demonstrating the MX3’s capabilities, and its adaptability, in a range of real-world scenarios.
