Energy Revolution in AI: Innovative CRAM Technology Is Coming

Researchers at the University of Minnesota have developed a new solution that could cut artificial intelligence’s growing energy consumption by a factor of 1,000 or more. Introducing the innovative CRAM technology…

Artificial intelligence, regarded as one of the defining technologies of our time, continues to transform sector after sector. But the surging demand for AI applications has pushed its energy consumption into the spotlight. That could soon change: researchers have developed a new “Computational Random Access Memory” (CRAM) prototype that could reduce AI energy use by a factor of at least 1,000.

A Revolutionary Development for AI Computing

For those unfamiliar, today’s machine learning and AI workloads run on the von Neumann architecture, in which a processor handles the logic and a separate memory holds the data. Every computation therefore requires shuttling data back and forth between the two, and that transfer accounts for a large share of the system’s power and energy consumption.
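
To get a feel for why the transfer, rather than the arithmetic, dominates, here is a back-of-the-envelope sketch. The per-event energy figures are illustrative assumptions for a generic modern chip, not measurements from the study:

```python
# Back-of-the-envelope: energy spent moving data vs. computing on it.
# The per-event energies are rough, assumed ballpark figures, not values
# from the CRAM paper.

PJ = 1e-12  # one picojoule, in joules

ENERGY_ALU_OP = 1 * PJ         # one arithmetic operation in the logic unit (assumed)
ENERGY_DRAM_ACCESS = 100 * PJ  # one operand fetched from off-chip memory (assumed)

ops = 1_000_000      # a small workload: one million multiply-accumulates
fetches_per_op = 2   # two operands fetched per operation (assumed)

compute_energy = ops * ENERGY_ALU_OP
transfer_energy = ops * fetches_per_op * ENERGY_DRAM_ACCESS

print(f"compute:  {compute_energy:.2e} J")
print(f"transfer: {transfer_energy:.2e} J")
print(f"data movement costs {transfer_energy / compute_energy:.0f}x the arithmetic")
```

Under these assumptions, moving the operands costs 200 times more energy than computing with them, which is exactly the gap an in-memory design tries to close.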

The newly developed CRAM performs computation directly within the memory itself, using spintronic devices known as Magnetic Tunnel Junctions (MTJs). In spintronic devices, information is stored in the spin state of electrons rather than in electric charge alone, which makes them a more energy-efficient alternative to conventional transistor-based chips.
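
As a loose illustration of the in-memory idea (a toy model only; real CRAM logic emerges from MTJ resistance states and switching physics, not from software), the sketch below computes a logic gate whose inputs and output all live in the same array, so no operand ever crosses a memory-to-processor boundary:

```python
# Toy model of in-memory logic: bits live in the "array" and results are
# written back in place, so no data crosses a memory/processor boundary.
# Conceptual only; it does not model MTJ device physics.

array = [1, 0, 1, 1, 0, 0, 1, 0]  # toy stand-in for cell magnetization states

def in_memory_nand(cells, i, j, out):
    """Compute NAND of cells i and j and store the result at cell `out`,
    mimicking how an in-memory architecture writes logic outputs back
    into the array instead of exporting operands to a processor."""
    cells[out] = 1 - (cells[i] & cells[j])

in_memory_nand(array, 0, 2, 4)  # result overwrites cell 4 in place
print(array)                    # cell 4 now holds NAND(1, 1) = 0
```

Because NAND is functionally complete, any Boolean function can in principle be composed from such in-place steps, which is part of what makes a reconfigurable in-memory substrate attractive.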

The scientists note that the project builds on more than 20 years of pioneering research by engineering professor Jian-Ping Wang, who has long worked with MTJ nanodevices.

Up to 2,500 Times Energy Savings

Ulya Karpuzcu, a co-author of the paper published in *npj Unconventional Computing*, noted, “As an extremely energy-efficient digital in-memory computing infrastructure, CRAM is highly flexible in that computation can be performed at any location in the memory array. Therefore, we can reconfigure CRAM to best suit the performance needs of various AI algorithms.” She added that CRAM is more energy-efficient than the building blocks of traditional AI systems.

Global electricity consumption by data centers, driven in part by AI training and applications, was estimated at 460 terawatt-hours in 2022 and is projected to approach 1,000 terawatt-hours by 2026. Technologies like CRAM could therefore be revolutionary for making AI far more efficient at a time when energy demands are surging.
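
Taking those figures at face value, the stakes are easy to quantify. This is a deliberately idealized arithmetic sketch: applying a chip-level reduction factor to a global total ignores every other overhead in real systems:

```python
# Idealized arithmetic using the figures quoted above. The 1,000x factor is
# the researchers' claimed lower bound and 2,500x the largest saving cited
# in this article; neither is a guarantee at data-center scale.

projected_2026_twh = 1_000  # projected electricity use by 2026, in TWh

for factor in (1_000, 2_500):
    print(f"{factor:>5}x reduction -> {projected_2026_twh / factor:.1f} TWh")
```

Even at the conservative factor, the projected 1,000 terawatt-hours would shrink to about 1 terawatt-hour under this idealized assumption.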

The University of Minnesota team behind the study is now working with leaders in the semiconductor industry to scale up the technology and develop hardware that advances AI functionality. This will not happen overnight, however: the researchers still need to overcome challenges in scalability, manufacturing, and integration with existing silicon technology.
