Dr. Suin Yi is bridging the gap between digital computers and quantum computers. | Image: Texas A&M Engineering

The graphics processing unit (GPU) has become one of the most important types of computing technology, as evidenced by fabless semiconductor company Nvidia reaching a record market capitalization of over $1 trillion. Used in a wide range of applications, GPU technology has advanced to open new opportunities in gaming, high-performance computing, machine learning and artificial intelligence (AI), and more.

While GPUs continue to open new doors for AI and machine learning that can benefit society, they consume an immense amount of energy in the data centers where they run. The record power draw of a single data center is 500 megawatts, roughly half the output of a typical nuclear power plant. At that rate, current AI and machine-learning applications are on an unsustainable long-term trajectory.

Dr. Suin Yi, assistant professor in the Department of Electrical and Computer Engineering at Texas A&M University, is working to change this by developing hardware that mimics the human brain and can process the workloads now handled by GPUs far more efficiently.


Digital computers represent every number in binary, using only 0s and 1s. Quantum computers, one of the hottest topics in academia, have yet to be deployed in daily life because they require a cryogenic environment (about -459 degrees Fahrenheit, near absolute zero). Yi is bridging the gap between digital and quantum computers by developing a new memristor computer that builds on the existing infrastructure of semiconductor chip manufacturing but is far more advanced and energy efficient than what is currently available.
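To make the contrast concrete, here is a minimal illustrative sketch, not code from Yi's project: a digital computer must quantize a value into discrete bits, while a memristor stores the same value directly as a continuous physical conductance. The conductance range used below is an assumption for illustration.

```python
# Illustrative sketch only: digital (binary) vs. analog (memristor-style)
# storage of a single neural-network weight. Not code from Yi's project.

def to_binary(value: int, bits: int = 8) -> str:
    """A digital computer stores a number as discrete 0s and 1s."""
    return format(value, f"0{bits}b")

# Digital: the weight 0.42 must be quantized into a fixed number of bits.
weight = 0.42
quantized = round(weight * 255)          # map [0, 1] onto 8-bit integers
print(to_binary(quantized))              # prints '01101011'

# Analog: a memristor stores the weight as a conductance (in siemens),
# a continuous physical quantity -- no bits involved.
# The range below is an assumed, illustrative device range.
g_min, g_max = 1e-6, 1e-4
conductance = g_min + weight * (g_max - g_min)
print(f"{conductance:.2e} S")
```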

As part of this project, Yi and his team experimentally demonstrated small-scale neural networks with 4,000 synapses, going well beyond the exploratory materials-level research in the literature, which typically involves a single synapse. Looking ahead, Yi aims to scale the memristor computer to more than 1 trillion synapses, approaching the roughly 100 trillion synapses of the human brain, to accommodate applications such as ChatGPT.
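The core operation behind memristor neural networks of this kind is analog matrix-vector multiplication in a crossbar array: synaptic weights are stored as device conductances, input voltages are applied along the rows, and the output currents summed along each column perform the multiply-accumulate in a single physical step, following Ohm's law and Kirchhoff's current law. The sketch below is a simplified numerical illustration under assumed device values, not code or parameters from the Nature Electronics paper.

```python
import numpy as np

# Simplified model of a memristor crossbar performing an analog
# matrix-vector multiply. All values are assumptions for illustration,
# not parameters from Yi's demonstrated 4,000-synapse network.

rows, cols = 64, 64                      # a 64x64 crossbar = 4,096 synapses
rng = np.random.default_rng(0)

# Each synapse is a memristor whose conductance G (siemens) encodes a weight.
G = rng.uniform(1e-6, 1e-4, size=(rows, cols))

# Input activations are encoded as voltages (volts) applied to the rows.
V = rng.uniform(0.0, 0.2, size=rows)

# Ohm's law gives the current through each device (I = G * V), and
# Kirchhoff's current law sums those currents along each column wire,
# so the entire multiply-accumulate happens in one analog step.
I_out = V @ G                            # column currents, shape (cols,)

# A digital processor would need rows * cols multiply-accumulate operations.
print(f"{rows * cols} synapses evaluated in one analog step")
print(f"first column current: {I_out[0]:.3e} A")
```

Because the computation happens in the physics of the array rather than in sequential logic, energy per operation can be far lower than on a GPU, which is the efficiency argument behind this line of work.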

“I have no doubt that most people these days are excited by new machine-learning techniques such as ChatGPT, computer vision and some self-driving applications, but those are not trustworthy yet due to the limitation of computing resources,” Yi said. “I’m excited to provide an opportunity to go far beyond our current limitations so that AI and machine learning can become safer and more reliable.”

This project, funded by the U.S. Air Force Office of Scientific Research and Sandia National Laboratories, was conducted alongside Hewlett Packard Enterprise Company Chair Professor Dr. R. Stanley Williams and published in the January 2023 issue of Nature Electronics.