Breakthrough Chip Technology Aims to Reduce AI Energy Consumption
Researchers at Oregon State University (OSU) have developed a groundbreaking chip designed to cut the energy consumption of artificial intelligence (AI) large language models by 50%. The innovative technology addresses the growing concern about the high energy demands of AI systems.
The increasing use of AI tools, such as Gemini and ChatGPT, has raised concerns about their significant energy consumption. The International Energy Agency projects that electricity consumption by data centers will double by 2026, reaching 1,000 terawatt-hours – comparable to Japan’s current total electricity consumption.

The challenge lies in the high-speed data transmission required for AI operations. According to Ramin Javadi, a doctoral student at OSU involved in the project, when data is transmitted at high speeds, errors and data corruption can occur. Correcting these errors typically requires additional energy.
The new chip aims to address this issue by recognizing and correcting errors, thereby reducing energy usage. Javadi likened the process to sending a large puzzle through a small pipe. “You have a very small pipe between two buildings. You can’t move a large puzzle by yourself; you need to break it into smaller pieces and send them one by one,” he explained.
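The article does not describe the chip's actual coding scheme, but a classic single-error-correcting Hamming(7,4) code illustrates the general idea of "recognizing and correcting errors" on the receiving end. The sketch below is purely illustrative, not the OSU design:

```python
# Illustrative only: a Hamming(7,4) code, a textbook example of how a
# receiver can both detect AND correct a single flipped bit without
# asking the sender to retransmit. (Not the OSU chip's actual scheme.)

def hamming_encode(data: list[int]) -> list[int]:
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4  # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(code: list[int]) -> list[int]:
    """Recover the 4 data bits, correcting up to one flipped bit."""
    p1, p2, d1, p3, d2, d3, d4 = code
    # Recompute the parities; the syndrome points at the bad position.
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    error_position = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if error_position:
        code = code.copy()
        code[error_position - 1] ^= 1  # flip the corrupted bit back
    p1, p2, d1, p3, d2, d3, d4 = code
    return [d1, d2, d3, d4]

# One "puzzle piece" gets damaged in the pipe, yet is recovered exactly.
sent = hamming_encode([1, 0, 1, 1])
damaged = sent.copy()
damaged[2] ^= 1  # a single bit flipped during transmission
assert hamming_decode(damaged) == [1, 0, 1, 1]
```

Correcting errors at the receiver, rather than retransmitting damaged pieces, is one way an error-handling scheme can save energy on a link.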
Javadi emphasized the importance of accurate data reconstruction. “During transmission, pieces might get lost, damaged, or disturbed. We need to reconstruct the puzzle accurately, with an error rate of less than one in 10 to the power of 11. This means that out of 100 billion pieces, we should reconstruct the puzzle with at most one error,” he said.
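The quoted target is easy to sanity-check: an error rate below one in 10^11 means roughly one residual error per 100 billion bits transmitted.

```python
# Back-of-the-envelope check of the quoted reliability target.
ber = 1e-11               # target: error rate below 1 in 10^11
bits = 100_000_000_000    # 100 billion transmitted "pieces" (bits)
expected_errors = ber * bits
print(expected_errors)    # about one error expected at the target rate
```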
The technology embedded in the chip is expected to significantly aid in recovering original data and identifying errors, ultimately contributing to a 50% reduction in energy consumption. This development could have a substantial impact on the future of AI and its environmental footprint.