Low-Power Circuits Increasingly Needed In Wireless Age
WEST LAFAYETTE, Ind. — Purdue University engineers have designed an innovative circuit shown to drastically reduce the amount of power needed to run a computer’s memory. The technology is aimed at saving energy, enabling portable devices to run longer on a single charge and to use lighter-weight batteries.

“The ultimate goal is to keep the performance at the highest level possible while reducing power consumption to as low as possible,” says T.N. Vijaykumar, an assistant professor of electrical and computer engineering at Purdue.

Power conservation is critical for laptop computers, medical devices that are worn on or implanted in the body, and a plethora of emerging wireless devices that run on batteries. The low-power issue also is becoming increasingly important for ultra-powerful “parallel processors” used for everything from weather forecasting to animation. These computers require so much power that they place an enormous load on a building’s electrical system.

Meanwhile, just as high-performance wireless and portable devices are proliferating, battery technology is reaching its limits, making low-power designs more attractive.
“It’s very hard to hope for large improvements in battery performance,” Vijaykumar says, noting that engineers have barely begun to develop new circuits designed to reduce power consumption. He is involved with other faculty in a project at Purdue called ICALP, for Integrated Circuit/Architecture Approach to Low Power, which was formed to develop innovative low-power computer microprocessors.
Recent ICALP work, made public during an international conference in July, has resulted in a circuit design shown to dramatically reduce the amount of energy needed to run a computer’s memory. The new circuit is designed to continually monitor how much memory is needed — depending on the programs that are running at any given time — and then strategically shut down unneeded memory circuits automatically. The design also reduces the amount of electricity that is normally “leaked” from memory circuits in a computer’s microprocessor chip. “We marry the two concepts in this project,” Vijaykumar says. “These are new schemes. We are not putting two old ideas together.”
Computer simulations have shown that the design would reduce the amount of energy consumed by a computer’s cache memory by 62 percent, while degrading overall performance by only 4 percent.
Cache memory temporarily stores only the information a computer user is accessing most often, making retrieval of that information much faster than fetching it from the computer’s main memory. However, there is a tradeoff for the high performance provided by cache memory: it consumes a large amount of energy.
Presently, computers run on full power all the time, even if they are using programs that require only a small portion of the system’s total memory.

“Sure, they have beautiful performance, but they give you that performance whether you want it or not,” Vijaykumar says. “There are very crude ways of conserving power. For example, in your laptop there is a power-saving mode. That just shuts down the whole chip. It’s all or nothing.

“But sometimes we will need only 10 percent of the memory that is on the chip. As the application is running, we are going to figure out how much memory it needs and cut down the power for the rest of the unused portion.”

The smart circuit reevaluates how much memory is needed every thousandth of a second by counting the number of times the cache memory is unable to find information requested during that time. If the cache memory is too often unable to retrieve requested information, then more memory is automatically made available. Conversely, if performance is higher than the level required, memory is reduced.
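The resizing policy described above can be sketched in a few lines of code. This is a minimal illustration only, assuming hypothetical interval lengths, cache sizes, and miss thresholds; the article does not give the actual parameters, and the circuit-level power gating is not modeled here.

```python
# Illustrative sketch of the interval-based cache-resizing policy
# described in the article. All numeric parameters below are
# assumptions for the example, not values from the research.

MIN_SIZE_KB = 8     # smallest cache the policy may shrink to (assumed)
MAX_SIZE_KB = 64    # full cache size (assumed)
MISS_HIGH = 100     # misses per interval that trigger an upsize (assumed)
MISS_LOW = 10       # misses per interval that permit a downsize (assumed)

def resize(current_kb: int, misses_this_interval: int) -> int:
    """Called once per monitoring interval (about one millisecond in
    the article). Enlarges the active cache when misses are too
    frequent, shrinks it when performance exceeds the required level;
    the unused portion would then be powered down by the circuit."""
    if misses_this_interval > MISS_HIGH and current_kb < MAX_SIZE_KB:
        return current_kb * 2
    if misses_this_interval < MISS_LOW and current_kb > MIN_SIZE_KB:
        return current_kb // 2
    return current_kb
```

With these assumed thresholds, a busy interval with 500 misses would double a 16 KB active cache to 32 KB, while a quiet interval with no misses would halve it to 8 KB, leaving the rest of the memory powered down.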
Vijaykumar discussed the work in a research paper presented in July during the International Symposium on Low Power Electronics and Design, in Rapallo, Italy. He co-wrote the paper with Kaushik Roy, an associate professor; Babak Falsafi, an assistant professor; and graduate students Michael Powell and Se-Hyun Yang, all from the School of Electrical and Computer Engineering.