Despite their billions of transistors, today’s chips are not prepared for every challenge of digitalization. In the future, they will also have to be able to monitor and control themselves.
In the ideal factory of tomorrow, systems will use sensor data to record and analyze their current status and make independent decisions based on forecasts. In this context, one often hears the term “digital twin.” These virtual replicas of machines, systems, or entire production areas can be used to predict faults, optimize maintenance intervals, detect and remedy errors, and precisely configure IoT installations. They simulate the entire value chain and integrate additional capabilities, such as functions based on artificial intelligence. Gartner’s analysts rank the digital twin among the ten key technology trends for 2018.
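The idea can be sketched in a few lines of code. The following toy example (all names, the wear model, and every number are hypothetical, not from any real twin platform) shows the core pattern: a virtual replica mirrors a machine’s sensor stream and uses its internal model to forecast when maintenance will be due.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Toy digital twin: mirrors sensor data and estimates wear."""
    wear: float = 0.0                      # accumulated wear estimate, 0..1
    history: list = field(default_factory=list)

    def ingest(self, vibration: float):
        """Mirror one sensor reading and update the internal wear model."""
        self.history.append(vibration)
        self.wear += 0.001 * vibration     # illustrative linear wear model

    def cycles_until_maintenance(self, vibration: float,
                                 threshold: float = 1.0) -> int:
        """Forecast remaining cycles at the given vibration level."""
        remaining = max(threshold - self.wear, 0.0)
        return round(remaining / (0.001 * vibration))

# Replay 100 sensor readings into the twin, then ask for a forecast.
twin = DigitalTwin()
for v in [2.0] * 100:
    twin.ingest(v)
cycles = twin.cycles_until_maintenance(vibration=2.0)
```

Real twins replace the one-line wear model with detailed physical or learned models, but the loop of mirroring sensor data and querying the model for predictions is the same.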
From large-scale to small-scale
A German-American research team now wants to apply what works on the large scale to computer chips. Although no larger than a fingernail, modern chips contain billions of transistors and require complicated controls for switching frequency, power consumption, and temperature to prevent overloads; a network of on-chip sensors supplies the data. Over the course of development, a variety of such controls have emerged that work side by side, and sometimes even against each other. For safety-critical or high-reliability applications, such circuits are therefore designed very conservatively. The result: a great deal of performance potential is lost.
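The problem of controls working against each other can be illustrated with a minimal sketch. In this hypothetical example (the controllers, thresholds, and frequencies are invented for illustration), two independent controllers act on the same knob, the clock frequency: one throttles on temperature, the other boosts under load. Neither knows about the other, so on a hot, busy chip they fight over the setting.

```python
def thermal_controller(freq_mhz: float, temp_c: float) -> float:
    """Reduce the frequency when the sensor reports a high temperature."""
    return freq_mhz * 0.9 if temp_c > 85.0 else freq_mhz

def performance_controller(freq_mhz: float, load: float) -> float:
    """Raise the frequency (up to a cap) when the workload is high."""
    return min(freq_mhz * 1.1, 3000.0) if load > 0.8 else freq_mhz

def step(freq_mhz: float, temp_c: float, load: float) -> float:
    # Each controller is applied independently, without coordination.
    freq_mhz = thermal_controller(freq_mhz, temp_c)
    freq_mhz = performance_controller(freq_mhz, load)
    return freq_mhz

# A hot chip under heavy load: the thermal controller throttles,
# and the performance controller immediately pushes back.
f = step(2000.0, temp_c=90.0, load=0.9)
```

Because designers cannot easily predict the combined behavior of such uncoordinated loops, they fall back on conservative guard bands, which is exactly the lost performance potential described above.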
Autonomous chips with “smart” control
To prevent this, the monitoring and control functions in future computer chips must run autonomously: the component will then be able to monitor and control itself. To this end, an intelligent, learning control system relies on a constantly updated self-image of the chip. Applications in autonomous driving and medical technology are to serve as initial examples. At the Technical University of Munich, for instance, new concepts for multi-core system-on-chip (SoC) processors are being developed that run various critical applications on an FPGA-based platform.
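The principle of a constantly updated self-image can be sketched as follows. In this hypothetical example (the single-coefficient model, learning rate, and all numbers are illustrative assumptions, not the project’s actual method), the chip maintains a simple learned model of how frequency drives temperature, refines it from sensor readings, and then picks the highest frequency whose predicted temperature stays within the limit.

```python
class SelfModel:
    """Toy self-image of a chip: temperature as a learned function of frequency."""

    def __init__(self, degrees_per_mhz: float = 0.03):
        self.k = degrees_per_mhz       # learned thermal coefficient
        self.ambient = 40.0            # assumed baseline temperature (deg C)

    def predict(self, freq_mhz: float) -> float:
        return self.ambient + self.k * freq_mhz

    def update(self, freq_mhz: float, measured_temp_c: float, lr: float = 0.1):
        """Refine the model from a new on-chip sensor reading."""
        error = measured_temp_c - self.predict(freq_mhz)
        self.k += lr * error / freq_mhz

def choose_frequency(model: SelfModel, limit_c: float = 85.0) -> float:
    """Highest frequency whose *predicted* temperature stays under the limit."""
    return (limit_c - model.ambient) / model.k

m = SelfModel()
m.update(2000.0, measured_temp_c=110.0)  # chip runs hotter than the model assumed
f = choose_frequency(m)                  # the self-image adapts, so f drops
```

Instead of a fixed worst-case guard band, the control decision here follows the current self-image, so the chip can exploit headroom when conditions allow and back off when they do not.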
About the project
Since January 1, 2018, a German-American research team consisting of scientists from the Technical University of Braunschweig, the Technical University of Munich, and the University of California has been carrying out research and development in the area of the “information processing factory.” The project is funded with around €2 million by the German Research Foundation (DFG) and the U.S. National Science Foundation (NSF).