ELLIIT researchers at Lund University are developing a toolbox to simulate integrated circuits, or chips, before they are manufactured. A chip consists of a large number of heterogeneous components that must all function together. Complexity is increasing, and even a slight mistake may have serious consequences for a company.
We take it for granted today that we can receive and use streamed data. It’s obvious that a camera can record both text and images, sometimes even moving images. We happily switch backgrounds on a digital collaboration platform while remaining visible in the foreground, with sound transferred in real time.
Behind the scenes are skilled and experienced designers of the integrated circuits, or chips, that make it all work, every time, without fail.
“All major companies are now working with data streams, and it’s a fundamental problem here that all systems consist of a complex and heterogeneous mixture of components that process different parts of the signals”, says Jörn Janneck, senior lecturer in the Department of Computer Science at Lund University.
He uses the camera as an example: one part of the chip receives the signal and processes it, another part looks for patterns (many cameras, for example, can zoom in on a face), while other parts carry out image processing and machine learning. Each part of the chip has a well-defined task.
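The division of labour Janneck describes can be pictured as a dataflow pipeline: each stage has one well-defined task and passes its output downstream. A minimal sketch in Python, where the stage names and the "frames" are purely illustrative and not taken from any actual chip design:

```python
# Illustrative dataflow pipeline: each stage is a generator with one
# well-defined task, chained so data streams through one frame at a time.

def capture(frames):
    """First stage: receive the raw signal (here, a list of 'frames')."""
    for frame in frames:
        yield frame

def detect_faces(stream):
    """Second stage: pattern detection, e.g. tag frames containing a face."""
    for frame in stream:
        yield {"frame": frame, "face": "face" in frame}

def enhance(stream):
    """Third stage: image processing on frames flagged by the detector."""
    for item in stream:
        item["enhanced"] = item["face"]  # only enhance frames with a face
        yield item

# Stages are composed like components on a chip: output of one feeds the next.
pipeline = enhance(detect_faces(capture(["sky", "face closeup", "street"])))
results = list(pipeline)
```

The point of the sketch is the structure, not the contents: each stage can be implemented independently, in hardware or software, as long as the stream interface between stages is respected.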
“If a product is to be competitive, it must be small and consume as little power as possible. This means that the complexity of the heterogeneous collection of components is continuously increasing”, Jörn Janneck explains.
Chips must also be sufficiently fast, with some of the processing carried out in hardware and some in software. Where each component goes must be decided early in the design process, along with how the components are to work together.
“This is becoming increasingly complicated as more functions are added. Today the performance of a chip can’t be evaluated or analysed until a late stage of the process”, he says.
This is where Jörn Janneck and his colleagues at ELLIIT come into the picture: in collaboration with the electronics industry they are building a toolbox that can simulate chip function. A first step was taken in 2016 in a project that was part of Eurostars, financed by Vinnova. This project was a collaboration with hardware manufacturer Magillem, software developer Softeam, and École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland.
The Swedish part of the project resulted in a compiler suite named Tÿcho, which enabled a comprehensive view of both hardware and software development.
“We have continued this work and developed a tool that can contribute to the design of many different types of system. It can analyse and characterise each algorithm on a chip. We then build a model in which it is possible to simulate several variants, and obtain feedback about what works and what doesn’t”, says Jörn Janneck.
Decisions can be made earlier in the design process and several versions can be tested without having to build the chip.
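Testing several versions without building the chip can be thought of as a small design-space search: assign each component to hardware or software, estimate the cost of each assignment from a model, and compare. A hypothetical sketch of that idea, where the component names, cost numbers, and cost model are invented for illustration and do not come from the Tÿcho toolset:

```python
from itertools import product

# Hypothetical per-component cost model: (latency, power) for a
# hardware implementation vs. a software implementation.
COSTS = {
    "capture": {"hw": (1, 5), "sw": (4, 2)},
    "detect":  {"hw": (2, 8), "sw": (9, 3)},
    "enhance": {"hw": (3, 6), "sw": (7, 2)},
}

def evaluate(assignment):
    """Sum latency and power over one hw/sw assignment of all components."""
    latency = sum(COSTS[c][kind][0] for c, kind in assignment.items())
    power = sum(COSTS[c][kind][1] for c, kind in assignment.items())
    return latency, power

def best_under_power_budget(budget):
    """Enumerate every hw/sw split and keep the fastest one within budget."""
    best = None
    for kinds in product(["hw", "sw"], repeat=len(COSTS)):
        assignment = dict(zip(COSTS, kinds))
        latency, power = evaluate(assignment)
        if power <= budget and (best is None or latency < best[0]):
            best = (latency, power, assignment)
    return best
```

With the invented numbers above, `best_under_power_budget(15)` would keep the capture and detection stages in hardware but move enhancement to software, trading some speed for power. Real design-space exploration involves far richer models, but the early-feedback loop it enables is the same.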
“Companies that manufacture their own chips have skilled professionals who have been designing integrated circuits for many years. But today they are compelled to make qualified guesses about what will work, and then they add a safety margin to this. Despite this, however, expensive mistakes are sometimes made.”
Such an error in the design of a chip is not only expensive: it also delays the market introduction of new functions and products, which are extremely important for both large and small companies working with streaming data.
Jörn Janneck, senior lecturer at the Department of Computer Science, Lund University, together with three postdocs and two doctoral students.