Utilizing More Data To Improve Chip Design
08 April 2019

Just about every step of the IC tool flow generates some amount of data. But certain steps generate a mind-boggling amount of data, not all of which is of equal value. The challenge is figuring out what’s important for which parts of the design flow. That determines what to extract and loop back to engineers, and when that needs to be done in order to improve the reliability of increasingly complex chips and reduce the overall time to tapeout.

This has become essential in safety-critical markets, where standards require tracking of data analysis. But beyond that, analyzing data can help pinpoint issues in design and manufacturing, ranging from thermal and other physical effects to transient anomalies and latent defects. Alongside that data, new techniques for data analysis are being developed, including the use of digital twins, artificial intelligence and machine learning.

In logic verification flows, the data from testbench code coverage is analyzed exhaustively to show which parts of the RTL version of designs have been exercised, and to what extent they have been tested. This data analysis is critical to the logic verification flow. Other areas of the IC design flow generate a lot of data that traditionally has not been analyzed as exhaustively because it simply provides an indication of whether chips are functioning as expected. That is changing, however.
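As a rough illustration of the kind of analysis described above, the sketch below aggregates per-block line-coverage numbers and flags RTL blocks that fall below a target threshold. The report structure, block names and the 90% threshold are illustrative assumptions, not the output format of any particular verification tool.

```python
# Hypothetical sketch: summarizing coverage data from a verification run
# to flag under-tested RTL blocks. Names and numbers are made up.

from dataclasses import dataclass

@dataclass
class BlockCoverage:
    name: str
    lines_hit: int
    lines_total: int

    @property
    def percent(self) -> float:
        # Fraction of RTL lines exercised by the testbench, as a percentage.
        return 100.0 * self.lines_hit / self.lines_total if self.lines_total else 0.0

def undertested(blocks, threshold=90.0):
    """Return the names of blocks whose line coverage is below the threshold."""
    return [b.name for b in blocks if b.percent < threshold]

blocks = [
    BlockCoverage("alu", 980, 1000),        # 98.0% covered
    BlockCoverage("fifo_ctrl", 412, 500),   # 82.4% -- needs more stimulus
    BlockCoverage("dma_engine", 730, 800),  # 91.25% covered
]
print(undertested(blocks))  # ['fifo_ctrl']
```

In a real flow the inputs would come from a coverage database rather than hard-coded records, but the loop-back principle is the same: extract the metric, rank the gaps, and feed the result back to the verification team.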

“With the increasing emergence of digitalization, autonomy and AI in just about everything, all data is growing in importance,” said Joe Sawicki, executive vice president of IC EDA at Mentor, a Siemens Business. “The growing number of safety standards also requires new levels of tracking processes and mandates documentation of the data generated by all these processes and tools. Increasingly these standards call for extensive testing at all levels of development—from IP to block to chip to PCB to ECU to entire electronic system to end product electrical/mechanical—and that test data needs to be documented.”

It also needs to be looked at in context, which is where some of the new tools become critical. In automotive and some medical devices, for example, one system may need to fail over to another. This is where AI and concepts such as digital twins fit in, because these devices need to be tested in virtual scenarios running real software before the products are manufactured and tested in the real world.

“All that needs to be tested, and the methods of testing and results need to be documented,” Sawicki said, noting that integrated solutions are essential to achieve this—not only to enable design and verification teams to test and collect data at the system level, across engineering disciplines, and as a digital twin, but also to use that data to produce more robust next-generation products and even more efficient ways to manufacture them. “Having more data, and sharing that data between tools, will enable the emerging generation of AI/ML-powered tools to be better trained to perform tasks faster and with greater accuracy—not just for the tools in the traditional IC EDA flow, but for all of the tools involved in building better, more complete digital twins.”
