Article: Verification Planning And Management With Formal

January 30th, 2020 – By: Tom Anderson

Getting a unified look at what formal, simulation, and acceleration/emulation contribute to verification.

Over the last twenty years, formal verification has grown from a niche technology practiced only by specialists to an essential part of mainstream chip development. Along the way, several advances were needed to make wider adoption of formal feasible. These included the standardization of assertion languages, enhanced formal engine performance and capacity, better debug capabilities, and pushbutton “apps” to solve specific problems. In addition, formal tools and their results were integrated with those from traditional simulation tests so that team members and leaders could have a unified view of verification progress.

This may sound obvious, but it was not the case for many years. Even for chip teams using formal tools, it was common to hear management complain that they didn’t know how to gauge what value “that group of PhDs in the corner” was adding to the verification process. For those considering adoption, one of the usual questions was, “If I use formal, what do I not have to test in simulation?” It was hard to answer these concerns without a unified look at what formal, simulation, and acceleration/emulation contributed to verification.

At OneSpin’s recent Osmosis users’ group meeting in Munich, one of our expert users gave an excellent talk addressing exactly this need. Antti Rautakoura is an SoC/ASIC Verification Specialist currently on leave from Nokia to pursue his PhD at Finland’s Tampere University. He began by discussing the incredible complexity of today’s large chips, including the growing reliance on parallelism to compensate for single-thread performance falling behind Moore’s Law. This puts even more pressure on verification teams, whose work typically consumes about two-thirds of the total project effort.