The industry says the focus of new verification efforts is misguided and that only the easy problems are being tackled.
by Brian Bailey, Technology Editor/EDA, Semiconductor Engineering
SoC design traditionally has been an ad hoc process, with implementation occurring at the register transfer level. That is also where verification starts: after the individual blocks have been verified, integration and verification proceed iteratively until the complete system has been assembled.
But today this methodology has at least two major problems, both of which were addressed in a DAC panel titled “Scalable Verification: Evolution or Revolution?” The first is that the constrained random methodology removed processors from the design under verification because they were not fully controllable. That was acceptable 20 years ago, when processors were simple controllers, but today they are an integral part of the design and verification cannot be performed without them. The second is that simulation performance has stopped scaling, forcing verification to migrate onto emulators for integration and even for some block-level verification. Both problems are becoming more acute.
One solution under development within Accellera comes from the Portable Stimulus Working Group, which aims to define a new way of creating stimulus that is portable in several ways. The first is horizontal reuse: the same test intent can be retargeted to virtual prototypes, simulation, emulation, FPGA prototypes and final silicon. The second is vertical reuse, where use cases developed at the system level can be reused for subsystem- or block-level verification. The third combines different aspects of the design, such as software, power and functionality.
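The working group's input format was still being defined at the time of the panel, but the idea behind horizontal reuse can be illustrated with a rough sketch. The Python below is purely hypothetical, not the Accellera syntax: a single abstract piece of test intent (a DMA transfer) is realized one way for an RTL simulation testbench and another way as bare-metal C for post-silicon bring-up. All names here, including DmaTransfer, dma_seq, dma_start and dma_done, are invented for illustration.

# Hypothetical sketch of horizontal reuse: one abstract test intent,
# realized for multiple execution platforms.
from dataclasses import dataclass

@dataclass
class DmaTransfer:
    """Abstract test intent: move a block of data, independent of platform."""
    src: int
    dst: int
    size: int

def realize_for_simulation(action: DmaTransfer) -> str:
    """Render the intent as a UVM-style sequence call for RTL simulation."""
    return (f"dma_seq.start(src=32'h{action.src:08x}, "
            f"dst=32'h{action.dst:08x}, len={action.size});")

def realize_for_silicon(action: DmaTransfer) -> str:
    """Render the same intent as bare-metal C for post-silicon bring-up."""
    return (f"dma_start(0x{action.src:08x}, 0x{action.dst:08x}, {action.size});\n"
            f"while (!dma_done()) ;")

scenario = DmaTransfer(src=0x1000_0000, dst=0x2000_0000, size=4096)
print(realize_for_simulation(scenario))  # drives the testbench in simulation
print(realize_for_silicon(scenario))     # runs on the processor in the lab

The point is that the scenario is written once; only the realization step changes per platform, which is what allows the same tests to follow a design from virtual prototype through to silicon.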
Panelists included Ali Habibi, design verification methodology lead at Qualcomm; Steven Jorgensen, architect for the Hewlett-Packard Networking Provision ASIC Group; Bill Greene, CPU verification manager for ARM; and Mark Glasser, verification architect at Nvidia.