Intelligent Verification, including intelligent testbench automation, is a form of functional verification of electronic hardware designs used to verify that a design conforms to specification before device fabrication. Intelligent verification uses information derived from the design and its specification(s) to expose bugs in and between hardware IPs. Intelligent verification tools require considerably less engineering effort and user guidance to achieve verification results that meet or exceed those of the standard approach of manually writing a testbench program.

The first generation of intelligent verification tools optimized one part of the verification process, regression testing, with a feature called automated coverage feedback. With automated coverage feedback, the test description is automatically adjusted to target design functionality that has not already been verified (or "covered") by existing tests. A key property of automated coverage feedback is that, given the same test environment, the software will automatically change the tests to improve functional design coverage in response to changes in the design.
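
A minimal sketch of this feedback loop, in Python, is shown below. It is a hypothetical illustration rather than any particular tool's implementation: the coverage bins, the run_test stand-in for a simulator run, and the bias values are all invented. The loop repeatedly generates tests, records which bins each test hits, and biases later test generation toward the bins that remain uncovered.

    import random

    # Hypothetical coverage bins; a real tool would derive these from the
    # design and specification rather than from a hard-coded list.
    COVERAGE_BINS = {"fifo_full", "fifo_empty", "burst_write", "error_retry"}

    def run_test(stimulus_bias):
        """Stand-in for one simulation run: returns the set of bins the test hit.
        In a real flow this would drive a logic simulator on the design."""
        return {b for b in COVERAGE_BINS if random.random() < stimulus_bias[b]}

    def coverage_feedback(max_iterations=100):
        uncovered = set(COVERAGE_BINS)
        stimulus_bias = {b: 0.1 for b in COVERAGE_BINS}
        for _ in range(max_iterations):
            if not uncovered:
                break  # all bins covered; stop generating tests
            hit = run_test(stimulus_bias)
            uncovered -= hit
            # Feedback step: raise the generation bias for bins not yet
            # covered, so later tests target the remaining functionality.
            for b in uncovered:
                stimulus_bias[b] = min(1.0, stimulus_bias[b] + 0.2)
        return COVERAGE_BINS - uncovered

    if __name__ == "__main__":
        covered = coverage_feedback()
        print("covered", len(covered), "of", len(COVERAGE_BINS), "bins")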

Newer intelligent verification tools are able to derive the essential functions one would expect of a testbench (stimulus, coverage, and checking) from a single, compact, high-level model. Using a single model that represents and resembles the original specification greatly reduces the chance of human error in the testbench development process; such errors can lead to both missed bugs and false failures.
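
As a rough picture of the single-model idea, the sketch below is again hypothetical: the ProtocolRule class and the example bus rules are invented for illustration. One compact description of legal transactions is declared once, and a stimulus generator, a coverage goal list, and a checker are all derived from it instead of being coded separately.

    import random
    from dataclasses import dataclass

    @dataclass
    class ProtocolRule:
        """One compact, specification-like statement about legal transactions."""
        name: str
        legal_opcodes: tuple
        max_length: int

    # Single high-level model: written to resemble the original specification
    # of a made-up bus protocol.
    MODEL = ProtocolRule(name="simple_bus", legal_opcodes=("READ", "WRITE"), max_length=8)

    def generate_stimulus(model, count=10):
        """Stimulus derived from the model: only legal transactions are produced."""
        return [{"opcode": random.choice(model.legal_opcodes),
                 "length": random.randint(1, model.max_length)}
                for _ in range(count)]

    def coverage_goals(model):
        """Coverage derived from the same model: each opcode plus the length extremes."""
        return [("opcode", op) for op in model.legal_opcodes] + \
               [("length", 1), ("length", model.max_length)]

    def check(model, transaction):
        """Checking derived from the same model: flag anything the rule forbids."""
        assert transaction["opcode"] in model.legal_opcodes, "illegal opcode"
        assert 1 <= transaction["length"] <= model.max_length, "illegal length"

    if __name__ == "__main__":
        for txn in generate_stimulus(MODEL):
            check(MODEL, txn)
        print("coverage goals:", coverage_goals(MODEL))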

Other properties of intelligent verification may include:

  • Providing verification results on par with or better than those of a hand-written testbench program, while being driven by a compact high-level model
  • Applicability to all levels of simulation, reducing reliance on testbench programs
  • Eliminating opportunities for programming errors and divergent interpretations of the specification, especially between IP and SoC teams
  • Providing direction as to why certain coverage points were not reached
  • Automatically tracking paths through the design structure to coverage points in order to create new tests
  • Ensuring that each aspect of the design is verified only once within a given set of tests
  • Scaling tests automatically for different hardware and software configurations of a system
  • Support for different verification methodologies (constrained random, directed, graph-based, and use-case based) in the same tool, as sketched below
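
One way to picture the last point is a single test intent rendered under more than one methodology. The sketch below is hypothetical (the TEST_INTENT structure and both generators are invented): the same declared intent is expanded either as a directed sequence or as constrained-random traffic, which is roughly what a multi-methodology tool automates.

    import random

    # Hypothetical test intent shared by every methodology: which operations
    # to exercise and how many transactions to produce.
    TEST_INTENT = {"operations": ["READ", "WRITE", "FLUSH"], "count": 6}

    def directed_tests(intent):
        """Directed expansion: walk the listed operations in a fixed order."""
        ops = intent["operations"]
        return [ops[i % len(ops)] for i in range(intent["count"])]

    def constrained_random_tests(intent, seed=0):
        """Constrained-random expansion: random choices limited to legal operations."""
        rng = random.Random(seed)
        return [rng.choice(intent["operations"]) for _ in range(intent["count"])]

    if __name__ == "__main__":
        print("directed:          ", directed_tests(TEST_INTENT))
        print("constrained random:", constrained_random_tests(TEST_INTENT))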

"Intelligent Verification" uses existing logic simulation testbenches, and automatically targets and maximizes the following types of design coverage:

History

Achieving confidence that a design is functionally correct continues to become more difficult. To counter this problem, fast logic simulators and specialized hardware description languages such as Verilog and VHDL became popular in the late 1980s. In the 1990s, constrained random simulation methodologies emerged, built on hardware verification languages such as Vera[1] and e, and later on SystemVerilog (introduced in 2002), to further improve verification quality and reduce verification time.

Intelligent verification approaches supplement constrained random simulation methodologies, which base test generation on external input rather than on design structure.[2] Intelligent verification is intended to automatically utilize design knowledge during simulation, which has become increasingly important over the last decade due to increased design size and complexity, and to the growing separation between the engineering team that creates a design and the team that verifies its correct operation.[1]

There has been substantial research into the intelligent verification area, and commercial tools that leverage this technique are just beginning to emerge.

