Dealing with the Three Horrible Problems in Verification Prof. David L. Dill Department of Computer Science Stanford University
An excursion out of the ivory tower 0-In, July 1996, initial product design discussions:  There are  three horrible problems in verification: Specifying the properties to be checked Specifying the environment Computational complexity of attaining high coverage Up to then, I had assumed that the first two were someone else’s problem, and focused on the last. I still think this is a reasonable framework for thinking about verification.
Topics Mutation coverage (Certess) System-Level Equivalence Checking (Calypto) Integrating verification into early system design (research). Conclusions
Typical verification experience (Chart: bugs found per week vs. weeks, based on fabricated data; phases labeled functional testing, tapeout, and purgatory.)
Coverage Analysis: Why? What aspects of design haven’t been exercised? Guides test improvement How comprehensive is the verification so far? Stopping criterion Which aspects of the design have not been well-tested? Helps allocate verification resources.
Coverage Metrics A metric identifies important  structures in a design representation HDL lines, FSM states, paths in netlist classes of behavior Transactions,  event sequences Metric classification based on level of representation. Code-based  metrics (HDL code) Circuit structure-based metrics (Netlist) State-space based metrics (State transition graph) Functionality-based metrics (User defined tasks) Spec-based metrics (Formal or executable spec)
Code-Based Coverage Metrics On the HDL description: Line/code block coverage, Branch/conditional coverage, Expression coverage, Path coverage. Useful guide for writing test cases; little overhead; inadequate in practice. Example:
    always @(a or b or s) // mux
    begin
      if (~s && p)
        d = a;
      else if (s)
        d = b;
      else
        d = 'bx;
      if (sel == 1)
        q = d;
      else if (sel == 0)
        q = z;
    end
Circuit Structure-Based Metrics Toggle coverage: Is each node in the circuit toggled? Register activity: Is each register initialized? Loaded? Read? Counters: Are they reset? Do they reach the max/min value? Register-to-register interactions: Are all feasible paths exercised? Datapath-control interface: Are all possible combinations of control and status signals exercised? (0-In checkers have these kinds of measures.) (Figure: control FSM with states s_init, s2–s6 driving a datapath.)
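Several of these structural targets can be written directly as cover properties. A minimal SystemVerilog sketch (the counter module and its signal names are hypothetical):
    module counter_cov (input logic clk, rst_n, input logic [7:0] cnt);
      // Counter reset coverage: was reset ever exercised?
      cover property (@(posedge clk) !rst_n);
      // Max/min value coverage: does the counter ever hit its extremes?
      cover property (@(posedge clk) cnt == 8'hFF);
      cover property (@(posedge clk) cnt == 8'h00);
    endmodule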
Observability problem A buggy assignment may be stimulated, but still missed. Examples: Wrong value generated speculatively, but never used. Wrong value computed and stored in a register, read 1M cycles later, but simulation doesn't run that long.
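A minimal sketch of the second case (module and signal names hypothetical): the bug is activated and latched, but never propagates to an observed output within the simulation run:
    module obs_example (input clk, start, read_en,
                        input [7:0] a, b,
                        output reg [7:0] out);
      reg [7:0] r;
      always @(posedge clk) begin
        if (start)
          r <= a - b;   // BUG: spec says a + b. Wrong value is stored...
        if (read_en)    // ...but read_en fires ~1M cycles later, after the
          out <= r;     // simulation stops, so no checker ever sees it.
      end
    endmodule
Line coverage reports the buggy assignment as covered even though the bug was never detected.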
Detection terminology To detect a bug: stimuli must activate the buggy logic. (Diagram: verification environment drives stimuli into the design under verification, whose outputs are compared against a reference model; a bug sits inside the design.)
Detection terminology To detect a bug: stimuli must activate the buggy logic; the bug must propagate to a checker.
Detection terminology To detect a bug: stimuli must activate the buggy logic; the bug must propagate to a checker; the checker must detect the bug.
Detection terminology Traditional verification metrics measure only activation; they do not account for non-propagated or non-detected bugs, and give no visibility into propagation or detection.
Mutation testing To evaluate a testbench's bug detection ability: Inject fake bugs into the design (“mutations”). Simulate and see whether they are detected. If not, there is a potential gap in the testbench. There can be many kinds of mutations: “stuck-at” faults, wrong logical or other operators. The idea originates in software testing, but is obviously related to testability. Efficient implementation is a challenge.
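A sketch of one operator mutation (hypothetical arbiter; a mutation tool would rewrite the design automatically):
    module arbiter (input req_a, busy, output grant);
      // Original design: assign grant = req_a & ~busy;
      // Injected mutant replaces '&' with '|':
      assign grant = req_a | ~busy;
    endmodule
If the full regression suite still passes with this mutant in place, the testbench cannot distinguish the mutant from the original, revealing a gap in activation, propagation, or checking.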
Certess approach to Mutation Analysis Three stages, each producing a report: Fault Model Analysis (static analysis of the design); Fault Activation Analysis (analysis of the verification environment behavior); Qualify the Verification Environment (measure its ability to detect mutations). Iterate if needed.
Avoiding the horrible problems Qualify test framework, not design Environment/properties are in existing test bench. High-quality coverage metric targets resources at maximizing useful coverage.
SEC Advantages SEC vs. Simulation: Simulation is resource intensive, with lengthy run times; SEC runs orders of magnitude faster than simulation. Vector generation is labor intensive and may itself be a source of errors; SEC requires minimal setup and no test vectors. Simulation output often requires further processing for answers; SEC is exhaustive (all sequences over all time). SEC vs. Property Checkers: Properties are created to convey specification requirements; SEC uses the golden model as the specification. Properties are often incomplete, and not independently verifiable. Properties are time consuming to construct.
Enabling ESL™ SLEC™ SLEC comprehensively proves functional equivalence, identifies design differences (bugs), and supports sequential design changes: state changes, temporal differences, I/O differences. (Diagram: reference model =? implementation model.)
SLEC Finds Functional Differences in C-C Verification Customer example: Verify the HLS model is functionally equivalent to the reference model. Simulation uncovered no differences, for the given testbench. SLEC System found differences between the two models: the reference model was incorrect, a probable corner case not easily detectable by simulation. SLEC System finds all possible errors or inconsistencies; simulation is not exhaustive, and therefore cannot fully prove equivalence. Typical functional differences introduced during refinement: code optimization for HLS, datapath word-size optimization, ambiguous ESL code (e.g., out-of-bounds array access). (Diagram: behavioral C/C++ reference model and HLS C/C++ model, each in a wrapper, checked under user-defined input constraints; C-to-C verification fails with “DIFFERENCE FOUND!”)
Design Bugs Caught by SLEC System
Bug Found | Application
High-level synthesis bug in “wait_until()” interpretation | DCT function
High-level synthesis bug in array access range | Wireless baseband
Design bug in logic on asynchronous reset line | Video processing
Design bug in normalization of operands | Custom DSP block
Design bug at proof depth = 1 | Image resizer
High-level synthesis bug in sign extension | Video processing
System-level Formal Verification Sequential Logic Equivalence Checking (SLEC) leverages system-level verification: comprehensive verification (100% coverage); quick setup (no testbenches required); rapid results (eliminates long regressions); focused debug (short counter-examples). Why is it needed: Independent verification. Find bugs caused by language ambiguities or incorrect synthesis constraints (e.g., shift left by −1, divide by zero). Verify RTL ECOs. Parallels current RTL synthesis methodology.
High-level Synthesis Bugs Found by SLEC
Bug Found | Application
Shift by an integer in the C code could be a shift by a negative number, which is undefined in C | Multimedia processor
Dead-end states created | Multimedia processor
Combinational loops created | FFT
Shift left or right by N bits when the value being shifted is less than N bits wide | Ultra-wideband filter
Divide by zero defined in RTL, but undefined in C code | Quantize
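A sketch of the shift mismatch in the first row (the module and signal widths are hypothetical): Verilog fully defines oversized shift amounts, while the corresponding C expression is undefined behavior, so the synthesized RTL and the C reference can legally disagree:
    module shift_ex (input [7:0] x, input [3:0] n, output [7:0] y);
      // Verilog: shift amounts are unsigned, and shifting an 8-bit value
      // by n >= 8 is well defined (the result is 0).
      assign y = x << n;
    endmodule
    // The reference C statement `y = x << n;` is undefined behavior when
    // n is negative or n >= the bit width, so even the "golden" model can
    // be wrong: exactly the class of bug listed above.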
RTL to RTL Verification with Sequential Differences RTL Pipelining: latency and throughput changes; clock-speed enhancement. (Diagram: the same cmd/data/calc/out transaction stream through an unpipelined datapath and a pipelined one; SLEC reports verified equivalent or produces a counter-example.)
RTL to RTL Verification with Sequential Differences RTL Resource Sharing: state and latency changes; size optimization. (Diagram: Sum = A + B + C computed with two adders vs. one shared adder with clk/reset; SLEC reports verified equivalent or produces a counter-example.)
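A minimal sketch of the sharing transformation in the diagram (the widths and the input-hold assumption are mine):
    // Reference: two adders, sum registered in one cycle.
    module sum3_ref (input clk, input [7:0] a, b, c, output reg [7:0] sum);
      always @(posedge clk) sum <= a + b + c;
    endmodule

    // Implementation: one adder shared over two cycles (an extra
    // register and one extra cycle of latency; assumes the inputs are
    // held stable across both cycles).
    module sum3_shared (input clk, input [7:0] a, b, c, output reg [7:0] sum);
      reg [7:0] partial;
      always @(posedge clk) begin
        partial <= a + b;        // cycle 1: first addition
        sum     <= partial + c;  // cycle 2: reuse the same adder
      end
    endmodule
A combinational checker cannot match these designs (the state elements differ); a sequential checker can, given the one-cycle latency difference.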
RTL to RTL Verification with Sequential Differences RTL Re-Timing: state changes; slack adjustment. Allows micro-architecture modifications without breaking testbenches. (Diagram: a register moved across combinational logic to balance path delays between clocks C1 and C2; SLEC reports verified equivalent or produces a counter-example.)
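A sketch of one retiming move (the logic is hypothetical): the flop is pushed back across the multiplier, splitting the long add-multiply path while keeping the same cycle-level I/O behavior:
    // Before: the whole add-multiply path sits in front of the flop.
    module pre_retime (input clk, input [7:0] a, b, output reg [7:0] q);
      always @(posedge clk) q <= (a + b) * 3;
    endmodule

    // After: the flop is moved between the adder and the multiplier.
    // Internal state points no longer match (s holds a+b, not the final
    // product), so combinational checking fails, but the value of q at
    // each clock edge is unchanged; a sequential checker proves this.
    module post_retime (input clk, input [7:0] a, b, output [7:0] q);
      reg [7:0] s;
      always @(posedge clk) s <= a + b;
      assign q = s * 3;
    endmodule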
Designer's Dilemma: Efficient Design for Power At 90nm and below, power is becoming the most critical design constraint: exponential increase in leakage power consumption; quadratic increase in power density. Clock gating is the most common design technique used for reducing power: designers manually add clock gating to control dynamic power. Clock gating is most efficiently done at the RTL level, but is error prone: mistakes in implementation cause delays and re-spins. Difficult to verify with simulation regressions: requires modifications to testbenches; insufficient coverage of clock-gating dependencies. Aggressive clock-gating approaches are sometimes rejected due to verification complexity.
Addressing Power in the Design Flow Power management schemes are considered globally as part of the system model and initial RTL functionality: sleep modes, power down. Power optimizations are local changes made to RTL that do not affect the design functionality: disabling previous pipeline stages when the data is not used; data-dependent computation, like multiply by zero. (Flow: system model → RTL via high-level synthesis or manual creation → manual RTL optimization → optimized RTL → physical implementation.)
Combinational clock gating is verified with combinational equivalence checking; sequential clock gating requires sequential equivalence checking. (Diagram: registers with clock-gating (CG) cells driven by en and clk, in combinational and sequential gating configurations.)
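A sketch of the distinction (modules and signals hypothetical): combinational gating derives the enable from current-cycle signals only, so register next-state functions still match; sequential gating uses information from other cycles, so register contents diverge and only the observable outputs stay equivalent:
    // Ungated reference: the stage-2 register loads every cycle.
    module stage2_ref (input clk, input v1, input [7:0] s1,
                       output reg v2, output reg [7:0] s2);
      always @(posedge clk) begin
        v2 <= v1;
        s2 <= s1 + 1;        // some stage-2 computation
      end
    endmodule

    // Sequentially clock-gated: load only when stage 1 had valid data.
    // On idle cycles s2 holds stale data, so the next-state functions
    // differ and a combinational check fails; but downstream logic only
    // reads s2 when v2 is high, so a sequential checker can prove the
    // observable behavior equivalent.
    module stage2_gated (input clk, input v1, input [7:0] s1,
                         output reg v2, output reg [7:0] s2);
      always @(posedge clk) begin
        v2 <= v1;
        if (v1) s2 <= s1 + 1;
      end
    endmodule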
Research Verification currently based on finding and removing bugs. Finding bugs earlier in the design process would be beneficial Early descriptions (protocol, microarchitecture) are smaller, more tractable Early bugs are likely to be serious, possibly lethal Bug cost goes up by >10x at each stage of design. People have been saying this for years. Why can’t we start verifying at the beginning of the design?
An Experiment DARPA-sponsored “Smart Memories” project starting up Have verification PhD student (Jacob Chang) work with system designers Try to verify subsystems as soon as possible. Understand what “keeps designers awake at night.” Try to understand “Design for verification” (willing to trade off some system efficiency for verification efficiency).
Initial Results: Dismal Used a variety of formal verification tools: SRI's PVS system, Cadence SMV. Did some impressive verification work, but didn't help the design much: by the time something was verified, the design had changed. We knew this would be a problem, but our solutions weren't good enough.
Desperate measures required We discarded tools, used pen and paper This actually helped! Real bugs were found Design principles were clarified Designers started listening
What did we learn? Early verification methods need to be nimble Must be able to keep up with design changes. Existing formal methods are not nimble. Require comprehensive descriptions High level of abstraction helps… But one description still takes on too many issues So design changes necessitate major changes in descriptions – too slow!
Approach: Perspective-based Verification Need to minimize the number of issues that we tackle at one time. Perspective :  Minimal high-level formalization of a design to analyze a particular class of properties. Perspectives should be based on designer’s abstractions What does he/she draw on the whiteboard? Should capture designer’s reasoning about correctness
Example: Resource dependencies Verify Cache Coherence message system is deadlock free Model Dependency graph and check for cycles Analysis method Search for cycles In this case: by hand! System-level deadlocks are notoriously hard to find using conventional formal verification tools.
Dependency graph (cache & memory)
Resource Dependency Perspective Partial formalization of design. Relevant details: request buffer dependencies; protocol dependencies (e.g., cancel must follow all other SyncOp commands); virtual channels in networks. Irrelevant details: network ordering requirements; cache controller ordering requirements; buffer implementation. One class of verification properties: deadlock freedom. Captures why the property is correct: ensure no cycle in the resource dependency graph.
Bug found A dependency cycle was found once the dependency behavior of the virtual channels, memory controller, and cache controllers was taken into account. Easy to find once the formal model is constructed; hard to find using simulation, since all channels must be congested. Bug found before implementation. (Diagram: cycle through cache controller SyncMiss, memory controller SyncOp successful/unsuccessful, wake-up, and replay messages.)
Parallel Transaction Perspective Many systems process a set of transactions Memory reads/writes/updates Packet processing/routing User thinks of transactions as non-interfering processes Hardware needs to maintain this illusion. Model: State transaction diagram Analysis: Systematically check whether one transaction can interfere with another. Several important bugs were found by manually applying this method.
Parallel Transaction Perspective Partial formalization of design. Relevant details: effect of a transition on self and others; locking mechanism. Irrelevant details: timing and ordering information; buffering issues; deadlock issues. Targets one verification property: a single process behaves the same in a multi-process environment. Captures why the property is correct: interrupts are conflict free.
Transaction Diagram Verifier Tool developed for verification of the parallel transaction perspective. User input: invariants, transition guards, transition state changes. The invariants should be easy to see true for a single process. TDV verifies each invariant for a single process, plus that the invariants remain true even if other processes execute at the same time.
TDV User supplies: blocks (transaction steps) with pre-conditions, post-conditions, guards, and assignments; links between blocks (control flow). The tool loops through all pairs of blocks, constructs the verification tasks, and discharges them with another tool, the STP decision procedure. Not a model checker: verifies an unbounded number of transactions using theorem-proving technology.
Tradeoffs Sacrifices must be made: perspectives are necessarily partial; not easy to link perspectives to RTL; not easy to link perspectives to each other. But, at least, you can verify or find bugs while they're still relevant to the design!
The horrible problems Perspectives omit irrelevant details, including irrelevant environmental constraints. Properties are at the level the designer thinks, so they are easier to extract. Computational complexity is reduced as well.
Conclusions Practical verification technology must take account of the three horrible problems. Products currently on the market do this in innovative ways: coverage analysis that is a closer match to actual bug-finding ability and evaluates the existing verification environment; system-level equivalence checking that avoids the need to add assertions and reduces the environmental-constraint problem. We need a new perspective on system-level verification.

Editor's Notes

  • #24–#26: When the initial block of a design does not meet timing, engineers must transform the RTL into a faster implementation. A common technique is re-timing: by moving logic around, long paths can be reduced. However, moving logic around completely changes the value/meaning of the state elements, and combinational formal techniques do not support these changes. SLEC handles this type of design easily: outside of the changes, the two designs should have identical functionality, with no requirement for internal state points to map/match.