Introduction to
Verification of VLSI
Design
and
Functional Verification
DrUshaMehta02-08-2019
Dr Usha Mehta
usha.mehta@ieee.org
usha.mehta@nirmauni.ac.in
Acknowledgement…..
This presentation has been summarized from various
books, papers, websites and presentations on VLSI
Design and its various topics from all over the world. I
could not mention item by item where this large
pool of hints and material comes from. However, I would like
to thank all the professors and scientists who have created
such good work in this emerging field. Without their
efforts in this emerging technology, these notes
and slides could not have been finished.
System Design Flow
Design Flow
VLSI Design Flow
Source of Errors
• Errors in Specification
• Unspecified Functionality
• Conflicting requirements
• Unrealized features
• No model to check against, as the specification sits at the
top of the abstraction hierarchy
• Errors in Implementation
• Human error in interpreting design functionality
How to reduce human-introduced
errors in interpretation?
• Automation
• Poka-Yoke
• Automation
• The obvious way to reduce human-introduced
error
• Not always possible, especially when the
processes are not well defined and require
human ingenuity and creativity
• Poka-Yoke
• A Japanese term that means "mistake-
proofing" or "inadvertent error prevention"
• A step toward full automation, but not
complete automation
• Human intervention is needed only to decide
on the particular sequence or steps required
to obtain the desired results
• Verification nowadays remains an art.
Redundancy
• The most costly but highly effective approach
• Most widely used for ASICs
Reconvergence Model
It consists of the following steps:
1. Creating the design at a higher level of abstraction
2. Verifying the design at that level of abstraction
3. Translating the design to a lower level of abstraction
4. Verifying the consistency between steps 1 and 3
5. Repeating steps 2, 3 and 4 until tapeout
The transformation can be any process, for example:
• RTL coding from a specification
• Insertion of a scan chain
• Synthesizing RTL code into a gate-level netlist
• Synthesizing a gate-level netlist into a layout …..
Do you recall?
Verification Methods
• Functional Verification
• Formal Verification
• Equivalence Checking
• Model Checking
• Semiformal Verification
• Assertion Based Methods
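As a small illustration of the assertion-based style, here is a hedged SystemVerilog sketch; the module and signal names (req, gnt) are invented for illustration, not taken from any particular design:

```verilog
// Hypothetical checker: every request must be granted within 1 to 3 cycles.
// Signal names req/gnt are assumptions for illustration only.
module arb_checker(input clk, input req, input gnt);
  // Concurrent assertion, evaluated on every rising clock edge
  assert property (@(posedge clk) req |-> ##[1:3] gnt)
    else $error("req was not followed by gnt within 3 cycles");
endmodule
```

Such assertions run alongside simulation and can also be targeted by formal tools.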
Verification Techniques
• Simulation (functional and timing)
• Behavioral
• RTL
• Gate-level (pre-layout and post-layout)
• Switch-level
• Transistor-level
• Formal Verification (functional)
• Binary Decision Diagrams
• Equivalence Checking
• Model Checking
• Static Timing Analysis (timing)
Functional Verification Approaches:
Black Box Approach
• Cannot look into the design
• Functional verification to be performed
without any internal implementation
knowledge
• Through available interfaces only, no
internal state access
• Examples:
• Check a multiplier by supplying random
numbers to multiply
• Check a braking system by hitting the
brakes at different speeds
Black Box…..
• Advantages
• Independent of implementation
• Verification can proceed in parallel with the design
process
• Less effort and time consumed
• Disadvantages
• Lack of visibility and controllability
• Difficult to set up interesting states/combinations
• Difficult to locate the source of a problem
• Difficulty rises when there is a long delay
between the occurrence of a problem and its
symptom becoming visible
Functional Verification Approaches
• White Box
• Intimate knowledge and control of the internals of a
design
• This approach can ensure that implementation-
specific features behave properly
• A pure white-box approach is used at system
level, where modules are treated like black boxes
but the system itself is treated like a white box.
• Grey Box
• Black-box test cases written with full knowledge
of internal details.
• Mostly written to increase code coverage
Test Bench
• A testbench mimics the environment in which the design
will reside.
• It checks whether the RTL implementation meets the
design spec or not.
• This environment creates invalid and unexpected as well
as valid and expected conditions to test the design.
• It performs three functions:
• Generate the stimulus for simulation
• Apply this stimulus to the module under test and collect the
output response
• Compare the output response with expected golden
values
Test Bench Architecture
Block diagram: an input generator drives both the Design Under Verification and a golden output generator; a comparator compares the two outputs and reports Pass/Fail.
Input Generation
• Repetitive Input Generation
• Using specific syntax/code
• Using a counter/LFSR etc.
• Directed Input Generation
• By explicitly writing the input patterns
• Using a text file
• Very lengthy and time-consuming method
• Very narrow but focused coverage
• Random Input Generation
• Using specific syntax/code
• Very simple and speedy process
• Very broad but shallow coverage
Repetitive Waveform Generator

// A one-shot reset sequence
initial
begin
  reset = 0;
  #100 reset = 1;
  #80 reset = 0;
  #30 reset = 1;
end

// A free-running clock using an always block
module check_clock(my_clk2);
  output my_clk2;
  reg my_clk2;
  parameter tp = 10;
  initial
    my_clk2 = 0;
  always
    #tp my_clk2 = ~my_clk2;
endmodule

// A clock built from a single NOR gate fed back on itself
module my_clock(my_clk);
  output my_clk;
  reg start;
  initial
  begin
    start = 1;
    #5 start = 0;
  end
  nor #2 (my_clk, start, my_clk);
endmodule
Directed Input Generator
module adder(a,b,c); //DUT code start
  input [15:0] a;
  input [15:0] b;
  output [16:0] c;
  assign c = a + b;
endmodule //DUT code end

module top(); //TestBench code start
  reg [15:0] a;
  reg [15:0] b;
  wire [16:0] c;
  adder DUT(a,b,c); //DUT instantiation
  initial
  begin
    a = 16'h45; //apply the stimulus
    b = 16'h12;
    #10 $display(" a=%0d,b=%0d,c=%0d",a,b,c);
    //send the output to the terminal for visual inspection
  end
endmodule //TestBench code end
Directed Input Generation
• Using a text file containing inputs and
expected outputs
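A minimal sketch of this approach using Verilog's $readmemh, driving the adder from earlier slides; the file name vectors.txt and its word layout are assumptions for illustration:

```verilog
// Sketch: read stimulus and expected values from a hex text file.
// Each line of vectors.txt is one 49-bit word: {a[15:0], b[15:0], expected_c[16:0]}.
module top;
  reg  [15:0] a, b;
  wire [16:0] c;
  reg  [48:0] vec [0:99];   // storage for up to 100 vectors
  integer i;
  adder DUT(a, b, c);
  initial begin
    $readmemh("vectors.txt", vec);        // load stimulus + expected values
    for (i = 0; i < 100; i = i + 1) begin
      a = vec[i][48:33];                  // apply the stimulus
      b = vec[i][32:17];
      #10 if (c !== vec[i][16:0])         // compare against expected value
        $display("mismatch at vector %0d: c=%0d expected=%0d",
                 i, c, vec[i][16:0]);
    end
  end
endmodule
```

The expected outputs live in the same file as the inputs, so the check is automatic rather than visual.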
Random Input Generation
module adder(a,b,c); //DUT code start
  input [15:0] a,b;
  output [16:0] c;
  assign c = a + b;
endmodule //DUT code end

module top(); //TestBench code start
  reg [15:0] at;
  reg [15:0] bt;
  wire [16:0] ct;
  adder DUT(at,bt,ct); //DUT instantiation
  initial
    repeat(100) begin
      at = $random; //apply random stimulus
      bt = $random;
      #10 $display(" a=%0d,b=%0d,c=%0d",at,bt,ct);
    end
endmodule //TestBench code end
Constraint-Based Random Generation
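Constraint-based random generation is typically written in SystemVerilog using rand variables and constraint blocks; this is only a sketch, and the class name and fields (packet, addr, len) are invented for illustration:

```verilog
// Sketch of SystemVerilog constrained-random stimulus (illustrative names).
class packet;
  rand bit [15:0] addr;
  rand bit [3:0]  len;
  // Constrain the solver: word-aligned addresses, length between 1 and 8
  constraint legal { addr[1:0] == 2'b00; len inside {[1:8]}; }
endclass

module top;
  initial begin
    packet p = new();
    repeat (5) begin
      if (!p.randomize())                       // solver picks legal values
        $display("randomize failed");
      else
        $display("addr=%h len=%0d", p.addr, p.len);
    end
  end
endmodule
```

The constraints keep the broad reach of random generation while excluding illegal stimulus.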
How to check the results…
• Use waveform viewers for debugging the
design, not the testbench.
• Most testbench operations execute in zero
time, where a waveform viewer is not
helpful.
• Check results in the message window
• Store results in a log file
Writing outputs to a text file
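A minimal sketch of logging results with Verilog file I/O ($fopen/$fdisplay), again reusing the adder DUT; the log file name results.log is an assumption:

```verilog
// Sketch: record each stimulus/response pair in a log file for later review.
module logger_tb;
  reg  [15:0] a, b;
  wire [16:0] c;
  integer f;
  adder DUT(a, b, c);
  initial begin
    f = $fopen("results.log");                         // open the log file
    repeat (10) begin
      a = $random;                                     // apply random stimulus
      b = $random;
      #10 $fdisplay(f, "a=%0d b=%0d c=%0d", a, b, c);  // write one line per vector
    end
    $fclose(f);                                        // flush and close the file
  end
endmodule
```

The log file survives the simulation run, unlike transient message-window output.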
Self-Checking Testbenches
module top(); //TB code start
  reg [15:0] a;
  reg [15:0] b;
  wire [16:0] c;
  adder DUT(a,b,c); //DUT instantiation
  initial
    repeat(100) begin
      a = $random; //apply random stimulus
      b = $random;
      #10
      $display(" a=%0d,b=%0d,c=%0d",a,b,c);
      if (a + b != c) // monitor logic
        $display(" *ERROR* ");
    end
endmodule //TB code end
Simulation Based
Functional Verification Flow
Limitations of Functional Verification
• Large numbers of simulation vectors are
needed to provide confidence that the design
meets the required specifications.
• Logic simulators must process more events
for each stimulus vector because of
increased design size and complexity.
• More vectors and larger design sizes cause
increased memory swapping, slowing down
performance.
• Once the behavioural design is verified, many
small non-functional modifications to the RTL
are still required.
• Ideally, each such modification should be
followed by a round of verification, which is not
practical.
Examples of Non-Functional
Changes in RTL of Design
• Adding clock gating circuitry for power reduction
• Restructuring critical paths
• Reorganizing logic for area reduction
• Adding test logic (scan circuitry) to a design
• Reordering a scan chain in a design
• Inserting a clock tree into a design
• Adding I/O pads to the netlist
• Performing design layout
• Performing flattening and cell sizing
Formal Verification Methods
• A technique to prove or disprove the functional
equivalence of two designs.
• The techniques used are static and do not require
simulation vectors.
• You only need to provide a functionally correct, or "golden",
design (called the reference design), and a modified version
of the design (called the implementation).
• By comparing the implementation against the reference
design, you can determine whether the implementation is
functionally equivalent to the reference design.
• Methods
• Equivalence Checking
• Model Checking
Linting
• It finds common programmer mistakes
• It allows the programmer to find mistakes quickly and
efficiently, very early, instead of waiting at the end for the full
program to fail
• Checks for static errors, potential errors and coding-
style guideline violations.
• Static errors: errors that do not require input
vectors.
• E.g.
• A bus without a driver,
• a mismatch of port width between module definition and
instantiation,
• a dangling input of a gate.
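A contrived snippet showing the kinds of static error a lint tool flags without any input vectors (the module names here are invented for illustration):

```verilog
module sub(input [7:0] d, output [7:0] q);
  assign q = d;
endmodule

module top;
  wire [3:0] narrow;   // lint warning: 'narrow' has no driver
  wire [7:0] out;
  // Lint warning: 4-bit net connected to the 8-bit port 'd'
  sub u1(.d(narrow), .q(out));
endmodule
```

Both issues are visible from the source alone, so lint catches them before any simulation is run.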
Simulator
• The most common and familiar verification tool.
• Its role is limited to approximating reality.
• Simulators attempt to create an artificial universe that mimics the
future real design.
• This lets the designer interact with the design before it is
manufactured and correct flaws and problems earlier.
• Functional correctness and accuracy is a big issue, as errors cannot
be proven not to exist.
• A simulator builds a computing model of the circuit, executes the
model for a set of input signals (stimuli, patterns or vectors),
and verifies the output signals.
• Limitations of simulation
• Timing issues with the simulator.
• The simulator can never mimic the real signals, where actual electrons
flow at near the speed of light.
• Cannot be exhaustive for non-trivial designs
• Performance bottleneck
Simulators
at different abstraction level
• System level – everything: electrical, mechanical,
optical, etc.
• Behavioral level – algorithm or data-flow graph
in an HDL
• Instruction-set level – for CPUs
• Register transfer level + combinational level
• Gate level – gate as the basic element
• Switch level – transistor as a switch
• Circuit level – current and voltage parameters
• Device level – fabrication parameters
• Timing simulation – timing model
• Fault simulation – checks a test vector for faults
RTL Level Simulators
Type: Event Driven
• Event: a change in logic value at a node at a certain
instant of time → (V, T)
• Performs both timing and functional verification
– All nodes are visible
– Glitches are detected
• Most heavily used and well suited for all types of designs
• Uses a timewheel to manage the relationship between
components
• Timewheel = list of all events not yet processed, sorted in
time (complete ordering)
• When an event is generated, it is put at the appropriate
point in the timewheel to ensure causality
RTL Level Simulators
Type: Cycle Based
• Takes advantage of the fact that most digital
designs are largely synchronous (state
elements change value on the active edge of
the clock)
• Computes the steady-state response of the
circuit
– at each clock cycle
– at each boundary node
• Only boundary nodes are evaluated
Diagram: combinational logic between two banks of latches; the latch inputs/outputs are boundary nodes, while nodes inside the combinational logic are internal nodes.
Comparison: Event Driven vs. Cycle Based
• Cycle-based is 10x-100x faster than event-driven
(and uses less memory)
• Cycle-based does not detect glitches and
setup/hold time violations, while event-driven
does.
• Cycle-based:
– Only boundary nodes
– No delay information
• Event-driven:
– Each internal node
– Needs scheduling, and
functions may be
evaluated multiple times
Common Simulators used in
Industry…
• NC-Sim
• Verilog-XL
• VCS
• Modelsim
• More…..
Co-Simulators….
• VHDL-Verilog
• Analog-Digital
• Hardware-Software…
• Performance is reduced by communication and
synchronization overhead.
• Translating events and values from one simulator to the other
can create ambiguities
Waveform Viewer
• It plays back the events that occurred
during the simulation and were recorded in
a trace file
• Recording waveform trace data adds
overhead to simulation and decreases its
performance
Verification Metrics
• Code Coverage
• % of the total code executed by the given test cases
• Functional Coverage
• % of the total functions exercised by the given test cases
Code Coverage Tools
• To expose bugs, you should exercise as
many paths as possible
• It shows which parts of the DUT are exercised
by the testbench, and hence how well the
DUT is verified
• To find new holes
• To measure progress against the test plan
• Bugs are often sensitive to branches and
conditions. For example, incorrectly
writing a condition such as i<=n rather
than i<n may cause a boundary-error
bug.
Types of Code Coverage
• Statement/Line Coverage
• Block Coverage
• Branch/Decision Coverage
• Condition/Expression Coverage
• Toggle Coverage
• FSM Coverage
Statement / Line Coverage
• An indication of how many statements (lines) are covered in
the simulation, excluding lines like module, endmodule,
comments, timescale, etc.
• This is important in all kinds of designs and has to be 100%
for verification closure.
• Statement coverage includes procedural statements
Block Coverage
• A group of statements inside a begin-end,
if-else, case, wait, while loop, for loop, etc.
is called a block.
• Dead code in the design can be found
by analyzing block coverage.
Branch / Decision Coverage
• In branch (decision) coverage reports,
conditions like if-else, case and the
ternary operator (?:) statements are
evaluated in both their true and false cases.
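For example, full branch coverage of the small block below (signal names invented for illustration) requires stimuli that make the if condition both true and false and reach every case arm, including the default:

```verilog
always @(posedge clk) begin
  if (en)                 // branch coverage needs both en==1 and en==0
    q <= d;
  else
    q <= 16'h0;
  case (mode)             // mode must take 0, 1 and some other value
    2'b00:   y <= a;
    2'b01:   y <= b;
    default: y <= 16'h0;  // the default arm must also be hit
  endcase
end
```

A directed test that only ever enables the block would leave the else branch and default arm uncovered.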
Condition / Expression
Coverage
• This indicates how well variables and expressions
(with logical operators) in conditional statements are
evaluated.
• Condition coverage is the ratio of the number of cases
evaluated to the total number of cases present.
• If an expression contains Boolean operators such as XOR,
AND or OR, expression coverage indicates which of the
possible input combinations were actually applied to it.
Toggle Coverage
• Toggle coverage reports how many times
signals and ports toggle during a
simulation run.
• It also measures activity in the design,
exposing unused signals and signals that
remain constant or change value rarely.
State / FSM Coverage
• FSM coverage reports whether the
simulation run reached all of the states
and covered all possible transitions (arcs) in
a given state machine.
• This is a complex coverage type because it
works on the behaviour of the design: it
interprets the synthesis semantics of the
HDL design and monitors the coverage of
the FSM representation of control-logic
blocks.
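A toy FSM makes the metric concrete; the state and signal names below are invented for illustration. FSM coverage asks whether simulation visited IDLE, RUN and DONE and took every arc, not just the common path:

```verilog
// Three-state FSM; FSM coverage tracks states visited and arcs taken.
localparam IDLE = 2'd0, RUN = 2'd1, DONE = 2'd2;
reg [1:0] state;
always @(posedge clk)
  if (rst)               state <= IDLE;
  else case (state)
    IDLE: if (start)     state <= RUN;    // arc IDLE->RUN
    RUN:  if (abort)     state <= IDLE;   // arc RUN->IDLE (easily missed)
          else if (done) state <= DONE;   // arc RUN->DONE
    DONE:                state <= IDLE;   // arc DONE->IDLE
  endcase
```

A test suite that never asserts abort would report 100% state coverage but incomplete arc coverage.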
Limitations of Code Coverage
• 100% code coverage is difficult to achieve
• Furthermore, 100% code coverage does not
prove that a design is functionally correct!
Functional Coverage
• Code coverage measures how much of
the implementation has been exercised
• Functional coverage measures how much
of the original design specification has
been exercised
• The specification is the reference
• List all functions as a list of items
• Check that each item of the list is
encountered
• Goal: 100% functional coverage
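In SystemVerilog, the items on such a list are typically written as a covergroup; this is only a sketch, and the names (instr_cg, opcode, mode, ADD, SUB, LD, ST) are invented for illustration:

```verilog
// Sketch: functional coverage of an instruction stream (illustrative names).
covergroup instr_cg @(posedge clk);
  cp_op:   coverpoint opcode {
    bins alu = {ADD, SUB};   // arithmetic instructions seen at least once
    bins mem = {LD, ST};     // memory instructions seen at least once
  }
  cp_mode: coverpoint mode;  // every operating mode exercised
  cross cp_op, cp_mode;      // every opcode in every mode
endgroup

instr_cg cg = new();         // instantiate so sampling begins
```

The simulator reports which bins and crosses were hit, measuring progress against the specification rather than the implementation.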
Code Coverage vs. Functional Coverage
Bug Tracking System (BTS)
• When a bug is found by a verification
engineer, it is reported (logged) into the BTS
• The BTS sends a notification to the designer
• Stages:
• Open
• When the bug is filed
• Verified
• When the designer confirms that it is a bug!
• Fixed
• When it is removed from the design
• Closed
• When everything else works fine with the new fix
Regression and Revision
Control
• Regression
• Return to a normal state.
• New features + bug fixes are made available to the
team.
• Revision Control
• When multiple users access the same data,
data loss may result,
• e.g. trying to write to the same file simultaneously.
• Prevents conflicting simultaneous writes.
Hardware Modeler
• You can buy IP models for standard verification
• It is cheaper to buy them than to write them yourself
• Your own model is not as reliable as the one you buy
• What if you cannot find a model to buy?
Verification Languages

Hardware Description Languages
• VHDL, Verilog
• Concurrent mechanisms for controlling traffic
streams to device input ports, and for checking
outstanding transactions at the output ports
• But not suitable for building a complex
verification environment

Software Languages
• C, C++
• Suitable for building a complex environment
• But no built-in constructs for modeling
hardware concepts such as concurrency,
operating in simulation time, or manipulating
vectors of various bit widths
Hardware Verification
Languages
• Why verification languages?
• Raised abstraction level, hence higher productivity
• Can automate verification
• Commercial
• e from Verisity
• OpenVera from Synopsys
• RAVE from Forte
• Public domain or open source
• SystemC from Cadence
• Jeda from Juniper Networks
System Verilog:
Hardware Description and Verification Language
Cost of Verification
• What if your testbench itself is buggy?
• Should the testbench be verified? How?
Outcome quadrant: a good design that fails verification is a false negative (Type I error); a bad design that passes verification is a false positive (Type II error).
How to reduce verification
time and effort?
• Verification is a bottleneck in a project's time-to-
profit goal, so verification is the target of new
tools and methodologies.
• All these tools and methodologies attempt to
reduce verification effort and time through:
1. Parallelism of effort
2. Higher abstraction level
3. Automation
• Some newer concepts are:
1. Design for verification
2. Verification of a reusable design
3. Verification reuse (Verification IP – VIP)
Parallelism of Efforts
• Additional resources, applied effectively,
reduce the total verification effort
• e.g. to dig a hole faster, use more workers
armed with shovels
• The goal is to write and debug testbenches
in parallel with each other as well as in
parallel with the design implementation.
Higher Level of Abstraction
• Enables working more efficiently without worrying
about low-level details.
• Reduction in control
• Requires additional training to understand the
abstraction mechanism and how the desired effect is
produced.
• e.g. working at the transaction level or bus-cycle level
instead of dealing with ones and zeroes.
Automation
• A machine completes the task autonomously
• Faster
• Predictable results
• It requires well-defined inputs and a standard process.
• When a variety of work exists, automation is difficult.
• The variety of functions, interfaces, protocols and transformations
makes automation in verification difficult.
• Tools automate various parts of the verification process, but not
the complete process.
• Randomization of input generation is one way to automate the
verification process.
Design for Verification
• It is reasonable to require additional design
effort to simplify verification.
• The architect of the design should answer not only
the question
"what is this supposed to do?"
• but also
"how is this thing going to be verified?"
• It includes:
• Well-defined interfaces
• Clear separation of functions into relatively
independent units
• Providing additional software-accessible
registers to control and observe internal
locations
Verification Reuse
• Improving verification productivity is an
economic necessity, and verification reuse
directly addresses higher productivity.
• A bus-functional model used to verify a
design block can be reused to verify the
system that uses that block.
• All components should be built and packaged
uniformly.
• Verification reuse has its challenges: at the
component level, reusing test cases or
testbenches is a simpler task, but reusing
a testbench component across different
projects, or between two different levels of
abstraction, is much harder.
Verification of Reusable
Design
• Design reuse is problematic because
"reuse is about trust".
• Only functional verification metrics can give
that trust to the design reuser.
• A reusable design should be verified to a
greater degree of confidence than a custom
design.
• Reusable designs need to be verified for all
possible future configurations and possible
uses.
Some Terminology….
• When is testing performed?
• As a separate activity – off-line testing
• Concurrent with normal system operation – on-line testing
• Where is the source of stimuli?
• Within the system itself – self-testing
• Applied by an external device/tester – external testing
• What do we test for?
• Design errors – Verification
• Fabrication errors – Acceptance testing
• Fabrication defects – Burn-in
• Infancy physical failures – Quality-assurance testing
• Physical failures – Field testing / maintenance testing
Terminology……
• How are the stimuli and expected responses
produced?
• Retrieved from storage – stored-pattern testing
• Generated during testing – algorithmic testing (stimuli),
comparison testing (response)
• How are the stimuli applied?
• In a fixed order
• Depending upon the results obtained so far – adaptive
testing
• How fast are the stimuli applied?
• Much slower than normal speed – DC/static testing
• At normal operating speed – AC/at-speed testing
Some terminology….
• What are the observed results?
• The entire output pattern
• Some function of the output pattern – compact/signature
testing
• Which lines are accessible for testing?
• Only the I/O lines – edge-pin testing
• I/O and internal lines – guided-probe testing, bed-of-nails
testing, electron-beam testing, in-circuit emulation, in-
circuit testing (the tester automatically isolates the IC
already mounted on the board)
• Who checks the results?
• The system itself – self-testing/checking
• An external device/checker – external testing
Thanks……

More Related Content

What's hot (20)

PDF
6 verification tools
Usha Mehta
 
PDF
Introduction of testing and verification of vlsi design
Usha Mehta
 
PDF
2019 5 testing and verification of vlsi design_fault_modeling
Usha Mehta
 
PDF
Automatic Test Pattern Generation (Testing of VLSI Design)
Usha Mehta
 
PDF
5 verification methods
Usha Mehta
 
PDF
Fault Simulation (Testing of VLSI Design)
Usha Mehta
 
PPTX
SOC Verification using SystemVerilog
Ramdas Mozhikunnath
 
PDF
ATPG Methods and Algorithms
Deiptii Das
 
PDF
10 static timing_analysis_1_concept_of_timing_analysis
Usha Mehta
 
PPTX
Scan insertion
kumar gavanurmath
 
PDF
CPU Verification
Ramdas Mozhikunnath
 
PDF
12 static timing_analysis_3_clocked_design
Usha Mehta
 
PDF
2019 1 testing and verification of vlsi design_introduction
Usha Mehta
 
PPTX
Dft (design for testability)
shaik sharief
 
PDF
BUilt-In-Self-Test for VLSI Design
Usha Mehta
 
PDF
Verification flow and_planning_vlsi_design
Usha Mehta
 
PDF
UVM Methodology Tutorial
Arrow Devices
 
PDF
Design-for-Test (Testing of VLSI Design)
Usha Mehta
 
PDF
Design for Testability
kumar gavanurmath
 
PDF
11 static timing_analysis_2_combinational_design
Usha Mehta
 
6 verification tools
Usha Mehta
 
Introduction of testing and verification of vlsi design
Usha Mehta
 
2019 5 testing and verification of vlsi design_fault_modeling
Usha Mehta
 
Automatic Test Pattern Generation (Testing of VLSI Design)
Usha Mehta
 
5 verification methods
Usha Mehta
 
Fault Simulation (Testing of VLSI Design)
Usha Mehta
 
SOC Verification using SystemVerilog
Ramdas Mozhikunnath
 
ATPG Methods and Algorithms
Deiptii Das
 
10 static timing_analysis_1_concept_of_timing_analysis
Usha Mehta
 
Scan insertion
kumar gavanurmath
 
CPU Verification
Ramdas Mozhikunnath
 
12 static timing_analysis_3_clocked_design
Usha Mehta
 
2019 1 testing and verification of vlsi design_introduction
Usha Mehta
 
Dft (design for testability)
shaik sharief
 
BUilt-In-Self-Test for VLSI Design
Usha Mehta
 
Verification flow and_planning_vlsi_design
Usha Mehta
 
UVM Methodology Tutorial
Arrow Devices
 
Design-for-Test (Testing of VLSI Design)
Usha Mehta
 
Design for Testability
kumar gavanurmath
 
11 static timing_analysis_2_combinational_design
Usha Mehta
 

Similar to 2019 2 testing and verification of vlsi design_verification (20)

PPTX
ASIC design verification
Gireesh Kallihal
 
PDF
verification_planning_systemverilog_uvm_2020
Sameh El-Ashry
 
PPTX
Ten query tuning techniques every SQL Server programmer should know
Kevin Kline
 
PPTX
class 3.pptx
KarthicaMarasamy
 
PPTX
Small is Beautiful- Fully Automate your Test Case Design
Georgina Tilby
 
PPT
Understanding printed board assembly using simulation with design of experime...
Kiran Hanjar
 
PDF
Getting started with RISC-V verification what's next after compliance testing
RISC-V International
 
PDF
Basics of Functional Verification - Arrow Devices
Arrow Devices
 
PPTX
Improving the Quality of Existing Software
Steven Smith
 
PPT
Testing of Object-Oriented Software
Praveen Penumathsa
 
PPTX
Connect Data Strategy Deep Dive - MAZ Workshop (1).pptx
joel804321
 
PDF
Making Model-Driven Verification Practical and Scalable: Experiences and Less...
Lionel Briand
 
PPTX
Arizona State University Test Lecture
Pete Sarson, PH.D
 
PDF
6 Steps to Implementing a World Class Testing Ecosystem Final
Eggplant
 
PPTX
ADLV UNIT 1 STUDENT .N (1) (1).pptx_____
Varunkulkarni63
 
PDF
Project Controls Expo - 31st Oct 2012 - Accurate Management Reports on 1me, e...
Project Controls Expo
 
PPTX
PROJECT.ppt (6).pptx
PraveenaModinipally
 
PDF
6 Top Tips to a Testing Strategy That Works
Eggplant
 
PPTX
Improving The Quality of Existing Software
Steven Smith
 
PPTX
Improving the Quality of Existing Software - DevIntersection April 2016
Steven Smith
 
ASIC design verification
Gireesh Kallihal
 
verification_planning_systemverilog_uvm_2020
Sameh El-Ashry
 
Ten query tuning techniques every SQL Server programmer should know
Kevin Kline
 
class 3.pptx
KarthicaMarasamy
 
Small is Beautiful- Fully Automate your Test Case Design
Georgina Tilby
 
Understanding printed board assembly using simulation with design of experime...
Kiran Hanjar
 
Getting started with RISC-V verification what's next after compliance testing
RISC-V International
 
Basics of Functional Verification - Arrow Devices
Arrow Devices
 
Improving the Quality of Existing Software
Steven Smith
 
Testing of Object-Oriented Software
Praveen Penumathsa
 
Connect Data Strategy Deep Dive - MAZ Workshop (1).pptx
joel804321
 
Making Model-Driven Verification Practical and Scalable: Experiences and Less...
Lionel Briand
 
Arizona State University Test Lecture
Pete Sarson, PH.D
 
6 Steps to Implementing a World Class Testing Ecosystem Final
Eggplant
 
ADLV UNIT 1 STUDENT .N (1) (1).pptx_____
Varunkulkarni63
 
Project Controls Expo - 31st Oct 2012 - Accurate Management Reports on 1me, e...
Project Controls Expo
 
PROJECT.ppt (6).pptx
PraveenaModinipally
 
6 Top Tips to a Testing Strategy That Works
Eggplant
 
Improving The Quality of Existing Software
Steven Smith
 
Improving the Quality of Existing Software - DevIntersection April 2016
Steven Smith
 
Ad

More from Usha Mehta (19)

PDF
Basic Design Flow for Field Programmable Gate Arrays
Usha Mehta
 
PDF
Field Programmable Gate Arrays : Architecture
Usha Mehta
 
PDF
Programmable Logic Devices : SPLD and CPLD
Usha Mehta
 
PDF
Programmable Switches for Programmable Logic Devices
Usha Mehta
 
PDF
2_DVD_ASIC_Design_FLow.pdf
Usha Mehta
 
PDF
3_DVD_IC_Fabrication_Flow_designer_perspective.pdf
Usha Mehta
 
PDF
7_DVD_Combinational_MOS_Logic_Circuits.pdf
Usha Mehta
 
PDF
5_DVD_VLSI Technology Trends.pdf
Usha Mehta
 
PDF
8_DVD_Sequential_MOS_logic_circuits.pdf
Usha Mehta
 
PDF
9_DVD_Dynamic_logic_circuits.pdf
Usha Mehta
 
PDF
13_DVD_Latch-up_prevention.pdf
Usha Mehta
 
PDF
Static_Timing_Analysis_in_detail.pdf
Usha Mehta
 
PDF
9 semiconductor memory
Usha Mehta
 
PDF
13 static timing_analysis_4_set_up_and_hold_time_violation_remedy
Usha Mehta
 
PDF
4 verification flow_planning
Usha Mehta
 
PDF
3 test economic_test_equipments_yield
Usha Mehta
 
PDF
2 when to_test_role_of_testing
Usha Mehta
 
PDF
1 why to_test
Usha Mehta
 
PDF
1 why to_test
Usha Mehta
 
Basic Design Flow for Field Programmable Gate Arrays
Usha Mehta
 
Field Programmable Gate Arrays : Architecture
Usha Mehta
 
Programmable Logic Devices : SPLD and CPLD
Usha Mehta
 
Programmable Switches for Programmable Logic Devices
Usha Mehta
 
2_DVD_ASIC_Design_FLow.pdf
Usha Mehta
 
3_DVD_IC_Fabrication_Flow_designer_perspective.pdf
Usha Mehta
 
7_DVD_Combinational_MOS_Logic_Circuits.pdf
Usha Mehta
 
5_DVD_VLSI Technology Trends.pdf
Usha Mehta
 
8_DVD_Sequential_MOS_logic_circuits.pdf
Usha Mehta
 
9_DVD_Dynamic_logic_circuits.pdf
Usha Mehta
 
13_DVD_Latch-up_prevention.pdf
Usha Mehta
 
2019 2 testing and verification of vlsi design_verification

  • 1. Introduction to Verification of VLSI Design and Functional Verification 1 DrUshaMehta02-08-2019 Dr Usha Mehta [email protected] [email protected]
  • 2. Acknowledgement….. This presentation has been summarized from various books, papers, websites and presentations on VLSI Design and its various topics from all over the world. I could not mention item-wise where this large pool of hints and work comes from. However, I would like to thank all the professors and scientists who created such good work in this emerging field. Without their efforts in this emerging technology, these notes and slides could not have been finished. 2 DrUshaMehta02-08-2019
  • 8. Source of Errors • Errors in Specification • Unspecified functionality • Conflicting requirements • Unrealized features • No model for checking, as it is at the top of the abstraction hierarchy • Errors in Implementation • Human error in interpreting design functionality 8 DrUshaMehta02-08-2019
  • 9. How to reduce human introduced errors in interpretation? • Automation • Poka-Yoke 9 DrUshaMehta02-08-2019
  • 10. • Automation • The obvious way to reduce human-introduced errors • It is not always possible, especially when the processes are not well defined and require human ingenuity and creativity. • Poka-Yoke • A Japanese term that means "mistake-proofing" or "inadvertent error prevention" • A step toward full automation, but not complete automation • Human intervention is needed only to decide on the particular sequence or steps required to obtain the desired results. • Verification nowadays remains an art. 10 DrUshaMehta02-08-2019
  • 11. Redundancy • Most costly but highly efficient approach • Most widely used for ASICs 11 DrUshaMehta02-08-2019
  • 12. Reconvergence Model It consists of the following steps: 1. Creating the design at a higher level of abstraction 2. Verifying the design at that level of abstraction 3. Translating the design to a lower level of abstraction 4. Verifying the consistency between steps 1 and 3 5. Steps 2, 3, and 4 are repeated until tapeout The transformation can be any process, such as: • RTL coding from a specification • Insertion of a scan chain • Synthesizing RTL code into a gate-level netlist • Synthesizing a gate-level netlist into a layout ….. 12 DrUshaMehta02-08-2019
  • 14. Verification Methods • Functional Verification • Formal Verification • Equivalence Checking • Model Checking • Semiformal Verification • Assertion Based Methods 14 DrUshaMehta02-08-2019
  • 15. Verification Techniques • Simulation (functional and timing) • Behavioral • RTL • Gate-level (pre-layout and post-layout) • Switch-level • Transistor-level • Formal Verification (functional) • Binary Decision Diagrams • Equivalence Checking • Model Checking • Static Timing Analysis (timing) 15 DrUshaMehta02-08-2019
  • 17. Functional Verification Approaches: Black Box Approach • Cannot look into the design • Functional verification is to be performed without any internal implementation knowledge • Through available interfaces only, no internal state access • Examples: • Check a multiplier by supplying random numbers to multiply • Check a braking system by hitting the brakes at different speeds 17 DrUshaMehta02-08-2019
  • 18. Black Box….. • Advantage • Independent of implementation • Verification process runs parallel with the design process • Less effort and time consumption • Disadvantage • Lack of visibility and controllability • Difficult to set up interesting states/combinations • Difficult to locate the source of a problem • Difficulty rises when there is a long delay between the occurrence of a problem and the appearance of its symptom 18 DrUshaMehta02-08-2019
  • 19. Functional Verification Approaches • White Box • Intimate knowledge and control of the internals of a design • This approach can ensure that implementation-specific features behave properly • A pure white box approach is used at the system level, where modules are treated as black boxes but the system itself is treated as a white box. • Grey Box • Black box test cases written with full knowledge of internal details. • Mostly written to increase code coverage 19 DrUshaMehta02-08-2019
  • 21. Test Bench • A testbench mimics the environment in which the design will reside. • It checks whether the RTL implementation meets the design spec or not. • This environment creates invalid and unexpected as well as valid and expected conditions to test the design. • It performs three functions: • Generate the stimulus for simulation • Apply this stimulus to the module under test and collect the output response • Compare the output response with expected golden values 21 DrUshaMehta02-08-2019
  • 22. Test Bench Architecture (diagram: an input generator drives the Design Under Verification; a comparator checks the DUV output against a golden output generator and reports Pass/Fail) 22 DrUshaMehta02-08-2019
  • 23. Input Generation • Repetitive Input Generator • Using specific syntax/code • Using counter/LFSR etc • Directed Input Generation • By specifically writing the input pattern • Using text file • Very lengthy and time consuming method • Very narrow but focused coverage • Random Input Generation • Using specific syntax/code • Very simple and speedy process • Very broad but shallow coverage 23 DrUshaMehta02-08-2019
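As a sketch of the counter/LFSR style of repetitive input generation mentioned above (the width, seed, and tap positions are assumptions, not from the slides):

```verilog
// 4-bit maximal-length LFSR stimulus generator (taps x^4 + x^3 + 1).
// Cycles through all 15 non-zero patterns, giving cheap repetitive inputs.
module lfsr4(input clk, input rst, output reg [3:0] q);
  always @(posedge clk)
    if (rst) q <= 4'b0001;                   // any non-zero seed works
    else     q <= {q[2:0], q[3] ^ q[2]};     // shift left, XOR feedback in
endmodule
```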
  • 24. Repetitive Waveform Generator

// Reset pulse sequence
initial begin
  reset = 0;
  #100 reset = 1;
  #80  reset = 0;
  #30  reset = 1;
end

// Clock using an always block
module check_clock(my_clk2);
  output my_clk2;
  reg my_clk2;
  parameter tp = 10;
  initial my_clk2 = 0;
  always #tp my_clk2 = ~my_clk2;
endmodule

// Clock using a nor-gate feedback loop
module my_clock(my_clk);
  output my_clk;
  reg start;
  initial begin
    start = 1;
    #5 start = 0;
  end
  nor #2 (my_clk, start, my_clk);
endmodule

24 DrUshaMehta02-08-2019
  • 25. Directed Input Generator

module adder(a,b,c); //DUT code start
  input [15:0] a;
  input [15:0] b;
  output [16:0] c;
  assign c = a + b;
endmodule //DUT code end

module top(); //TestBench code start
  reg [15:0] a;
  reg [15:0] b;
  wire [16:0] c;
  adder DUT(a,b,c); //DUT Instantiation
  initial begin
    a = 16'h45; //apply the stimulus
    b = 16'h12;
    #10 $display(" a=%0d,b=%0d,c=%0d",a,b,c); //send the output to terminal for visual inspection
  end
endmodule //TestBench code end

25 DrUshaMehta02-08-2019
  • 26. Directed Input Generation • Using text file containing inputs and expected outputs 26 DrUshaMehta02-08-2019
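One possible sketch of this file-driven style, using Verilog's $readmemh to load both stimulus and expected outputs; the file name, vector packing, and vector count are assumptions:

```verilog
// Testbench sketch: "vectors.txt" holds one hex word per line, packing
// {a, b, expected_c} into 49 bits (16 + 16 + 17). Reuses the adder DUT
// from the earlier slide.
module tb_file_driven();
  reg  [15:0] a, b;
  wire [16:0] c;
  reg  [16:0] expected;
  reg  [48:0] vectors [0:99];   // up to 100 packed test vectors
  integer i;

  adder DUT(a, b, c);

  initial begin
    $readmemh("vectors.txt", vectors);
    for (i = 0; i < 100; i = i + 1) begin
      {a, b, expected} = vectors[i];
      #10 if (c !== expected)
        $display("*ERROR* a=%0d b=%0d c=%0d expected=%0d", a, b, c, expected);
    end
  end
endmodule
```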
  • 27. Random Input Generation

module adder(a,b,c); //DUT code start
  input [15:0] a,b;
  output [16:0] c;
  assign c = a + b;
endmodule //DUT code end

module top(); //TestBench code start
  reg [15:0] at;
  reg [15:0] bt;
  wire [16:0] ct;
  adder DUT(at,bt,ct); //DUT Instantiation
  initial repeat(100) begin
    at = $random; //apply random stimulus
    bt = $random;
    #10 $display(" a=%0d,b=%0d,c=%0d",at,bt,ct);
  end
endmodule //TestBench code end

27 DrUshaMehta02-08-2019
  • 29. How to check the results… • Use waveform viewers for debugging designs, not for the testbench. • Most of the operations in a testbench execute in zero time, where a waveform viewer will not be helpful. • Check in the message window • Store in the log file 29 DrUshaMehta02-08-2019
  • 30. Writing outputs in test file 30 DrUshaMehta02-08-2019
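A minimal sketch of storing results in a log file with $fopen/$fdisplay (the file name is an assumption), reusing the adder DUT from the earlier slides:

```verilog
// Log simulation results to a file instead of the terminal.
module tb_logging();
  reg  [15:0] a, b;
  wire [16:0] c;
  integer log;                     // file handle

  adder DUT(a, b, c);

  initial begin
    log = $fopen("results.log");   // open the log file
    a = 16'h45; b = 16'h12;
    #10 $fdisplay(log, "a=%0d b=%0d c=%0d", a, b, c);
    $fclose(log);                  // flush and close before finishing
  end
endmodule
```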
  • 31. Self Checking Test benches

module top(); //TB code start
  reg [15:0] a;
  reg [15:0] b;
  wire [16:0] c;
  adder DUT(a,b,c); //DUT Instantiation
  initial repeat(100) begin
    a = $random; //apply random stimulus
    b = $random;
    #10 $display(" a=%0d,b=%0d,c=%0d",a,b,c);
    if( a + b != c) // monitor logic
      $display(" *ERROR* ");
  end
endmodule //TB code end

31 DrUshaMehta02-08-2019
  • 32. Simulation Based Functional Verification Flow 32 DrUshaMehta02-08-2019
  • 33. Limitations of Functional Verification • Large numbers of simulation vectors are needed to provide confidence that the design meets the required specifications. • Logic simulators must process more events for each stimulus vector because of increased design size and complexity. • More vectors and larger design sizes cause increased memory swapping, slowing down performance • Once the behavioural design is verified, there are many requirements for small non-functional modifications in RTL. • Ideally, after each such modification, there must be a round of verification, which is not practical. 33 DrUshaMehta02-08-2019
  • 34. Examples of Non-Functional Changes in RTL of Design • Adding clock gating circuitry for power reduction • Restructuring critical paths • Reorganizing logic for area reduction • Adding test logic (scan circuitry) to a design • Reordering a scan chain in a design • Inserting a clock tree into a design • Adding I/O pads to the netlist • Performing design layout • Performing flattening and cell sizing 34 DrUshaMehta02-08-2019
  • 35. Formal Verification Methods • Technique to prove or disprove the functional equivalence of two designs. • The techniques used are static and do not require simulation vectors. • You only need to provide a functionally correct, or "golden", design (called the reference design), and a modified version of the design (called the implementation). • By comparing the implementation against the reference design, you can determine whether the implementation is functionally equivalent to the reference design • Methods • Equivalence Checking • Model Checking 35 DrUshaMehta02-08-2019
  • 37. Linting • It finds common programmer mistakes • It allows the programmer to find mistakes quickly and efficiently very early, instead of waiting at the end for the full program to fail • Checks for static errors, potential errors, and coding style guideline violations. • Static errors: errors that do not require input vectors. • E.g. • A bus without a driver, • mismatch of port width between module definition and instantiation, • dangling input of a gate. 37 DrUshaMehta02-08-2019
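The static errors listed above can be illustrated with a deliberately faulty sketch that a lint tool would flag (the module and signal names are made up for illustration):

```verilog
// Deliberately lint-dirty code: each issue below matches one of the
// static errors named on the slide.
module small(input x, input y, output z);
  assign z = x & y;
endmodule

module lint_demo(input [7:0] a, output [7:0] out);
  wire [7:0] bus;                 // bus with no driver -> lint warning
  wire z1, z2;
  wire [3:0] narrow;

  small u1(.x(a[0]), .z(z1));     // input y left dangling -> lint warning

  small u2(.x(narrow),            // 4-bit net on a 1-bit port:
           .y(a[1]),              // port width mismatch -> lint warning
           .z(z2));

  assign out = bus;               // propagates the undriven bus (all x)
endmodule
```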
  • 38. Simulator • Most common and familiar verification tool. • Its role is limited to approximating reality. • Simulators attempt to create an artificial universe that mimics the future real design. • This lets the designer interact with the design before it is manufactured and correct flaws and problems earlier. • Functional correctness and accuracy is a big issue, as errors cannot be proven not to exist • A simulator makes a computing model of the circuit, executes the model for a set of input signals (stimuli, patterns, or vectors), and verifies the output signals. • Limitations of simulation • Timing issues with the simulator. • The simulator can never mimic the real signals, where actual electrons flow at the speed of light. • Can't be exhaustive for non-trivial designs • Performance bottleneck 38 DrUshaMehta02-08-2019
  • 39. Simulators at different abstraction levels • System level – everything: electrical, mechanical, optical etc. • Behavioral level – algorithm or data flow graph by HDL • Instruction set level – for CPU • Register Transfer level + combinational level • Gate level – gate as a basic element • Switch level – transistor as a switch • Circuit level – current and voltage parameters • Device level – fabrication parameters • Timing simulation – timing model • Fault simulation – checks a test vector for faults 39 DrUshaMehta02-08-2019
  • 40. RTL Level Simulators Type: Event Driven • Event: change in logic value at a node, at a certain instant of time → (V, T) • Performs both timing and functional verification – All nodes are visible – Glitches are detected • Most heavily used and well-suited for all types of designs • Uses a timewheel to manage the relationship between components • Timewheel = list of all events not processed yet, sorted in time (complete ordering) • When an event is generated, it is put at the appropriate point in the timewheel to ensure causality 40 DrUshaMehta02-08-2019
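A small sketch of the kind of glitch only a delay-aware, event-driven simulation exposes (the gate delays are assumptions): y = a AND (NOT a) should always be 0, but the slower inverter lets y pulse high briefly after a rises.

```verilog
// Event-driven glitch demo: unequal path delays create a hazard on y.
module glitch_demo();
  reg  a;
  wire na, y;

  not #3 inv(na, a);      // inverter with 3-unit delay
  and #1 g(y, a, na);     // AND with 1-unit delay

  initial begin
    a = 0;
    #10 a = 1;            // na stays 1 until t=13, so y pulses 1 at t=11..14
    #10 $finish;
  end

  // Every value change is an event the simulator schedules on its timewheel.
  initial $monitor("t=%0t a=%b na=%b y=%b", $time, a, na, y);
endmodule
```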
  • 41. RTL Level Simulators Type: Cycle Based • Take advantage of the fact that most digital designs are largely synchronous (state elements change value on the active edge of the clock) • Compute steady-state response of the circuit – at each clock cycle – at each boundary node • Only boundary nodes are evaluated (diagram: combinational logic between two banks of latches, with internal nodes inside and boundary nodes at the latches) 41 DrUshaMehta02-08-2019
  • 42. Comparison: Event Driven vs. Cycle Based • Cycle-based is 10x-100x faster than event-driven (and less memory usage) • Cycle-based does not detect glitches and setup/hold time violations, while event-driven does. • Cycle-based: – Only boundary nodes – No delay information • Event-driven: – Each internal node – Needs scheduling, and functions may be evaluated multiple times 42 DrUshaMehta02-08-2019
  • 43. Common Simulators used in Industry… • NC-Sim • Verilog-XL • VCS • Modelsim • More….. 43 DrUshaMehta02-08-2019
  • 44. Co-Simulators…. • VHDL-Verilog • Analog-Digital • Hardware-Software… • Performance is reduced by communication and synchronization overhead. • Translating events and values from one simulator to the other can create ambiguities 44 DrUshaMehta02-08-2019
  • 45. Waveform Viewer • It can play back the events that occurred during the simulation that were recorded in some trace file • Recording waveform trace data is an overhead on simulation and decreases its performance 45 DrUshaMehta02-08-2019
  • 46. Verification Metrics • Code Coverage • % of total code executed by given test cases • Functional Coverage • % of total functions exercised by given test cases 46 DrUshaMehta02-08-2019
  • 47. Code Coverage Tools • To expose bugs, you should exercise as many paths as possible • It shows which part of the DUT is exercised by the testbench, so it shows how well the DUT is verified. • To find new holes • To measure the progress in the test plan • Bugs are often sensitive to branches and conditions. For example, incorrectly writing a condition such as i<=n rather than i<n may cause a boundary error bug. 47 DrUshaMehta02-08-2019
  • 48. Types of Code Coverage • Statement/Line Coverage • Block Coverage • Branch/Decision Coverage • Condition/Expression Coverage • Toggle Coverage • FSM Coverage 48 DrUshaMehta02-08-2019
  • 49. Statement / Line Coverage • An indication of how many statements (lines) are covered in the simulation, by excluding lines like module, endmodule, comments, timescale etc. • This is important in all kinds of design and has to be 100% for verification closure. • Statement coverage includes procedural statements 49 DrUshaMehta02-08-2019
  • 50. Block Coverage • A group of statements inside a begin-end, if-else, case, wait, while loop, for loop etc. is called a block. • Dead code in the design can be found by analyzing block coverage. 50 DrUshaMehta02-08-2019
  • 51. Branch / Decision Coverage • In Branch coverage or Decision coverage reports, conditions like if-else, case and the ternary operator (?: ) statements are evaluated in both true and false cases. 51 DrUshaMehta02-08-2019
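A sketch of code whose branch/decision coverage requires both outcomes of each decision to be exercised (the design itself is made up for illustration):

```verilog
// Branch coverage demo: each decision point below contributes two
// branches that a testbench must hit for 100% branch coverage.
module branch_demo(input clk, input [3:0] x, output reg [3:0] y);
  always @(posedge clk) begin
    if (x > 4'd8)          // need both x > 8 and x <= 8 stimulus
      y <= x - 4'd1;
    else
      y <= x + 4'd1;
  end

  wire        parity = ^x;
  wire [3:0]  sel = parity ? 4'hF : 4'h0;  // ternary: both arms count
endmodule
```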
  • 52. Condition / Expression Coverage • This gives an indication of how well variables and expressions (with logical operators) in conditional statements are evaluated. • Condition coverage is the ratio of the number of cases evaluated to the total number of cases present. • If an expression has Boolean operations like XOR, AND, OR, expression coverage indicates which of the possible input combinations to that expression, out of all possibilities, were exercised. 52 DrUshaMehta02-08-2019
  • 53. Toggle Coverage • Toggle coverage reports how many times signals and ports toggled during a simulation run. • It also measures activity in the design, exposing unused signals and signals that remain constant or rarely change value. 53 DrUshaMehta02-08-2019
  • 54. State / FSM Coverage • FSM coverage reports whether the simulation run could reach all of the states and cover all possible transitions or arcs in a given state machine. • This is a complex coverage type, as it works on the behaviour of the design; that means it interprets the synthesis semantics of the HDL design and monitors the coverage of the FSM representation of control logic blocks. 54 DrUshaMehta02-08-2019
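A sketch of a small FSM on which a coverage tool would track states and arcs (the states and conditions are made up for illustration); full FSM coverage needs every state visited and every arc, including the self-loops, taken:

```verilog
// 3-state FSM: FSM coverage tracks states {IDLE, RUN, DONE} and the
// arcs IDLE->RUN, RUN->DONE, DONE->IDLE plus the waiting self-loops.
module fsm_demo(input clk, input rst, input go, input done,
                output reg busy);
  localparam IDLE = 2'd0, RUN = 2'd1, DONE = 2'd2;
  reg [1:0] state;

  always @(posedge clk) begin
    if (rst) state <= IDLE;
    else case (state)
      IDLE:    if (go)   state <= RUN;
      RUN:     if (done) state <= DONE;
      DONE:               state <= IDLE;
      default:            state <= IDLE;
    endcase
  end

  always @* busy = (state == RUN);
endmodule
```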
  • 55. Limitations of Code Coverage • 100% code coverage is difficult to achieve • Further, 100% Code coverage does not prove that a design is functionally correct! 55 DrUshaMehta02-08-2019
  • 56. Functional Coverage • Code coverage measures how much of the implementation has been exercised • Functional coverage measures how much of the original design specification has been exercised • Specification as reference. • List all functions as a list of items • Check that each item of the list is encountered. • Goal: 100% Functional Coverage 56 DrUshaMehta02-08-2019
  • 57. Code Coverage v/s Functional Coverage 57 DrUshaMehta02-08-2019
  • 58. Bug Tracking System (BTS) • When a bug is found by a verification engineer, it is reported (logged) into the BTS • It sends a notification to the designer • Stages: • Open • When it is filed • Verified • When the designer confirms that it is a bug! • Fixed • When it is removed from the design • Closed • When everything else works fine with the new fix 58 DrUshaMehta02-08-2019
  • 59. Regression and Revision Control • Regression • Return to the normal state. • New features + bug fixes are made available to the team. • Revision Control • When multiple users access the same data, data loss may result. • e.g. trying to write to the same file simultaneously. • Prevent multiple writes. 59 DrUshaMehta02-08-2019
  • 60. Hardware Modeler • You can buy IP for standard verification • It is cheaper to buy models than to write them yourself • Your own model is not as reliable as the one you buy • What if you cannot find a model to buy? 60 DrUshaMehta02-08-2019
  • 61. Verification Language Hardware Description Languages • VHDL, Verilog • Concurrent mechanisms for controlling traffic streams to device input ports, and for checking outstanding transactions at the output ports • but not suitable for building a complex verification environment Software Languages • C, C++ • Suitable for building a complex environment • but no built-in constructs for modeling hardware concepts such as concurrency, operating in simulation time, or manipulating vectors of various bit widths. 61 DrUshaMehta02-08-2019
  • 62. Hardware Verification Languages • Why verification languages? • Raise the abstraction level, and hence productivity • Can automate verification • Commercial • e from Verisity • OpenVera from Synopsys • RAVE from Forte • Public domain or open source • SystemC from Cadence • Jeda from Juniper Networks 62 DrUshaMehta02-08-2019
  • 63. System Verilog: Hardware Description and Verification Language 63 DrUshaMehta02-08-2019
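To illustrate the kind of verification constructs SystemVerilog adds on top of Verilog, here is a hedged sketch of a constrained-random transaction class with a functional-coverage group for the adder testbench from the earlier slides (the class, constraint, and bin names are assumptions, not from the slides):

```systemverilog
// Constrained-random stimulus plus functional coverage in SystemVerilog.
class adder_txn;
  rand bit [15:0] a, b;

  // Bias generation toward the corner values 0 and FFFF.
  constraint c_corner {
    a dist {16'h0000 := 1, 16'hFFFF := 1, [16'h0001:16'hFFFE] :/ 8};
  }

  covergroup cg;
    cp_a: coverpoint a {
      bins zero  = {16'h0000};
      bins max_v = {16'hFFFF};
      bins mid   = default;
    }
  endgroup

  function new();
    cg = new();          // covergroup instantiated inside the class
  endfunction
endclass

module tb;
  adder_txn t = new();
  initial begin
    repeat (10) begin
      void'(t.randomize());  // generate a constrained-random transaction
      t.cg.sample();         // record functional coverage
    end
  end
endmodule
```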
  • 64. Cost of Verification • What if your testbench itself is buggy? • Should the testbench be verified? How? • Type I error (false negative): a good design fails verification • Type II error (false positive): a bad design passes verification 64 DrUshaMehta02-08-2019
  • 65. How to reduce verification time and efforts? • Verification is a bottleneck in a project's time-to-profit goal, so verification is the target of new tools and methodologies. • All these tools and methodologies attempt to reduce verification effort and time by 1. Parallelism of efforts 2. Higher abstraction level 3. Automation • Some new concepts are 1. Design for verification 2. Verification of a Reusable Design 3. Verification Reuse (Verification IP – VIP) 65 DrUshaMehta02-08-2019
  • 66. Parallelism of Efforts • Additional resources applied effectively to reduce the total verification effort • e.g. to dig a hole faster, use more workers armed with shovels • To be able to write and debug testbenches parallel to each other as well as parallel to the design implementation. 66 DrUshaMehta02-08-2019
  • 67. Higher Level of Abstraction • Enables working more efficiently without worrying about low-level details. • Reduction in control • Additional training to understand the abstraction mechanism and how the desired effect is produced. • To work at transaction levels or bus cycle levels instead of dealing with ones and zeroes. 67 DrUshaMehta02-08-2019
  • 68. Automation • A machine completes the task autonomously • Faster • Predictable results • It requires well-defined inputs and a standard process. • When variety of work exists, automation is difficult. • The variety of functions, interfaces, protocols and transformations makes automation in verification difficult. • Tools automate various parts of the verification process, but not the complete process. • Randomization of input generation is one way to automate the verification process. 68 DrUshaMehta02-08-2019
  • 69. Design for Verification • It is reasonable to require additional design effort to simplify verification. • Not only should the architect of the design answer the question "what is this supposed to do?" • but also "how is this thing going to be verified?" • It includes: • Well-defined interfaces • Clear separation of functions into relatively independent units • Providing additional software-accessible registers to control and observe internal locations 69 DrUshaMehta02-08-2019
  • 70. Verification Reuse • Improving verification productivity is an economic necessity. Verification reuse directly addresses higher productivity. • e.g. a bus functional model used to verify a design block can be reused to verify the system that uses that block. • All components should be built and packaged uniformly. • Verification reuse has its challenges. At the component level, reusing test cases or testbenches is a simpler task, but reusing a testbench component across different projects or between two different levels of abstraction is harder. 70 DrUshaMehta02-08-2019
  • 71. Verification of Reusable Design • It is proven that design reuse is problematic because "reuse is about trust". • Only functional verification metrics can give that trust to the design reuser. • A reusable design should be verified to a greater degree of confidence than custom designs • Reusable designs need to be verified for all possible future configurations and possible uses 71 DrUshaMehta02-08-2019
  • 72. Some Terminology…. • When is testing performed? • As a separate activity – off-line testing • Concurrent with normal system operation – on-line testing • Where is the source of stimuli? • Within the system itself – self testing • Applied by an external device/tester – external testing • What do we test for? • Design errors – Verification • Fabrication errors – Acceptance Testing • Fabrication defects – Burn-In • Infancy physical failures – Quality Assurance Testing • Physical failures – Field Testing / Maintenance Testing 72 DrUshaMehta02-08-2019
  • 73. Terminology…… • How are the stimuli and expected responses produced? • Received from storage – stored pattern testing • Generated during testing – algorithmic testing (stimuli), comparison testing (response) • How are the stimuli applied? • In a fixed order • Depending upon the results obtained so far – adaptive testing • How fast are the stimuli applied? • Much slower than the normal speed – DC/static testing • At normal operating speed – AC / at-speed testing 73 DrUshaMehta02-08-2019
  • 74. Some terminology…. • What are the observed results? • The entire output pattern • Some function of the output pattern – compact/signature testing • Which lines are accessible for testing? • Only the I/O lines – edge pin testing • I/O and internal lines – guided probe testing, bed-of-nails testing, electron beam testing, in-circuit emulation, in-circuit testing (the tester automatically isolates the IC already mounted on the board) • Who checks the results? • The system itself – self testing/checking • An external device/checker – external testing 74 DrUshaMehta02-08-2019