Software Testing - ISTQB Specialist
Performance Tester Exam Preparation
Chapter 3: Performance Testing in the Software Lifecycle
Neeraj Kumar Singh

1 Basic Concepts
2 Performance Measurements
3 Performance Testing in SDLC
4 Performance Testing Tasks
5 Tools
Contents
3.1 Principal Performance Testing Activities
3.2 Performance Risks for Different Architectures
3.3 Performance Risks Across the Software Development Lifecycle
3.4 Performance Testing Activities
3.1 Principal Performance Testing Activities
Performance testing is iterative in nature. Each test provides valuable insights into application and system
performance. The information gathered from one test is used to correct or optimize application and system
parameters. The next test iteration will then show the results of modifications, and so on until test objectives are
reached.
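This tune-and-retest cycle can be pictured as a simple loop. The sketch below (Python) is purely illustrative: the load test function is an invented stand-in that assumes each tuning step shaves roughly 20% off the measured 95th percentile.

import random

OBJECTIVE_MS = 500        # assumed objective: 95th percentile response time
MAX_ITERATIONS = 5
tuning_level = 0          # crude stand-in for accumulated optimizations

def run_load_test(level: int) -> float:
    """Hypothetical load test run returning a p95 in milliseconds."""
    return 900.0 * (0.8 ** level) * random.uniform(0.95, 1.05)

for iteration in range(1, MAX_ITERATIONS + 1):
    p95 = run_load_test(tuning_level)
    print(f"Iteration {iteration}: p95 = {p95:.0f} ms")
    if p95 <= OBJECTIVE_MS:
        print("Test objective reached.")
        break
    tuning_level += 1     # analyze results, tune parameters, test again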
Test Planning
Test planning is particularly important for performance testing due to the need for the allocation of test
environments, test data, tools and human resources. In addition, this is the activity in which the scope of
performance testing is established.
During test planning, risk identification and risk analysis activities are completed and relevant information is
updated in any test planning documentation (e.g., test plan, level test plan). Just as test planning is revisited and
modified as needed, so are risks, risk levels and risk status modified to reflect changes in risk conditions.
Test Monitoring and Control
Control measures are defined to provide action plans should issues be encountered which might impact performance efficiency, such as:
• increasing the load generation capacity if the infrastructure does not generate the desired loads as planned for particular performance tests
• changed, new or replaced hardware
• changes to network components
• changes to software implementation

The performance test objectives are evaluated to check for exit criteria achievement.
Test Analysis
Effective performance tests are based on an analysis of performance requirements, test objectives, Service Level
Agreements (SLA), IT architecture, process models and other items that comprise the test basis. This activity may
be supported by modeling and analysis of system resource requirements and/or behavior using spreadsheets or
capacity planning tools.
Specific test conditions are identified such as load levels, timing conditions, and transactions to be tested. The
required type(s) of performance test (e.g., load, stress, scalability) are then decided.
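For instance, target load levels can be derived from business volumes with simple capacity arithmetic, as in the sketch below; every figure in it is an assumption invented for illustration.

# Deriving test load levels from assumed business volumes.
orders_per_day = 120_000      # assumed daily business volume
peak_hour_share = 0.15        # assumed share of traffic in the peak hour
pages_per_order = 8           # assumed page requests per transaction
think_time_s = 20             # assumed think time between pages

peak_orders_per_hour = orders_per_day * peak_hour_share
requests_per_second = peak_orders_per_hour * pages_per_order / 3600

# Little's Law: concurrency = arrival rate x time each user stays active
session_time_s = pages_per_order * think_time_s
concurrent_users = (peak_orders_per_hour / 3600) * session_time_s

print(f"Peak load:        {requests_per_second:.1f} requests/s")
print(f"Concurrent users: {concurrent_users:.0f}")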
Test Design
Performance test cases are designed. These are generally created in modular form so that they may be used as the
building blocks of larger, more complex performance tests (see section 4.2).
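A minimal sketch of that modular idea, with hypothetical step names: each step is defined once and composed into larger scenario-level tests.

# Modular performance test design: reusable steps composed into scenarios.
def login(user: str) -> None:
    print(f"[step] login as {user}")

def search(term: str) -> None:
    print(f"[step] search for {term!r}")

def checkout() -> None:
    print("[step] checkout")

def browse_scenario(user: str) -> None:      # smaller building-block scenario
    login(user)
    search("laptop")

def purchase_scenario(user: str) -> None:    # larger scenario reusing steps
    login(user)
    search("laptop")
    checkout()

purchase_scenario("test_user_01")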
Test Implementation
In the implementation phase, performance test cases are ordered into performance test procedures. These
performance test procedures should reflect the steps normally taken by the user and other functional activities
that are to be covered during performance testing.
One test implementation activity is establishing and/or resetting the test environment before each test execution.
Since performance testing is typically data-driven, a process is needed to establish test data that is representative
of actual production data in volume and type so that production use can be simulated.
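One common approach, sketched below with invented field names and proportions, is to generate synthetic records whose volume and value mix approximate production data.

# Generating test data representative of production in volume and type.
import csv
import random

random.seed(42)
N_RECORDS = 100_000   # scaled to the assumed production volume

# Assumed production mix: 70% standard, 25% premium, 5% corporate accounts
ACCOUNT_TYPES = ["standard"] * 70 + ["premium"] * 25 + ["corporate"] * 5

with open("perf_test_accounts.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["account_id", "account_type", "order_count"])
    for i in range(N_RECORDS):
        writer.writerow([
            f"ACC{i:08d}",
            random.choice(ACCOUNT_TYPES),
            int(random.lognormvariate(1.5, 0.8)),  # skewed, like real order counts
        ])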
Test Execution
Test execution occurs when the performance test is conducted, often by using performance test tools. Test results
are evaluated to determine if the system’s performance meets the requirements and other stated objectives. Any
defects are reported.
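A minimal evaluation sketch, assuming collected response-time samples and an invented objective of a 95th percentile below 800 ms with an error rate below 1%:

# Evaluating raw results against stated performance objectives.
import statistics

response_times_ms = [220, 340, 310, 900, 450, 380, 1200, 300, 280, 410]
errors, total = 1, len(response_times_ms)

p95 = statistics.quantiles(response_times_ms, n=100, method="inclusive")[94]
error_rate = errors / total

print(f"p95 response time: {p95:.0f} ms (target < 800 ms)")
print(f"error rate:        {error_rate:.1%} (target < 1%)")

if p95 >= 800 or error_rate >= 0.01:
    print("FAIL: objectives not met - analyze and report defects")
else:
    print("PASS: performance objectives met")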
Test Completion
Performance test results are provided to the stakeholders (e.g., architects, managers, product owners) in a test
summary report. The results are expressed through metrics which are often aggregated to simplify the meaning of
the test results. Visual means of reporting such as dashboards are often used to express performance test results in
ways that are easier to understand than text-based metrics.
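As an illustration of such aggregation (transaction names and figures are invented), raw samples can be rolled up into per-transaction summary rows suitable for a report or dashboard:

# Aggregating raw measurements into summary metrics for reporting.
import statistics
from collections import defaultdict

samples = [  # (transaction, response_time_ms)
    ("login", 180), ("login", 210), ("login", 850),
    ("search", 320), ("search", 290), ("search", 340),
    ("checkout", 620), ("checkout", 710), ("checkout", 680),
]

by_txn = defaultdict(list)
for txn, ms in samples:
    by_txn[txn].append(ms)

print(f"{'transaction':<12}{'count':>6}{'mean ms':>9}{'max ms':>8}")
for txn, times in sorted(by_txn.items()):
    print(f"{txn:<12}{len(times):>6}{statistics.mean(times):>9.0f}{max(times):>8}")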
Performance testing is often considered to be an ongoing activity in that it is performed multiple times and at
all test levels (component, integration, system, system integration and acceptance testing). At the close of a
defined period of performance testing, a point of test closure may be reached where designed tests, test tool
assets (test cases and test procedures), test data and other testware are archived or passed on to other testers for
later use during system maintenance activities.
3.2 Performance Risks for Different Architectures
As mentioned previously, application or system performance varies considerably based on the architecture,
application and host environment. While it is not possible to provide a complete list of performance risks for all
systems, the list below includes some typical types of risks associated with particular architectures:
Single Computer Systems
These are systems or applications that run entirely on one non-virtualized computer. Performance can degrade due to:
• excessive resource consumption, including memory leaks (illustrated in the sketch after this list), background activities such as security software, slow storage subsystems (e.g., low-speed external devices or disk fragmentation), and operating system mismanagement
• inefficient implementation of algorithms which do not make use of available resources (e.g., main memory) and as a result execute slower than required
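As a hedged illustration of the memory-leak risk, the sketch below uses Python's tracemalloc module to watch for steady memory growth across repeated operations; the leaky function is deliberately contrived.

# Watching for monotonic memory growth, one symptom of a leak.
import tracemalloc

_cache = []

def process_request() -> None:
    _cache.append(bytearray(10_000))  # contrived leak: cache is never cleared

tracemalloc.start()
baseline, _ = tracemalloc.get_traced_memory()

for batch in range(5):
    for _ in range(1_000):
        process_request()
    current, _ = tracemalloc.get_traced_memory()
    print(f"batch {batch}: +{(current - baseline) / 1e6:.1f} MB since start")
# Figures that climb steadily across batches suggest a leak.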
Multi-tier Systems
These are systems of systems that run on multiple servers, each of which performs a specific set of tasks, such as
database server, application server, and presentation server. Each server is, of course, a computer and subject to
the risks given earlier. In addition, performance can degrade due to poor or non-scalable database design, network
bottlenecks, and inadequate bandwidth or capacity on any single server.
Distributed Systems
These are systems of systems, similar to a multi-tier architecture, but the various servers may change dynamically,
such as an e-commerce system that accesses different inventory databases depending on the geographic location
of the person placing the order. In addition to the risks associated with multi-tier architectures, this architecture
can experience performance problems due to critical workflows or dataflows to, from, or through unreliable or
unpredictable remote servers, especially when such servers suffer periodic connection problems or intermittent
periods of intense load.
Virtualized Systems
These are systems where the physical hardware hosts multiple virtual computers. These virtual machines may host
single-computer systems and applications as well as servers that are part of a multi-tier or distributed
architecture. Performance risks that arise specifically from virtualization include excessive load on the hardware
across all the virtual machines or improper configuration of the host virtual machine resulting in inadequate
resources.
Dynamic/Cloud-based Systems
These are systems that offer the ability to scale on demand, increasing capacity as the level of load increases.
These systems are typically distributed and virtualized multi-tier systems, albeit with self-scaling features designed
specifically to mitigate some of the performance risks associated with those architectures. However, there are
risks associated with failures to properly configure these features during initial setup or subsequent updates.
Client-Server Systems
These are systems running on a client that communicate via a user interface with a single server, multi-tier server,
or distributed server. Since there is code running on the client, the single computer risks apply to that code, while
the server-side issues mentioned above apply as well. Further, performance risks exist due to connection speed and
reliability issues, network congestion at the client connection point (e.g., public Wi-Fi), and potential problems
due to firewalls, packet inspection and server load balancing.
Mobile Applications
These are applications running on a smartphone, tablet, or other mobile device. Such applications are subject to
the risks mentioned for client-server and browser-based (web apps) applications. In addition, performance issues
can arise due to the limited and variable resources and connectivity available on the mobile device (which can be
affected by location, battery life, charge state, available memory on the device and temperature).
Embedded Real-time Systems
These are systems that work within or even control everyday things such as cars (e.g., entertainment systems and
intelligent braking systems), elevators, traffic signals, Heating, Ventilation and Air Conditioning (HVAC) systems,
and more. These systems often have many of the risks of mobile devices, including (increasingly) connectivity-
related issues since these devices are connected to the Internet. However, the diminished performance of a mobile
video game is usually not a safety hazard for the user, while such slowdowns in a vehicle braking system could
prove catastrophic.
Mainframe Applications
These are applications—in many cases decades-old applications—supporting often mission-critical business
functions in a data center, sometimes via batch processing. Most are quite predictable and fast when used as
originally designed, but many of these are now accessible via APIs, web services, or through their database, which
can result in unexpected loads that affect throughput of established applications.
3.3 Performance Risks Across the Software Development Lifecycle
Analyzing risks to the quality of a software product is a general process, and specific risks and considerations can be associated with particular quality characteristics, viewed from a business or a technical perspective. In this section, the focus is on performance-related risks to product quality, including ways that the process, the participants, and the considerations change.
For performance-related risks to the quality of the product, the process is:
1. Identify risks to product quality, focusing on characteristics such as time behavior, resource utilization, and
capacity.
2. Assess the identified risks, ensuring that the relevant architecture categories are addressed. Evaluate the overall level of risk for each identified risk in terms of likelihood and impact, using clearly defined criteria (see the sketch after this list).
3. Take appropriate risk mitigation actions for each risk item based on the nature of the risk item and the level of
risk.
4. Manage risks on an ongoing basis to ensure that the risks are adequately mitigated prior to release.
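A minimal sketch of step 2, using an invented three-point scale where the risk level is the product of likelihood and impact; the risk items are examples only.

# Scoring identified performance risks by likelihood and impact.
SCALE = {"low": 1, "medium": 2, "high": 3}

risks = [  # (description, likelihood, impact)
    ("Checkout p95 exceeds 3 s at peak load", "high", "high"),
    ("Nightly report batch overruns its window", "medium", "high"),
    ("Search degrades as the index grows", "medium", "medium"),
]

for desc, likelihood, impact in sorted(
        risks, key=lambda r: SCALE[r[1]] * SCALE[r[2]], reverse=True):
    level = SCALE[likelihood] * SCALE[impact]
    print(f"risk level {level}: {desc} (likelihood={likelihood}, impact={impact})")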
As with quality risk analysis in general, the participants in this process should include both business and technical
stakeholders. For performance-related risk analysis the business stakeholders must include those with a particular
awareness of how performance problems in production will actually affect customers, users, the business, and
other downstream stakeholders.
Further, the technical stakeholders must include those with a deep understanding of the performance implications
of relevant requirements, architecture, design, and implementation decisions.
The specific risk analysis process chosen should have the appropriate level of formality and rigor. For performance-related risks, it is especially important that the risk analysis process is started early and repeated regularly.
In addition, risk mitigation and management must span and influence the entire software development process,
not just dynamic testing.
Good performance engineering can help project teams avoid the late discovery of critical performance defects
during higher test levels, such as system integration testing or user acceptance testing. Performance defects found
at a late stage in the project can be extremely costly and may even lead to the cancellation of entire projects.
As with any type of quality risk, performance-related risks can never be avoided completely, i.e., some risk of
performance-related production failure will always exist. Therefore, the risk management process must include
providing a realistic and specific evaluation of the residual level of risk to the business and technical stakeholders
involved in the process.
For example, simply saying, “Yes, it’s still possible for customers to experience long delays during check out,” is
not helpful, as it gives no idea of what amount of risk mitigation has occurred or of the level of risk that remains.
Instead, providing clear insight into the percentage of customers likely to experience delays equal to or exceeding
certain thresholds will help people understand the status.
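A small sketch of that kind of reporting, using invented checkout-time samples: it computes the share of requests at or above each delay threshold.

# Expressing residual risk concretely: share of checkouts at or above
# given delay thresholds. All sample values are invented.
checkout_times_s = [1.2, 2.8, 3.4, 0.9, 5.1, 2.2, 7.8, 1.5, 3.9, 2.6,
                    4.4, 1.1, 2.0, 6.3, 2.9, 3.1, 1.8, 2.4, 8.2, 2.7]

for threshold_s in (3, 5, 10):
    share = sum(t >= threshold_s for t in checkout_times_s) / len(checkout_times_s)
    print(f"~{share:.0%} of checkouts take {threshold_s} s or longer")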
3.4 Performance Testing Activities
Performance testing activities will be organized and performed differently, depending on the type of software
development lifecycle in use.
Sequential Development Models
The ideal practice of performance testing in sequential development models is to include performance criteria as a
part of the acceptance criteria which are defined at the outset of a project. Reinforcing the lifecycle view of
testing, performance testing activities should be conducted throughout the software development lifecycle. As the
project progresses, each successive performance test activity should be based on items defined in the prior
activities as shown below.
• Concept – Verify that system performance goals are defined as acceptance criteria for the project.
• Requirements – Verify that performance requirements are defined and represent stakeholder needs correctly.
• Analysis and Design – Verify that the system design reflects the performance requirements.
• Coding/Implementation – Verify that the code is efficient and reflects the requirements and design in terms of performance.
• Component Testing – Conduct component level performance testing.
• Component Integration Testing – Conduct performance testing at the component integration level.
• System Testing – Conduct performance testing at the system level, which includes hardware, software, procedures and data that are representative of the production environment.
• System Integration Testing – Conduct performance testing with the entire system which is representative of the production environment.
• Acceptance Testing – Validate that system performance meets the originally stated user needs and acceptance criteria.
Iterative and Incremental Development Models
In these development models, such as Agile, performance testing is also seen as an iterative and incremental
activity. Performance testing can occur as part of the first iteration, or as an iteration dedicated entirely to
performance testing. However, with these lifecycle models, the execution of performance testing may be
performed by a separate team tasked with performance testing.
Continuous Integration (CI) is commonly performed in iterative and incremental software development lifecycles,
which facilitates a highly automated execution of tests. The most common objective of testing in CI is to perform
regression testing and ensure each build is stable.
Performance testing can be part of the automated tests performed in CI if the tests are designed in such a way as
to be executed at a build level.
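One hedged way to make a performance check executable at build level is a small regression gate that compares a quick measurement against the previous build's baseline. In the sketch below, the measured operation, the baseline file name, and the 10% tolerance are all assumptions, not a prescribed practice.

# A minimal build-level performance regression gate for CI.
import json
import pathlib
import sys
import time

BASELINE_FILE = pathlib.Path("perf_baseline.json")
TOLERANCE = 1.10   # assumed policy: fail the build if >10% slower

def measure_key_transaction() -> float:
    """Hypothetical quick measurement of one key transaction, in seconds."""
    start = time.perf_counter()
    sum(i * i for i in range(1_000_000))   # stand-in for the real call
    return time.perf_counter() - start

elapsed = measure_key_transaction()

if BASELINE_FILE.exists():
    baseline = json.loads(BASELINE_FILE.read_text())["elapsed_s"]
    if elapsed > baseline * TOLERANCE:
        print(f"FAIL: {elapsed:.3f} s vs baseline {baseline:.3f} s")
        sys.exit(1)   # non-zero exit fails the CI build; baseline is kept
    print(f"PASS: {elapsed:.3f} s within tolerance of {baseline:.3f} s")
else:
    print(f"No baseline found; recording {elapsed:.3f} s")

BASELINE_FILE.write_text(json.dumps({"elapsed_s": elapsed}))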
However, unlike functional automated tests, there are additional concerns, such as the following:
• The setup of the performance test environment – This often requires a test environment that is available on demand, such as a cloud-based performance test environment.
• Determining which performance tests to automate in CI – Due to the short timeframe available for CI tests, CI performance tests may be a subset of more extensive performance tests that are conducted by a specialist team at other times during an iteration.
• Creating the performance tests for CI – The main objective of performance tests as part of CI is to ensure a change does not negatively impact performance. Depending on the changes made for any given build, new performance tests may be required.
• Executing performance tests on portions of an application or system – This often requires the tools and test environments to be capable of rapid performance testing, including the ability to select subsets of applicable tests.
Performance testing in the iterative and incremental software development lifecycles can also have its own lifecycle
activities:
• Release Planning – In this activity, performance testing is considered from the perspective of all iterations in a release, from the first iteration to the final iteration. Performance risks are identified and assessed, and mitigation measures planned. This often includes planning of any final performance testing before the release of the application.
• Iteration Planning – In the context of each iteration, performance testing may be performed within the iteration and as each iteration is completed. Performance risks are assessed in more detail for each user story.
• User Story Creation – User stories often form the basis of performance requirements in Agile methodologies, with the specific performance criteria described in the associated acceptance criteria. These are referred to as "nonfunctional" user stories.
• Design of Performance Tests – Performance requirements and criteria described in particular user stories are used for the design of tests.
• Coding/Implementation – During coding, performance testing may be performed at a component level. An example of this would be the tuning of algorithms for optimum performance efficiency.
• Testing/Evaluation – While testing is typically performed in close proximity to development activities, performance testing may be performed as a separate activity, depending on the scope and objectives of performance testing during the iteration.
• Delivery – Since delivery will introduce the application to the production environment, performance will need to be monitored to determine if the application achieves the desired levels of performance in actual usage.
Commercial Off-the-Shelf (COTS) and other Supplier/Acquirer Models
Many organizations do not develop applications and systems themselves, but instead are in the position of
acquiring software from vendor sources or from open-source projects.
In such supplier/acquirer models, performance is an important consideration that requires testing from both the
supplier (vendor/developer) and acquirer (customer) perspectives.
Regardless of the source of the application, it is often the responsibility of the customer to validate that the
performance meets their requirements.
In the case of customized vendor-developed software, performance requirements and associated acceptance
criteria should be specified as part of the contract between the vendor and customer.
In the case of COTS applications, the customer has sole responsibility to test the performance of the product in a
realistic test environment prior to deployment.
Sample Questions Pattern
Source: istqb.org
1. When applying the principal performance testing activities, when should the test cases be ordered into performance test procedures?
Select ONE option.
Answer Set
a. Test planning
b. Test analysis and design
c. Test implementation and execution
d. Test closure
2. Consider the following technical environments:
1. Virtualized
2. Dynamic/Cloud-based
3. Client/Server and Browser-based
4. Mobile
5. Embedded
6. Mainframe
Which of these is most likely to have a performance risk due to memory leaks?
Select ONE option.
Answer Set
a. 1, 2, 3, 6
b. 2, 3, 4, 5
c. 1, 2, 4, 6
d. 1, 3, 4, 5
3. You are working on a project that tracks health history information for patients across a region. The number of
records handled by the system is in the millions due to the large number of patients in the region. Patient
information must be accessible to doctors in offices, hospitals and urgent care facilities. The information should be
presented to the requestor within three seconds of request, particularly for patients with critical allergies and
preconditions.
Given this information, when is the best time in the project to analyze and assess the performance risks?
Select ONE option.
Answer Set
a. During the requirements phase and again just prior to executing the performance tests
b. After design but prior to coding
c. During system testing and again prior to the performance tests
d. Repeatedly throughout the requirements, development and performance testing