PROSA
A framework for Failure Prediction through Online Testing

Osama Sammodi

                S‐Cube Industrial Workshop
                 24.02.2012, Thales, Paris




University of Duisburg‐Essen (UniDue)
Agenda




• PROSA: Online Testing & Failure Prediction
  Framework
• Evaluation
• PROSA and fi‐ware
• Benefits and Use Cases
Current Situation
Failure Prediction
 • Web services provide opportunities for building highly dynamic
   systems by integrating 3rd-party services
    – These services are under the sole control of their providers
    – They may behave in ways not anticipated at design time
      (e.g., QoS degradation such as reduced performance or low reliability)

 • Online failure prediction allows anticipating deviations from the
   expected QoS
    – And thus planning and implementing proactive repair or compensation
      activities
Goal
Accurate Failure Prediction
 Inaccurate failure prediction leads to:
    – False positives: a failure is predicted, although the service turns
      out to work when invoked
    – False negatives: no failure is predicted, although the service turns
      out to fail when invoked

 Consequences include:
    – Higher operational cost
      (e.g., use of an expensive alternative service)
    – Performance issues (in the worst case, leaving less
      time to address true failures)
    – Failures and financial losses (e.g., use of a “buggy”
      service)
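These two error types can be quantified with the usual confusion-matrix counts. A minimal sketch (function and variable names are illustrative; the F-measure used in the evaluation slides is assumed to be the standard harmonic mean of precision and recall):

```python
def prediction_accuracy(predictions, outcomes):
    """Count false positives/negatives and derive an F-measure.

    predictions[i] is True if a failure was predicted for invocation i;
    outcomes[i] is True if the service actually failed when invoked.
    """
    tp = sum(p and o for p, o in zip(predictions, outcomes))
    fp = sum(p and not o for p, o in zip(predictions, outcomes))  # false alarms
    fn = sum(o and not p for p, o in zip(predictions, outcomes))  # missed failures
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return fp, fn, f_measure
```

The ΔF reported in the evaluation slides then presumably corresponds to the F-measure achieved with online testing minus the F-measure achieved with monitoring alone.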
Failure Prediction & Monitoring
Problem
 • Monitoring (prominent for SBAs)
    – Observes the software during its current execution
      (i.e., actual use / operation)
    – The end-user interacts with the system

   [Figure: the end-user provides input to the running system and
    receives output]

    Monitoring cannot guarantee comprehensive / timely coverage of
     the “test object”
    It can thus reduce the accuracy of failure prediction

 [For more details, see deliverable JRA-1.3.1; S-Cube KM]
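A usage rate in this sense can be read as the density of end-user invocations a service sees in a recent time window; the exact definition used in PROSA is not given on the slide, so the following is an assumed illustration:

```python
def usage_rate(invocation_timestamps, now, window):
    """Invocations of a service observed during the last `window`
    seconds, normalised by the window length.

    Assumed definition for illustration: a rarely used service has a
    low rate, so monitoring alone yields few fresh QoS observations
    for the predictor.
    """
    recent = [t for t in invocation_timestamps if now - window <= t <= now]
    return len(recent) / window
```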
Failure Prediction & OT
Solution
 Solution: Online Testing (OT) = extend testing to the operation phase

 “Actively (& systematically) execute services in parallel
  to their normal use in the SBA”

 [Figure: the SBA lifecycle — Requirements Engineering, Design,
  Realization, Deployment & Provisioning, Operation & Management, and
  Evolution — together with the adaptation cycle (Identify Adaptation
  Need, Identify Adaptation Strategy, Enact Adaptation); testing extends
  into the operation phase, where a tester feeds input to the services
  and observes their output]
PROSA
Failure Prediction Framework
 [Figure: the PROSA Framework — an Online Testing Module (Online Testing
  Engine plus Test Input Repository; parameters: usage rate threshold,
  online test rate, time interval), a Monitoring Module (Monitoring
  Engine collecting QoS and usage data into the Monitoring Data store),
  and a Prediction Module (Prediction Engine with a Prediction Model) —
  attached to the services of the service-oriented system]

 • Online testing
    – Triggers testing of the services based on their usage rates
 • Monitoring
    – Performs the actual observation of QoS data
      for both testing and monitoring
 • Prediction
    – Uses the combined monitoring and testing results (QoS data) for
      predicting failures
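The triggering idea — actively test a service whenever its usage rate falls below a threshold, so prediction never starves for data — can be sketched as follows. The names, the below-threshold rule, and the invoke callback are assumptions; the slide does not fix the engine's exact scheduling:

```python
import random

def online_testing_step(usage_rates, threshold, test_inputs, invoke):
    """One pass of an (assumed) online-testing engine: each service whose
    recent usage rate is below the threshold gets one dedicated test
    invocation, producing fresh QoS observations for the predictor.
    """
    qos_data = []
    for service, rate in usage_rates.items():
        if rate < threshold:  # monitoring alone yields too little data
            test_input = random.choice(test_inputs[service])
            response_time = invoke(service, test_input)  # actual test call
            qos_data.append((service, response_time))
    return qos_data
```

In PROSA the resulting QoS data would flow into the same store as the monitoring observations, so the prediction engine sees one combined history per service.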
Agenda




• PROSA: Online Testing & Failure Prediction
  Framework
• Evaluation
• PROSA and fi‐ware
• Benefits and Use Cases
Experiments
Goal
 • To understand the accuracy gains achieved by OT in a
   broad setting

 • To analyse the factors that can influence the gains in
   prediction accuracy achieved by OT:
    – Usage rate
    – Prediction model
    – Online test rate
Experiments
Results
 • Influence of usage rate and online test rate
    – Results using the Last prediction model and a 25% failure rate

 [Figure: three plots of ΔF — the difference in F-measure between OT and
  monitoring (OT−M), ranging from 0.00 to 0.25 — against usage rates of
  0.05 to 0.20, for online test rates of 0.15, 0.30, and 0.60]
Experiments
  Results
 • Influence of prediction model
    – Results using a 0.15 online test rate and a 25% failure rate

 [Figure: four plots of ΔF — the difference in F-measure between OT and
  monitoring (OT−M), ranging from 0.00 to 0.25 — against usage rates of
  0.05 to 0.20, one per prediction model:
    Last   – the last observed value is the prediction value
    BM(10) – arithmetic average of the last 10 points
    BM(5)  – arithmetic average of the last 5 points
    SEM    – simple exponential smoothing]
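The four prediction models compared above can be sketched over a QoS history (e.g., observed response times) as follows; this is a minimal illustration, and the SEM smoothing factor is an assumption, since the slide does not state it:

```python
def predict_last(history):
    """Last: the most recent observation is the prediction."""
    return history[-1]

def predict_bm(history, k):
    """BM(k): arithmetic average of the last k observations."""
    window = history[-k:]
    return sum(window) / len(window)

def predict_sem(history, alpha=0.5):
    """SEM: simple exponential smoothing over the full history."""
    s = history[0]
    for x in history[1:]:
        s = alpha * x + (1 - alpha) * s
    return s
```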
Agenda




• PROSA: Online Testing & Failure Prediction
  Framework
• Evaluation
• PROSA and fi‐ware
• Benefits and Use Cases
PROSA & fi‐ware
 [Figure: fi-ware development phases — Analysis and Design, Construction
  and Testing, Deployment, and Execution and Monitoring. Performance
  testing: the Trace Analyzer and the Software Performance Cockpit cover
  the earlier phases, while PROSA covers Execution and Monitoring
  (on-going work). Deployment: the ENG Deployment Tool (for IaaS/PaaS
  deployment)]

 Future Internet Core Platform: http://www.fi‐ware.eu/
Agenda




• PROSA: Online Testing & Failure Prediction
  Framework
• Evaluation
• PROSA and fi‐ware
• Benefits and Use Cases
Benefits and Use Cases
PROSA Constant availability of QoS data
Use Cases:
•   Accurate failure prediction
•   Integration in FI application core platform (fi‐ware)
•   Combining/using PROSA with approaches for failure 
    prediction of a composite service
     – UniDue’s approach (SPADE) : e‐2‐e requirement violation 
       prediction through runtime verification (see S‐Cube Virtual 
       Campus)




                                                                      SLA violation 
                                                                       prediction
• Avoiding unnecessary costs
• Avoiding unnecessary repair/        Failure
  compensation efforts              Prediction


• Avoiding SLA violations for service 
  providers                                                               15
Thank You!


References
[Sammodi et al. 2011] O. Sammodi, A. Metzger, X. Franch, M. Oriol, J. Marco, and K. Pohl. Usage‐based online testing for proactive adaptation of service‐based applications. In COMPSAC 2011.
[Metzger 2011] A. Metzger. Towards Accurate Failure Prediction for the Proactive Adaptation of Service‐oriented Systems (Invited Paper). In ASAS@ESEC 2011.
[Salfner et al. 2010] F. Salfner, M. Lenk, and M. Malek. A survey of online failure prediction methods. ACM Comput. Surv., 42(3), 2010.
[PO‐JRA‐1.3.1] S‐Cube deliverable PO‐JRA‐1.3.1: Survey of Quality Related Aspects Relevant for Service‐based Applications; http://www.s‐cube‐network.eu/results/deliverables/wp‐jra‐1.3
[S‐Cube KM] S‐Cube Knowledge Model: http://www.s‐cube‐network.eu/knowledge‐model
