A 5-step methodology for complex E&P data management

Raising data management standards

www.etlsolutions.com
The increasing complexity of E&P data

New devices are being used in every phase of Exploration & Production (E&P) in the Oil & Gas industry, gathering more data with which better decisions can be made.

Timescales are collapsing. Once, drilling and logging data were distinct activities separated by days, but now they happen simultaneously.

These changes are being factored into the development of industry standards (such as PPDM), driving their evolution to ensure continued use.

Metadata (in the Dublin Core and ISO 19115 sense) are becoming ever more important in providing context. This has a direct impact on proprietary database design and functionality.

The price of progress is growing data complexity.
A 5-step methodology for managing this data

To make a robust and repeatable approach work, we use Transformation Manager, our data integration toolset. The Transformation Manager software is coupled with the approach we have adopted over many years in the Oil & Gas industry.

The result is a five-stage methodology.
Step 1

Separate the source and target data models and the logic that lies between them.

• This means that we can isolate the pure model structure and clearly see the elements, attributes and relationships in each model.
• We can also see details such as database primary keys and comments.
• As exposing relationships is the key to handling PPDM and other highly normalized models, this is a critical step.
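The idea of isolating the pure model structure can be sketched as follows. This is a minimal illustration, not Transformation Manager's actual representation, and the entity and attribute names are hypothetical rather than taken from the real PPDM schema:

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    is_primary_key: bool = False
    comment: str = ""

@dataclass
class Entity:
    name: str
    attributes: list = field(default_factory=list)
    relationships: list = field(default_factory=list)  # (role, target entity name)

# A fragment of a hypothetical PPDM-like source model, held separately
# from any transformation logic that will later map it to a target.
well = Entity("WELL", [Attribute("UWI", is_primary_key=True, comment="Unique well identifier")])
wellbore = Entity("WELLBORE", [Attribute("UWI", is_primary_key=True)])
well.relationships.append(("has_many", "WELLBORE"))

def related_entities(entity):
    # With the pure structure isolated, relationships can be inspected directly.
    return [target for _, target in entity.relationships]
```

Because the model is captured on its own, details such as primary keys and comments are visible without wading through transformation code.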
Step 2

  Separate the model from the mechanics of data storage.


• The mechanics define physical characteristics such as ‘this is an Oracle database’ or ‘this flat file uses a particular delimiter or character set’. It is the model that tells us things like ‘a well can have many bores’, ‘a wellbore can have many logs’, and that ‘log trace mnemonics’ are catalogue-controlled.
• At a stroke, this separation abolishes a whole category of complexity.
• For both source and target we need a formal data model, because this enables us to read from or write to a database, XML, flat file, or any other data format.
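The separation can be illustrated with a toy sketch (hypothetical reader classes, not part of any real toolkit): the storage mechanics live behind a small reader interface, while the model-level logic is written once and works against any reader.

```python
class FlatFileReader:
    """Storage mechanics for a delimited flat file."""
    def __init__(self, text, delimiter=","):
        self.text, self.delimiter = text, delimiter
    def records(self):
        for line in self.text.strip().splitlines():
            uwi, bore = line.split(self.delimiter)
            yield {"uwi": uwi, "bore": bore}

class InMemoryDbReader:
    """Storage mechanics for rows already fetched from a database."""
    def __init__(self, rows):
        self.rows = rows
    def records(self):
        yield from self.rows

def bores_per_well(reader):
    # Model-level logic ('a well can have many bores'): the same code
    # runs regardless of the physical storage format behind the reader.
    counts = {}
    for rec in reader.records():
        counts[rec["uwi"]] = counts.get(rec["uwi"], 0) + 1
    return counts
```

Swapping the flat file for a database changes only which reader is constructed; the model-level logic is untouched.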
Step 3

    Specify relationships between source and target.

•   In all data integration projects, determining the rules for the data transfer is a fundamental requirement, usually defined by analysts working in this field, often using spreadsheets.
•   Based on these or other forms of specification, we can create the integration components in Transformation Manager using its descriptive mapping language. This enables us to create a precisely defined description of the link between the two data models.
•   From this we can generate a runtime system which will execute the formal definitions. Even if we choose not to create an executable link, the formal definition of the mappings is still useful: it shows where the complexity in the PPDM integration lies, and the formal syntax can be shared with others to verify our interpretation of their rules.
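The flavour of a declarative mapping can be sketched as below. This is a toy notation invented for illustration, not Transformation Manager's actual mapping language; the field names are hypothetical.

```python
# Each rule names a source field, a target field, and a transform to apply.
# Keeping the rules as data makes the source-to-target link easy to review
# and share, separately from the engine that executes it.
MAPPING = [
    ("well_name", "WELL.WELL_NAME", str.strip),
    ("td_metres", "WELL.FINAL_TD",  float),
]

def apply_mapping(source_record, mapping):
    # A generic runtime: executes whatever formal rules it is given.
    target = {}
    for src, tgt, transform in mapping:
        target[tgt] = transform(source_record[src])
    return target
```

The point is the split: the rules are a reviewable specification, and the runtime that executes them is generated (or generic) code.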
Step 4
    Follow an error detection procedure.
•   To ensure that only good data is stored, Transformation Manager has a robust
    process of error detection that operates like a series of filters. For each phase, we
    detect errors relevant to that phase and we don't send bad data to the next
    phase, where detection becomes even more complex.
•   We detect mechanical and logical errors separately. If the source is a flat file, a
    mechanical error could be malformed lines; logical errors could include dangling
    foreign key references or missing data values.
•   Next, we can detect errors at the mapping level: inconsistencies that are a consequence of the map itself. Here, for example, we could detect that we are trying to load production data for a source well which does not exist in the target.
•   Finally, there are errors where the data is inconsistent with the target logical model. Here, simple tests (a string value is too long, a number is negative) can often be constructed automatically from the model. More complex tests (wellbores cannot curve so sharply, these production figures are for an abandoned well) are built using the semantics of the model.
•   A staging store is very useful in providing an isolated area where we can disinfect the data before letting it out onto a master system. Staging stores were an integral part of the best-practice data loaders we helped build for a major E&P company, and it is now common practice to hold problem records there until the issues are resolved.
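The series-of-filters idea can be sketched in a few lines (a hypothetical two-phase example, not the actual Transformation Manager pipeline): each phase only sees data that passed the cheaper checks before it.

```python
def mechanical_check(line, delimiter=","):
    # Phase 1: a well-formed line has exactly two delimited fields.
    return line.count(delimiter) == 1

def logical_check(record, known_wells):
    # Phase 2: reject dangling references to wells we do not know about.
    return record["uwi"] in known_wells

def run_phases(lines, known_wells):
    good, errors = [], []
    for line in lines:
        if not mechanical_check(line):
            errors.append(("mechanical", line))
            continue  # bad data never reaches the next, costlier phase
        uwi, value = line.split(",")
        record = {"uwi": uwi, "value": value}
        if not logical_check(record, known_wells):
            errors.append(("logical", line))
            continue
        good.append(record)
    return good, errors
```

Rejected records are tagged with the phase that caught them, which is exactly the kind of material a staging store holds until the issues are resolved.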
Step 5

Execute a runtime link to generate the code required to perform the integration.

• This will generate integration components, in the form of Java
  code, which can reside anywhere in the architecture.
• This could be on the source, target or any other system to
  manage the integration between PPDM and non-PPDM data
  sources.
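The idea of generating a deployable component from the mapping can be sketched as follows. The real tool emits Java; this toy generator emits Python purely to keep the illustration self-contained, and the rule format is the same hypothetical one used above.

```python
def generate_transform(mapping):
    # Turn mapping rules into the source code of a standalone function.
    # The generated artifact can then be shipped to and run on the source,
    # the target, or any other system in the architecture.
    lines = ["def transform(record):", "    target = {}"]
    for src, tgt in mapping:
        lines.append(f"    target[{tgt!r}] = record[{src!r}]")
    lines.append("    return target")
    return "\n".join(lines)

source_code = generate_transform([("uwi", "WELL.UWI")])
namespace = {}
exec(source_code, namespace)  # compile the generated component in place
```

Once generated, the component needs nothing from the mapping tool at runtime, which is what lets it reside anywhere in the architecture.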
Our offerings: E&P data management

• Transformation Manager software
• Transformation Manager data loader developer kits
• Support, training and mentoring services
• Data loader and connector development
• Data migration packaged services
Why Transformation Manager?

For the user:
• Everything under one roof
• Greater control and transparency
• Identify and test against errors iteratively
• Greater understanding of the transformation requirement
• Automatic documentation
• Re-use and change management
• Uses domain-specific terminology in the mapping
Why Transformation Manager?

For the business:
• Reduces cost and effort
• Reduces risk in the project
• Delivers higher quality and reduces error
• Increases control and transparency in the development
• Single product
• Reduces time to market
Contact information
   Karl Glenn
   kg@etlsolutions.com
   +44 (0) 1912 894040




