1
What I Learned at Gartner
Summit 2019
Ulf Mattsson www.TokenEx.com
What is tokenization in Blockchain?
2
Ulf Mattsson, BIO
+ Mr. Mattsson is the inventor of 73 patents in the area of Cybersecurity.
+ He managed joint R&D projects with research and development teams at IBM, Microsoft, Hewlett-Packard, Oracle,
Teradata, and RSA Security (Dell).
+ Mr. Mattsson is currently the Head of Innovation at TokenEx, a cloud-based data security company; he was previously Chief Technology Officer at Atlantic BT Security Solutions, and earlier Chief Technology Officer at Compliance Engineering.
+ He was the Chief Technology Officer and a technology founder of Protegrity.
+ Prior to Protegrity, Mr. Mattsson worked 20 years at IBM's Research and Development organization, in the areas of
Application development, Databases and Security.
+ He also worked at companies providing Data Discovery Services, Cloud Application Security Brokers, Web
Application Firewalls, Managed Security Service, Security Operation Center, and Cybersecurity consulting.
+ Mr. Mattsson also advises in the area of AI and Machine learning technology.
+ He owns and manages the BrightTALK “Cybersecurity - The No Spin Zone” and “The Blockchain Channel.”
3
Ulf Mattsson
cont. …
4
Blockchain
5
What does Blockchain Offer? (Gartner, 2019)
6
Blockchain Strengths, Weaknesses, Opportunities and Threats (SWOT), Gartner
7
Gartner Forecast: Blockchain Business Value, Worldwide
8
Board-Level Opinions on Blockchain and Digital Currencies, Gartner
9
The Gartner Top 10 Strategic Technology Trends for 2019
Exploiting AI in the Development Process
10
Blockchain Security
11
Source: IBM
Blockchain Security – What keeps your transaction data safe?
12
Source: IBM
Blockchain is decentralized
13
Source: IBM
Blockchain is virtually impossible to hack
14
Source: IBM
Blockchains can be private or public
15
Source: IBM
Blockchain offers validation, encryption and potentially tokenization
16
Blockchain & Tokenization
17
The idea behind asset tokenization
• It allows converting the rights to assets with economic value into a digital token.
• Such tokens can be stored and managed on a blockchain network.
• Tokenization on Blockchain was a steady trend of 2018.
• It seems that everything is being tokenized on Blockchain, from paintings, diamonds, and company stocks to real estate.
• Let us forget about Blockchain and smart contracts for a moment.
• Imagine you want to invest in real estate but your initial investment is modest — say
$5,000.
• Perhaps you want to start small and increase your investment gradually.
• For instance, you decide to invest a couple thousand every three or four months.
• Obviously, with the traditional real estate market this is quite awkward to do.
• How are you supposed to buy two or three square meters in an apartment?
18
The idea behind asset tokenization
• Let us reverse the situation.
• Imagine that you have some property — say an apartment.
• You need cash quickly.
• The apartment is valued at $150,000 but you just need $10,000.
• Can you do this quickly without much friction?
• To the best of my knowledge, this is next to impossible.
19
Enter tokenization
• Tokenization is a method that converts rights to an asset into a digital token
• Suppose there is a $200,000 apartment
• Tokenization can transform this apartment into 200,000 tokens (the number is totally arbitrary; we could have issued 2 million tokens)
• Thus, each token represents a 0.0005% share of the underlying asset
• Finally, we issue the token on some sort of a platform supporting smart contracts,
for example on Ethereum, so that the tokens can be freely bought and sold on
different exchanges
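The arithmetic above can be sketched in a few lines of Python (the figures are the slide's illustrative example, not real market data):

```python
# Fractional ownership via tokenization: the $200,000 apartment example.
ASSET_VALUE = 200_000        # appraised value of the apartment, in USD
TOKEN_SUPPLY = 200_000       # arbitrary choice; 2,000,000 would work equally well

share_per_token = 1 / TOKEN_SUPPLY            # fraction of the asset per token
price_per_token = ASSET_VALUE / TOKEN_SUPPLY  # USD per token

print(f"Each token = {share_per_token:.6%} of the asset")  # 0.000500%
print(f"Each token costs ${price_per_token:.2f}")          # $1.00

# A $5,000 investor simply buys 5,000 tokens:
tokens_for_budget = 5_000 / price_per_token
print(f"A $5,000 budget buys {tokens_for_budget:,.0f} tokens")
```

Issuing 2 million tokens instead would only change the granularity: each token would then represent a ten-times-smaller share at a ten-times-smaller price.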
20
New Requirements
from Regulations
21
Pseudonymisation Under the GDPR
Within the text of the GDPR, there are multiple references to
pseudonymisation as an appropriate mechanism for protecting personal
data.
Pseudonymisation (replacing identifying or sensitive data with pseudonyms) is synonymous with tokenization (replacing identifying or sensitive data with tokens).
Article 4 – Definitions
• (1) ‘personal data’ means any information relating to an identified
or identifiable natural person (‘data subject’); …such as a name, an
identification number, location data, an online identifier…
• (5) ‘pseudonymisation’ means the processing of personal data in such a manner that the data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately…
What is Personal Data according to GDPR?
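As a minimal sketch of the equivalence described above, the code below replaces an identifier with a random token and keeps the re-identification mapping separately, as Article 4(5) requires. The function names and vault structure are illustrative, not any product's API:

```python
import secrets

# The token vault is the "additional information kept separately" in GDPR terms.
# In production it would live in a hardened, access-controlled store.
_vault: dict[str, str] = {}

def pseudonymise(value: str) -> str:
    """Replace an identifier with a random token; keep the mapping in the vault."""
    token = secrets.token_hex(8)
    _vault[token] = value
    return token

def re_identify(token: str) -> str:
    """Recover the original value; only possible with access to the vault."""
    return _vault[token]

token = pseudonymise("Joe Smith")
assert token != "Joe Smith"              # the token reveals nothing about the name
assert re_identify(token) == "Joe Smith" # reversal requires the separate vault
```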
22
GDPR
23
Example of Cross Border Data-centric Security
(Diagram: data sources feeding a Data Warehouse in Italy, with complete policy-enforced de-identification of sensitive data across all bank entities)
• Protecting Personally Identifiable Information
(PII), including names, addresses, phone, email,
policy and account numbers
• Compliance with EU Cross Border Data
Protection Laws
• Utilizing Data Tokenization, and centralized
policy, key management, auditing, and
reporting
24
Gartner Hype Cycle for Data Security
(Highlighting Data Classification and Blockchain for Data Security)
25
GDPR Security Requirements – Encryption and Tokenization
(Diagram: Discover Data Assets; Encryption and Tokenization; Security by Design)
Source: IBM
26
Data Minimization
• Increasingly, organizations are adopting data minimization strategies for security and privacy reasons. By deleting or reducing inessential, duplicate, or unused data, organizations can minimize potential attack vectors.
• Unlike prior discovery tools, BigID can not only quickly report on duplicate data but also provide residency and usage detail, so minimization strategies can be based on secondary factors like jurisdiction and activity history.
• BigID is transforming enterprise protection and privacy of personal data.
• Organizations are facing record breaches of personal information and proliferating global
privacy regulations with fines reaching 10% of annual revenue.
• Today enterprises lack dedicated, purpose-built technology to help them track and govern their customer data.
• By bringing data science to data privacy, BigID aims to give enterprises the software to
safeguard and steward the most important asset organizations manage: their customer
data.
Source: BigID (TokenEx partner)
27
ML Driven Data Classification
• The definition of sensitive data is no longer readily encapsulated in a
regular expression.
• Increasingly, companies need to classify data that is sensitive based on its context to a person, or to a thing like a patent or an account.
• This requires a new approach to classification that can identify contextually
sensitive data across all modern data stores - unstructured, structured, Big
Data, Cloud and enterprise applications like SAP.
• BigID provides a first-of-its-kind approach that combines Machine Learning and Contextual Intelligence to deliver on advanced data classification, categorization, cataloging and correlation for privacy.
Source: BigID (TokenEx partner)
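A tiny illustration of the limitation described above: a regular expression matches the shape of a US SSN but cannot tell a real SSN from a lookalike reference number, which is where context-aware classification comes in. This is an illustrative sketch, not BigID's algorithm:

```python
import re

# Classic pattern-based classifier: matches anything shaped like a US SSN.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

records = [
    "Customer SSN: 076-39-2778",         # genuinely sensitive personal data
    "Order ref: 123-45-6789 (invoice)",  # same shape, but not personal data
]

for text in records:
    print(text, "->", bool(SSN_PATTERN.search(text)))
# Both lines match: the pattern alone cannot distinguish them,
# so context (who or what the value relates to) is needed.
```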
28
ML-Driven Classification
• Traditional pattern matching approaches to discovery and classification still
struggle with accurately identifying contextually sensitive data like Personal
Information (PI) and disambiguating similar looking information.
• Moreover, regular expression based classifiers which predominate in data
loss prevention, database activity monitoring, and data access governance
products tend to operate on a limited number of data sources, like
relational databases or on-prem unstructured file shares.
• BigID leverages machine learning to classify, categorize and compare data and files across structured, unstructured, semi-structured and Big Data in the cloud or on-prem.
• BigID can resolve similar-looking entities and build association graphs to correlate data back to a specific entity or person - essential for meeting emerging privacy use cases like personal data rights.
Source: BigID (TokenEx partner)
29
Correlation plus classification
• Even with AI and ML classification approaches like clustering or random forests, classifiers can improve accuracy through smarter matching and comparison analysis - but they lack the context to understand whom the data relates to.
• This is a common problem for privacy requirements and regulated
industries. The capability to build a graph of connected or relevant data can
be characterized as a correlation problem.
• Correlation helps an organization find sensitive data because of its
association to other sensitive data.
• BigID provides a first-of-its-kind model that can not only match similar data within the same class based on ML analysis, but also match connected data of different classes based on relevancy and connectedness.
• This correlation-based classification is critical to privacy.
Source: BigID (TokenEx partner)
30
Cataloging plus Classification
• BigID's ML-based classifiers use advanced AI techniques to match data
within a class and also correlate data of different classes that have a
common sensitivity level owing to a shared association.
• But there is a third way sensitivity can be measured: most data also has certain attributes associated with it, such as date of creation, last modification, ownership and access details.
• Unlike traditional classifiers, BigID can also integrate meta-data analysis to
provide a richer view of the data and its usage.
• This meta-data input can be used to better and more automatically catalog
data for easier discovery via search as well as measure sensitivity risk.
• The combination of intelligent classification, correlation and cataloging gives organizations the unique ability to find, inventory and map sensitive data by additional dimensions beyond just data class or category.
• These include finding data by person, residency, application and ownership.
Source: BigID (TokenEx partner)
31
Intelligent labeling and tagging
• Enforcement of security protection and privacy compliance requires data risk and
sensitivity knowledge.
• BigID helps organizations understand data sensitivity through advanced ML-based
classification, correlation and cataloging to provide a complete view of data.
• To simplify enforcement on classified data, BigID enables customers to
automatically assign data tags for files and objects.
• These classification tags can be consumed through Microsoft's Azure Information
Protection framework as policy labels, BigID's labeling APIs or additional
frameworks like Box.
• Using these labels, organizations can classify or categorize data - such as Highly Sensitive - as well as Personal Data based on privacy, health or financial services compliance mandates.
• These tags can then be used for more granular policy enforcement actions by DLP,
information rights management, database activity monitoring or other
enforcement products.
Source: BigID (TokenEx partner)
32
Tokens in Digital Business Ecosystems
33
Main Purpose of Tokens in Digital Business Ecosystems (Value Proposition)
While a large proportion of new token use cases focuses on monetary value representation enabled by
blockchain technology, tokenization will achieve its real potential with value creation. An example of such value
creation is enabling the design of new markets for data assets, autonomous organizations and labor.
34
Encryption & Tokenization
35
36
What is the difference?
• Encryption - A data security measure using mathematical algorithms to generate rule-based values in place of original data
• Tokenization - A data security measure using randomized values in place of original data
Encryption alone is not a full solution
• With encryption, sensitive data remains in business systems. With tokenization, sensitive data is removed completely from business systems and
securely vaulted.
Tokens are versatile
• Format-preserving tokens can be utilized where masked information is required
Encryption vs Tokenization
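The contrast can be made concrete: an encrypted value is mathematically derived from the plaintext and reversible with the key, while a token is random and maps back only via a vault lookup. A minimal sketch, using a toy XOR cipher purely for illustration (real systems use AES or similar):

```python
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy cipher for illustration only: the output is a mathematical
    function of the input, so anyone holding the key can reverse it."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

vault: dict[str, bytes] = {}

def tokenize(data: bytes) -> str:
    """The token is random: there is no mathematical link to the data."""
    token = secrets.token_hex(8)
    vault[token] = data  # the sensitive value leaves the business system
    return token

key = secrets.token_bytes(16)
pan = b"3678228939073378"

ciphertext = xor_encrypt(pan, key)
assert xor_encrypt(ciphertext, key) == pan  # reversible with the key alone

token = tokenize(pan)
assert vault[token] == pan                  # reversal needs the vault, not math
```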
37
Examples of Protected Data

Field          | Real Data                          | Tokenized / Pseudonymized
Name           | Joe Smith                          | csu wusoj
Address        | 100 Main Street, Pleasantville, CA | 476 srta coetse, cysieondusbak, CA
Date of Birth  | 12/25/1966                         | 01/02/1966
Telephone      | 760-278-3389                       | 760-389-2289
E-Mail Address | joe.smith@surferdude.org           | eoe.nwuer@beusorpdqo.org
SSN            | 076-39-2778                        | 076-28-3390
CC Number      | 3678 2289 3907 3378                | 3846 2290 3371 3378
Business URL   | www.surferdude.com                 | www.sheyinctao.com

Fingerprint, Photo, X-Ray: Encrypted

Healthcare / Financial Services: Dr. visits, prescriptions, hospital stays and discharges, clinical, billing, etc.; Financial Services consumer products and activities. Protection methods can be equally applied to the actual data, but are not needed with de-identification.
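The tokens in the table are format-preserving: digits map to digits, letters to letters, and separators stay put, so downstream systems keep working. A minimal sketch of that idea, using simple random substitution rather than TokenEx's actual algorithm:

```python
import random
import string

def format_preserving_token(value: str) -> str:
    """Replace each digit with a random digit and each letter with a random
    lowercase letter, leaving punctuation (dashes, dots, @) untouched."""
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(random.choice(string.digits))
        elif ch.isalpha():
            out.append(random.choice(string.ascii_lowercase))
        else:
            out.append(ch)
    return "".join(out)

print(format_preserving_token("076-39-2778"))             # keeps the SSN shape
print(format_preserving_token("joe.smith@surferdude.org"))  # keeps the e-mail shape
```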
38
How Should I Secure Different Types of Data?
(Diagram mapping type of data, structured vs. un-structured, and use case complexity, simple to complex, to protection methods: Tokenization of Fields for PCI Card Holder Data, PHI (Protected Health Information) and PII (Personally Identifiable Information); Encryption of Files for un-structured data)
39
Balance Risk
40
User Productivity, Creativity and Data
(Chart: access to data, low to high, plotted against user productivity, low to high)
41
Risk Adjusted Data Security – Tokenized Data
(Chart: access to tokenized data, low to high, plotted against risk exposure and against user productivity and creativity)
42
Data Security Approaches
(Chart: data utility versus data protection, from max utility / min protection to min utility / max protection, across minimization, devaluation/pseudonymisation, data hashing/masking, and encryption)
Source: TokenEx
43
Reduction of Pain with Different Protection Techniques
(Timeline of pain and TCO, high to low, for the input value 3872 3789 1620 3675)
• ~1970: Strong encryption (AES, 3DES); output: !@#$%a^.,mhu7///&*B()_+!@
• ~2000: Format Preserving Encryption (DTP, FPE); output: 8278 2789 2990 2789
• ~2005: Vault-based Tokenization; output: 8278 2789 2990 2789
• ~2010: Vaultless Tokenization; output: 8278 2789 2990 2789; format preserving, greatly reduced key management, no vault
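Vaultless tokenization removes the lookup database by deriving the token deterministically from the input and a secret key, for example with a keyed hash. A simplified sketch; production systems use format-preserving constructions and managed keys, not a bare HMAC with a hard-coded secret:

```python
import hashlib
import hmac

SECRET = b"demo-secret"  # in production: a managed key, never a literal

def vaultless_token(pan: str) -> str:
    """Deterministic token: the same input and key always yield the same
    token, so no vault is needed. The digest is mapped to 16 digits to
    keep a card-number-like shape."""
    digest = hmac.new(SECRET, pan.encode(), hashlib.sha256).hexdigest()
    return "".join(str(int(c, 16) % 10) for c in digest[:16])

t1 = vaultless_token("3872378916203675")
t2 = vaultless_token("3872378916203675")
assert t1 == t2                      # deterministic: no mapping table to store
assert len(t1) == 16 and t1.isdigit()  # format-preserving shape
```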
44
Different Tokenization Approaches
(Table comparing properties of dynamic vs. pre-generated tokens and of vault-based vs. vaultless approaches)
45
Local Speed of Fine Grained Protection Algorithms
(Chart: transactions per second*, from 100 to 10,000,000, for Format Preserving Encryption, Vaultless Data Tokenization, AES CBC Encryption Standard, and Vault-based Data Tokenization)
*: Speed will depend on the configuration
46
Descoping an E-commerce Solution
A PCI SAQ A contains 22 controls compared to more than 300 for the full PCI DSS
• Use a hosted iFrame or payments page provided by a validated service provider to capture and tokenize CHD
• Do not transmit, process or store CHD via any other acceptance channel, and utilize the payment services of the tokenization provider to process transactions
Minimize Cost of PCI Tokenization
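The descoped flow above can be sketched as follows: the merchant backend handles only the token returned by the hosted iFrame and submits it to the provider for the actual charge. The endpoint and payload shape here are hypothetical, not a real provider API:

```python
import json
import urllib.request

# Hypothetical endpoint, illustrating the descoped flow, not a real provider API.
CHARGE_URL = "https://api.example-tokenizer.com/v1/charge"

def charge_with_token(token: str, amount_cents: int) -> urllib.request.Request:
    """Build the charge request. The merchant backend handles only the token
    produced by the hosted iFrame; raw cardholder data (CHD) never enters
    this environment, which is what shrinks PCI DSS scope toward SAQ A."""
    body = json.dumps({"token": token, "amount": amount_cents}).encode()
    return urllib.request.Request(
        CHARGE_URL, data=body, headers={"Content-Type": "application/json"}
    )

req = charge_with_token("tok_8278278929902789", 1999)
# urllib.request.urlopen(req) would then submit the charge to the provider.
```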
47
Cloud can Help Mid-size Business
(Chart: the cybercriminal "sweet spot" of mid-size businesses)
Source: calnet
48
On-premises tokenization
• Limited PCI DSS scope reduction - must still maintain a
CDE with PCI data
• Higher risk – sensitive data still resident in environment
• Associated personnel and hardware costs
Cloud-Based tokenization
• Significant reduction in PCI DSS scope
• Reduced risk – sensitive data removed from the
environment
• Platform-focused security
• Lower associated costs – cyber insurance, PCI audit,
maintenance
Total Cost and Risk of Tokenization
49
On-premises, public / private clouds
50
• Verizon Data Breach Investigations Report
• Enterprises are losing ground in the fight against
persistent cyber-attacks
• We simply cannot catch the bad guys until it is too
late. This picture is not improving
• Verizon reports concluded that less than 14% of
breaches are detected by internal monitoring tools
• JP Morgan Chase data breach
• Hackers were in the bank’s network for months
undetected
• Network configuration errors are inevitable, even at the largest banks
• Capital One data breach
• A hacker gained access to 100 million credit card
applications and accounts
• The data was hosted by Amazon Web Services, the cloud hosting company that Capital One was using
Enterprises Losing Ground Against Cyber-attacks
51
Cloud and Threat Vector Inheritance
52
Cloud Data Security
(Diagram: security controls across the stack: External Network, Internal Network, Application Server, Operating System, OS File System, Database, Application Framework, Application Source Code, Application, and Data; with security separation between Public Cloud and Secure Cloud)
Source: Armor.com
53
Security Separation in Cloud
(Diagram: internal network with administrator, remote user and internal user; a cloud gateway provides data security, including encryption, tokenization or masking of fields or files, in transit and at rest; each authorized field is in the clear in the public cloud examples; security separation between Secure Cloud and Public Cloud)
Source: Armor.com
54
Thank You!
Ulf Mattsson, TokenEx
www.TokenEx.com
More Related Content

What's hot (20)

PPTX
Understanding Blockchain
Ogilvy Consulting
 
PPTX
Blockchain
PedramDehghanpour
 
PPTX
Cryptocurrency
Sarvesh Meena
 
PDF
Upvest - Asset Tokenization - A practical deep dive
Alexander Reichhardt
 
PDF
Blockchain
Venkatesh Jambulingam
 
PPTX
DOA TOKENOMICS FOR DUMMIES.pptx
Andy Martin
 
PPTX
Bitcoin
Libu Thomas
 
PPTX
Blockchain technology
hellygeorge
 
PPTX
Blockchain and Cybersecurity
gppcpa
 
PPTX
What is DeFi ? | Decentralized Finance
zaarahary
 
PPT
Introduction to Tokenization
Nabeel Yoosuf
 
PDF
Blockchain Security Issues and Challenges
Merlec Mpyana
 
PDF
Blockchain Presentation
Zied GUESMI
 
PPTX
The Blockchain and the Future of Cybersecurity
Kevin Cedeño, CISM, CISA
 
PDF
How does blockchain work
Shishir Aryal
 
PDF
Cryptocurrencies and Blockchain technology
Sabrina Kirrane
 
PPTX
Blockchain concepts
Murughan Palaniachari
 
PDF
Tokenization
Pavel Kravchenko, PhD
 
PDF
Bitcoin: The Internet of Money
winklevosscap
 
PDF
An Introduction to Blockchain Technology
Niuversity
 
Understanding Blockchain
Ogilvy Consulting
 
Blockchain
PedramDehghanpour
 
Cryptocurrency
Sarvesh Meena
 
Upvest - Asset Tokenization - A practical deep dive
Alexander Reichhardt
 
DOA TOKENOMICS FOR DUMMIES.pptx
Andy Martin
 
Bitcoin
Libu Thomas
 
Blockchain technology
hellygeorge
 
Blockchain and Cybersecurity
gppcpa
 
What is DeFi ? | Decentralized Finance
zaarahary
 
Introduction to Tokenization
Nabeel Yoosuf
 
Blockchain Security Issues and Challenges
Merlec Mpyana
 
Blockchain Presentation
Zied GUESMI
 
The Blockchain and the Future of Cybersecurity
Kevin Cedeño, CISM, CISA
 
How does blockchain work
Shishir Aryal
 
Cryptocurrencies and Blockchain technology
Sabrina Kirrane
 
Blockchain concepts
Murughan Palaniachari
 
Tokenization
Pavel Kravchenko, PhD
 
Bitcoin: The Internet of Money
winklevosscap
 
An Introduction to Blockchain Technology
Niuversity
 

Similar to What is tokenization in blockchain? (20)

PPTX
What is tokenization in blockchain?
Ulf Mattsson
 
PPTX
Safeguarding customer and financial data in analytics and machine learning
Ulf Mattsson
 
PPTX
Evolving regulations are changing the way we think about tools and technology
Ulf Mattsson
 
PPTX
Machine learning and ai in a brave new cloud world
Ulf Mattsson
 
PPTX
Protecting data privacy in analytics and machine learning ISACA London UK
Ulf Mattsson
 
PPTX
Privacy preserving computing and secure multi-party computation ISACA Atlanta
Ulf Mattsson
 
PPTX
ISSA Atlanta - Emerging application and data protection for multi cloud
Ulf Mattsson
 
PDF
Barcelona presentationv6
Mohan Venkataraman
 
PPTX
What i learned at the infosecurity isaca north america expo and conference 2019
Ulf Mattsson
 
PDF
Mastering the Dark Data Challenge - Harnessing AI for Enhanced Data Governanc...
Enterprise Knowledge
 
PDF
Blockchain for industry 4.0 HMI 2018
Mark Mueller-Eberstein
 
PPTX
ISC2 Privacy-Preserving Analytics and Secure Multiparty Computation
UlfMattsson7
 
PPTX
New technologies for data protection
Ulf Mattsson
 
PDF
How blockchain will defend iot
Hitesh Malviya
 
PPTX
Leveraging IOT and Latest Technologies
Mithileysh Sathiyanarayanan
 
PPTX
A beginner's guide to Big data
AnushkaGupta763558
 
PDF
¿Cómo puede ayudarlo Qlik a descubrir más valor en sus datos de IoT?
Data IQ Argentina
 
PPTX
Nov 2 security for blockchain and analytics ulf mattsson 2020 nov 2b
Ulf Mattsson
 
PPTX
Jun 15 privacy in the cloud at financial institutions at the object managemen...
Ulf Mattsson
 
PPTX
Secure Identity management blockchain ppt.pptx
ratheetripti
 
What is tokenization in blockchain?
Ulf Mattsson
 
Safeguarding customer and financial data in analytics and machine learning
Ulf Mattsson
 
Evolving regulations are changing the way we think about tools and technology
Ulf Mattsson
 
Machine learning and ai in a brave new cloud world
Ulf Mattsson
 
Protecting data privacy in analytics and machine learning ISACA London UK
Ulf Mattsson
 
Privacy preserving computing and secure multi-party computation ISACA Atlanta
Ulf Mattsson
 
ISSA Atlanta - Emerging application and data protection for multi cloud
Ulf Mattsson
 
Barcelona presentationv6
Mohan Venkataraman
 
What i learned at the infosecurity isaca north america expo and conference 2019
Ulf Mattsson
 
Mastering the Dark Data Challenge - Harnessing AI for Enhanced Data Governanc...
Enterprise Knowledge
 
Blockchain for industry 4.0 HMI 2018
Mark Mueller-Eberstein
 
ISC2 Privacy-Preserving Analytics and Secure Multiparty Computation
UlfMattsson7
 
New technologies for data protection
Ulf Mattsson
 
How blockchain will defend iot
Hitesh Malviya
 
Leveraging IOT and Latest Technologies
Mithileysh Sathiyanarayanan
 
A beginner's guide to Big data
AnushkaGupta763558
 
¿Cómo puede ayudarlo Qlik a descubrir más valor en sus datos de IoT?
Data IQ Argentina
 
Nov 2 security for blockchain and analytics ulf mattsson 2020 nov 2b
Ulf Mattsson
 
Jun 15 privacy in the cloud at financial institutions at the object managemen...
Ulf Mattsson
 
Secure Identity management blockchain ppt.pptx
ratheetripti
 
Ad

More from Ulf Mattsson (20)

PPTX
Jun 29 new privacy technologies for unicode and international data standards ...
Ulf Mattsson
 
PPTX
Book
Ulf Mattsson
 
PPTX
May 6 evolving international privacy regulations and cross border data tran...
Ulf Mattsson
 
PPTX
Qubit conference-new-york-2021
Ulf Mattsson
 
PDF
Secure analytics and machine learning in cloud use cases
Ulf Mattsson
 
PPTX
Evolving international privacy regulations and cross border data transfer - g...
Ulf Mattsson
 
PDF
Data encryption and tokenization for international unicode
Ulf Mattsson
 
PPTX
The future of data security and blockchain
Ulf Mattsson
 
PPTX
GDPR and evolving international privacy regulations
Ulf Mattsson
 
PPTX
New opportunities and business risks with evolving privacy regulations
Ulf Mattsson
 
PPTX
What is tokenization in blockchain - BCS London
Ulf Mattsson
 
PPTX
Protecting data privacy in analytics and machine learning - ISACA
Ulf Mattsson
 
PPTX
What is tokenization in blockchain?
Ulf Mattsson
 
PPTX
Unlock the potential of data security 2020
Ulf Mattsson
 
PPTX
What is tokenization in blockchain?
Ulf Mattsson
 
PPTX
Protecting Data Privacy in Analytics and Machine Learning
Ulf Mattsson
 
PPTX
ISACA Houston - How to de-classify data and rethink transfer of data between ...
Ulf Mattsson
 
PPTX
Isaca atlanta - practical data security and privacy
Ulf Mattsson
 
PPTX
ISACA Houston - Practical data privacy and de-identification techniques
Ulf Mattsson
 
PPTX
Jul 16 isaca london data protection, security and privacy risks - on premis...
Ulf Mattsson
 
Jun 29 new privacy technologies for unicode and international data standards ...
Ulf Mattsson
 
May 6 evolving international privacy regulations and cross border data tran...
Ulf Mattsson
 
Qubit conference-new-york-2021
Ulf Mattsson
 
Secure analytics and machine learning in cloud use cases
Ulf Mattsson
 
Evolving international privacy regulations and cross border data transfer - g...
Ulf Mattsson
 
Data encryption and tokenization for international unicode
Ulf Mattsson
 
The future of data security and blockchain
Ulf Mattsson
 
GDPR and evolving international privacy regulations
Ulf Mattsson
 
New opportunities and business risks with evolving privacy regulations
Ulf Mattsson
 
What is tokenization in blockchain - BCS London
Ulf Mattsson
 
Protecting data privacy in analytics and machine learning - ISACA
Ulf Mattsson
 
What is tokenization in blockchain?
Ulf Mattsson
 
Unlock the potential of data security 2020
Ulf Mattsson
 
What is tokenization in blockchain?
Ulf Mattsson
 
Protecting Data Privacy in Analytics and Machine Learning
Ulf Mattsson
 
ISACA Houston - How to de-classify data and rethink transfer of data between ...
Ulf Mattsson
 
Isaca atlanta - practical data security and privacy
Ulf Mattsson
 
ISACA Houston - Practical data privacy and de-identification techniques
Ulf Mattsson
 
Jul 16 isaca london data protection, security and privacy risks - on premis...
Ulf Mattsson
 
Ad

Recently uploaded (20)

PDF
TrustArc Webinar - Data Privacy Trends 2025: Mid-Year Insights & Program Stra...
TrustArc
 
PPTX
Top Managed Service Providers in Los Angeles
Captain IT
 
PPTX
Darren Mills The Migration Modernization Balancing Act: Navigating Risks and...
AWS Chicago
 
PDF
Novus Safe Lite- What is Novus Safe Lite.pdf
Novus Hi-Tech
 
PDF
Human-centred design in online workplace learning and relationship to engagem...
Tracy Tang
 
PDF
Upgrading to z_OS V2R4 Part 01 of 02.pdf
Flavio787771
 
PPTX
Building a Production-Ready Barts Health Secure Data Environment Tooling, Acc...
Barts Health
 
PDF
Rethinking Security Operations - Modern SOC.pdf
Haris Chughtai
 
PDF
CIFDAQ'S Token Spotlight for 16th July 2025 - ALGORAND
CIFDAQ
 
PDF
CloudStack GPU Integration - Rohit Yadav
ShapeBlue
 
PPTX
python advanced data structure dictionary with examples python advanced data ...
sprasanna11
 
PDF
Shuen Mei Parth Sharma Boost Productivity, Innovation and Efficiency wit...
AWS Chicago
 
PDF
Bitcoin+ Escalando sin concesiones - Parte 1
Fernando Paredes García
 
PDF
Meetup Kickoff & Welcome - Rohit Yadav, CSIUG Chairman
ShapeBlue
 
PDF
Productivity Management Software | Workstatus
Lovely Baghel
 
PDF
Trading Volume Explained by CIFDAQ- Secret Of Market Trends
CIFDAQ
 
PPTX
The Yotta x CloudStack Advantage: Scalable, India-First Cloud
ShapeBlue
 
PDF
Ampere Offers Energy-Efficient Future For AI And Cloud
ShapeBlue
 
PPTX
UI5Con 2025 - Beyond UI5 Controls with the Rise of Web Components
Wouter Lemaire
 
PDF
GITLAB-CICD_For_Professionals_KodeKloud.pdf
deepaktyagi0048
 
TrustArc Webinar - Data Privacy Trends 2025: Mid-Year Insights & Program Stra...
TrustArc
 
Top Managed Service Providers in Los Angeles
Captain IT
 
Darren Mills The Migration Modernization Balancing Act: Navigating Risks and...
AWS Chicago
 
Novus Safe Lite- What is Novus Safe Lite.pdf
Novus Hi-Tech
 
Human-centred design in online workplace learning and relationship to engagem...
Tracy Tang
 
Upgrading to z_OS V2R4 Part 01 of 02.pdf
Flavio787771
 
Building a Production-Ready Barts Health Secure Data Environment Tooling, Acc...
Barts Health
 
Rethinking Security Operations - Modern SOC.pdf
Haris Chughtai
 
CIFDAQ'S Token Spotlight for 16th July 2025 - ALGORAND
CIFDAQ
 
CloudStack GPU Integration - Rohit Yadav
ShapeBlue
 
python advanced data structure dictionary with examples python advanced data ...
sprasanna11
 
Shuen Mei Parth Sharma Boost Productivity, Innovation and Efficiency wit...
AWS Chicago
 
Bitcoin+ Escalando sin concesiones - Parte 1
Fernando Paredes García
 
Meetup Kickoff & Welcome - Rohit Yadav, CSIUG Chairman
ShapeBlue
 
Productivity Management Software | Workstatus
Lovely Baghel
 
Trading Volume Explained by CIFDAQ- Secret Of Market Trends
CIFDAQ
 
The Yotta x CloudStack Advantage: Scalable, India-First Cloud
ShapeBlue
 
Ampere Offers Energy-Efficient Future For AI And Cloud
ShapeBlue
 
UI5Con 2025 - Beyond UI5 Controls with the Rise of Web Components
Wouter Lemaire
 
GITLAB-CICD_For_Professionals_KodeKloud.pdf
deepaktyagi0048
 

What is tokenization in blockchain?

  • 1. 1 What I Learned at Gartner Summit 2019 Ulf Mattsson www.TokenEx.com What is tokenization in Blockchain?
  • 2. 2 Ulf Mattsson, BIO + Mr. Mattsson is the inventor of 73 patents in the area of Cybersecurity. + He managed joint R&D projects with research and development teams at IBM, Microsoft, Hewlett-Packard, Oracle, Teradata, and RSA Security (Dell). + Mr. Mattsson is currently the Head of Innovation at TokenEx, a cloud-based data security company, was previously Chief Technology Officer at Atlantic BT Security Solutions, and earlier Chief Technology Officer at Compliance Engineering. + He was the Chief Technology Officer and a technology founder of Protegrity. + Prior to Protegrity, Mr. Mattsson worked 20 years at IBM's Research and Development organization, in the areas of Application development, Databases and Security. + He also worked at companies providing Data Discovery Services, Cloud Application Security Brokers, Web Application Firewalls, Managed Security Service, Security Operation Center, and Cybersecurity consulting. + Mr. Mattsson is a also advising in the area of AI and Machine learning technology. + He owns and manages the BrightTALK “Cybersecurity - The No Spin Zone” and “The Blockchain Channel.”
  • 5. 5 What does Blockchain Offer? (Gartner, 2019)
  • 6. 6 Blockchain Strengths, Weaknesses, Opportunities and Threats (SWOT), Gartner
  • 7. 7 Gartner Forecast: Blockchain Business Value, Worldwide
  • 8. 8 Board-Level Opinions on Blockchain and Digital Currencies, Gartner
  • 9. 9 The Gartner Top op Strategig Technology Trends for 2019 Exploiting AI in the Development Process
  • 11. 11 11Source: IBM Blockchain Security – What keeps your transaction data safe?
  • 13. 13 13Source: IBM Blockchain is virtually impossible to hack
  • 14. 14 14Source: IBM Blockchains can be private or public
  • 15. 15 15Source: IBM Blockchain offers validation, encryption and potentially tokenization
  • 17. 17 The idea behind asset tokenization • It allows to convert the rights to assets with economic value into a digital token. • Such tokens can be stored and managed on a blockchain network. • Tokenization on Blockchain is a steady trend of 2018. • It seems that everything is being tokenized on Blockchain from paintings, diamonds and company stocks to real estate. • Let us forget about Blockchain and smart contracts for a moment. • Imagine you want to invest in real estate but your initial investment is modest — say $5,000. • Perhaps you want to start small and increase your investment gradually. • For instance you decide to invest a couple thousand every three or four months. • Obviously, with traditional real estate market this is quite awkward to do. • How are you supposed to buy two or three square meters in an apartment?
  • 18. 18 The idea behind asset tokenization • Let us reverse the situation. • Imagine that you have some property — say an apartment. • You need cash quickly. • The apartment is valued at $150,000 but you just need $10,000. • Can you do this quickly without much friction? • To my best knowledge, this is next to impossible.
  • 19. 19 Enter tokenization • Tokenization is a method that converts rights to an asset into a digital token • Suppose there is a $200,000 apartment • Tokenization can transform this apartment into 200,000 tokens (the number is totally arbitrary we could have issued 2 million tokens) • Thus, each token represents a 0.0005% share of the underlying asset • Finally, we issue the token on some sort of a platform supporting smart contracts, for example on Ethereum, so that the tokens can be freely bought and sold on different exchanges
  • 21. 21 Pseudonymisation Under the GDPR Within the text of the GDPR, there are multiple references to pseudonymisation as an appropriate mechanism for protecting personal data. Pseudonymisation—replacing identifying or sensitive data with pseudonyms, is synonymous with tokenization—replacing identifying or sensitive data with tokens. Article 4 – Definitions • (1) ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); …such as a name, an identification number, location data, an online identifier… • (5) ‘pseudonymisation’ means the processing personal data in such a manner that the data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately… What is Personal Data according to GDPR?
  • 23. 23 Example of Cross Border Data-centric Security [Diagram: data sources feeding a data warehouse in Italy, with complete policy-enforced de-identification of sensitive data across all bank entities] • Protecting Personally Identifiable Information (PII), including names, addresses, phone, email, policy and account numbers • Compliance with EU cross-border data protection laws • Utilizing data tokenization, with centralized policy, key management, auditing, and reporting
  • 24. 24 Gartner Hype Cycle for Data Security [Chart; highlighted entries include Data Classification and Blockchain for Data Security]
  • 25. 25 GDPR Security Requirements – Encryption and Tokenization • Discover Data Assets • Security by Design • Encryption and Tokenization Source: IBM
  • 26. 26 Data Minimization • Increasingly, organizations are adopting data minimization strategies for security and privacy reasons. By deleting or reducing inessential, duplicate or unused data, organizations can minimize potential attack vectors. • Unlike prior discovery tools, BigID can not only quickly report on duplicate data but also provide residency and usage detail, so minimization strategies can be based on secondary factors like jurisdiction and activity history. • BigID is transforming enterprise protection and privacy of personal data. • Organizations are facing record breaches of personal information and proliferating global privacy regulations with fines reaching 10% of annual revenue. • Today, enterprises lack dedicated, purpose-built technology to help them track and govern their customer data. • By bringing data science to data privacy, BigID aims to give enterprises the software to safeguard and steward the most important asset organizations manage: their customer data. Source: BigID (TokenEx partner)
  • 27. 27 ML-Driven Data Classification • The definition of sensitive data is no longer readily encapsulated in a regular expression. • Increasingly, companies need to classify data as sensitive based on its context relative to a person or a thing, such as a patent or an account. • This requires a new approach to classification that can identify contextually sensitive data across all modern data stores - unstructured, structured, Big Data, cloud and enterprise applications like SAP. • BigID provides a first-of-its-kind approach that combines Machine Learning and Contextual Intelligence to deliver advanced data classification, categorization, cataloging and correlation for privacy. Source: BigID (TokenEx partner)
  • 28. 28 ML-Driven Classification • Traditional pattern-matching approaches to discovery and classification still struggle with accurately identifying contextually sensitive data like Personal Information (PI) and disambiguating similar-looking information. • Moreover, the regular-expression-based classifiers that predominate in data loss prevention, database activity monitoring, and data access governance products tend to operate on a limited number of data sources, like relational databases or on-prem unstructured file shares. • BigID leverages machine learning to classify, categorize and compare data and files across structured, unstructured, semi-structured and Big Data sources, in the cloud or on-prem. • BigID can resolve similar-looking entities and build association graphs to correlate data back to a specific entity or person - essential for meeting emerging privacy use cases like personal data rights. Source: BigID (TokenEx partner)
  • 29. 29 Correlation plus Classification • Even with AI and ML classification approaches like clustering or random forests, classifiers can improve accuracy through smarter matching and comparison analysis - but they lack the context to understand who the data relates to. • This is a common problem for privacy requirements and regulated industries. The capability to build a graph of connected or relevant data can be characterized as a correlation problem. • Correlation helps an organization find sensitive data through its association with other sensitive data. • BigID provides a first-of-its-kind model that can not only match similar data within the same class based on ML analysis, but also match connected data of different classes based on relevancy and connectedness. • This correlation-based classification is critical to privacy. Source: BigID (TokenEx partner)
  • 30. 30 Cataloging plus Classification • BigID's ML-based classifiers use advanced AI techniques to match data within a class and also correlate data of different classes that have a common sensitivity level owing to a shared association. • But there is a third way sensitivity can be measured. Most data also has certain attributes associated with it, such as date of creation, last modification, ownership and access details. • Unlike traditional classifiers, BigID can also integrate metadata analysis to provide a richer view of the data and its usage. • This metadata input can be used to better and more automatically catalog data for easier discovery via search, as well as to measure sensitivity risk. • The combination of intelligent classification, correlation and cataloging gives organizations the unique ability to find, inventory and map sensitive data by dimensions beyond just data class or category. • These include finding data by person, residency, application and ownership. Source: BigID (TokenEx partner)
  • 31. 31 Intelligent Labeling and Tagging • Enforcement of security protection and privacy compliance requires knowledge of data risk and sensitivity. • BigID helps organizations understand data sensitivity through advanced ML-based classification, correlation and cataloging to provide a complete view of data. • To simplify enforcement on classified data, BigID enables customers to automatically assign data tags to files and objects. • These classification tags can be consumed through Microsoft's Azure Information Protection framework as policy labels, through BigID's labeling APIs, or through additional frameworks like Box. • Using these labels, organizations can classify or categorize data - for example as Highly Sensitive, or as Personal Data under privacy, health or financial services compliance mandates. • These tags can then be used for more granular policy enforcement actions by DLP, information rights management, database activity monitoring or other enforcement products. Source: BigID (TokenEx partner)
  • 33. 33 Main Purpose of Tokens in Digital Business Ecosystems (Value Proposition) While a large proportion of new token use cases focuses on monetary value representation enabled by blockchain technology, tokenization will achieve its real potential with value creation. An example of such value creation is enabling the design of new markets for data assets, autonomous organizations and labor.
  • 36. 36 Encryption vs Tokenization What is the difference? • Encryption - a data security measure using mathematical algorithms to generate rule-based values in place of original data • Tokenization - a data security measure using mathematical algorithms to generate randomized values in place of original data Encryption alone is not a full solution • With encryption, sensitive data remains in business systems. With tokenization, sensitive data is removed completely from business systems and securely vaulted. Tokens are versatile • Format-preserving tokens can be utilized where masked information is required
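A toy sketch of the vaulted-token side of this contrast (not a production scheme; the `vault` dictionary stands in for a hardened token vault): the token preserves the PAN's length and character class but has no mathematical relationship to it, so systems that hold only tokens hold no card data:

```python
# Toy sketch: random, format-preserving tokenization with a vault.
import secrets

vault = {}  # token -> original PAN; lives in a secured vault, not in business systems

def tokenize(pan: str) -> str:
    """Return a format-preserving random token for a digit string:
    same length, all digits, no mathematical link to the input."""
    token = "".join(secrets.choice("0123456789") for _ in pan)
    vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Recover the original value; possible only with vault access."""
    return vault[token]

token = tokenize("3678228939073378")
# Business systems store and pass around only `token`.
```

By contrast, an encrypted PAN is always recomputable from the ciphertext plus the key, which is why encryption alone leaves the sensitive data logically present in the environment.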
  • 37. 37 Examples of Protected Data
  Field | Real Data | Tokenized / Pseudonymized
  Name | Joe Smith | csu wusoj
  Address | 100 Main Street, Pleasantville, CA | 476 srta coetse, cysieondusbak, CA
  Date of Birth | 12/25/1966 | 01/02/1966
  Telephone | 760-278-3389 | 760-389-2289
  E-Mail Address | [email protected] | [email protected]
  SSN | 076-39-2778 | 076-28-3390
  CC Number | 3678 2289 3907 3378 | 3846 2290 3371 3378
  Business URL | www.surferdude.com | www.sheyinctao.com
  Fingerprint | Encrypted
  Photo | Encrypted
  X-Ray | Encrypted
  Applies to healthcare and financial services data: Dr. visits, prescriptions, hospital stays and discharges, clinical and billing data, consumer products and activities. Protection methods can equally be applied to the actual data, but this is not needed with de-identification.
  • 38. 38 How Should I Secure Different Types of Data? [Diagram: type of data (structured to un-structured) vs. use case (simple to complex)] • Structured Card Holder Data (PCI): tokenization of fields • Un-structured Protected Health Information (PHI) and Personally Identifiable Information (PII): encryption of files
  • 40. 40 User Productivity, Creativity and Data [Chart: access to data (low to high) vs. user productivity (low to high)]
  • 41. 41 Risk Adjusted Data Security – Tokenized Data [Chart: access to tokenized data (low to high) vs. risk exposure and user productivity/creativity]
  • 42. 42 Data Security Approaches [Chart: data utility vs. data protection, ranging from max utility/min protection to min utility/max protection] • Minimization • Devaluation/Pseudonymisation • Data Hashing/Masking • Encryption Source: TokenEx
  • 43. 43 Reduction of Pain with Different Protection Techniques [Timeline: 1970–2010, pain & TCO decreasing from high to low] • Input value: 3872 3789 1620 3675 • Strong encryption (AES, 3DES) output: !@#$%a^.,mhu7///&*B()_+!@ • Format-preserving encryption (DTP, FPE) output: 8278 2789 2990 2789 • Vault-based tokenization output: 8278 2789 2990 2789 (format preserving, greatly reduced key management) • Vaultless tokenization output: 8278 2789 2990 2789 (format preserving, no vault)
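To make the vaultless idea concrete, here is a toy, key-seeded substitution sketch (emphatically not a production algorithm such as NIST FF1 format-preserving encryption): the token is derived from the input and a secret, so detokenization needs no stored vault at all:

```python
# Toy sketch of vaultless, format-preserving tokenization:
# per-position digit permutations derived from a secret seed.
import random

SECRET_SEED = 42  # stands in for a protected key or secret lookup table

def _digit_tables(length: int):
    """Derive one secret digit permutation per position from the seed."""
    rng = random.Random(SECRET_SEED)
    tables = []
    for _ in range(length):
        digits = list("0123456789")
        rng.shuffle(digits)
        tables.append(digits)
    return tables

def vaultless_tokenize(pan: str) -> str:
    tables = _digit_tables(len(pan))
    return "".join(tables[i][int(d)] for i, d in enumerate(pan))

def vaultless_detokenize(token: str) -> str:
    tables = _digit_tables(len(token))
    return "".join(str(tables[i].index(d)) for i, d in enumerate(token))

token = vaultless_tokenize("3872378916203675")
original = vaultless_detokenize(token)  # round-trips with no vault lookup
```

This is why the slide shows vaultless tokenization at the low end of the pain/TCO curve: there is no token database to store, replicate or synchronize, only a secret to protect.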
  • 44. 44 Different Tokenization Approaches [Table comparing properties of dynamic vs. pre-generated token generation and vault-based vs. vaultless architectures]
  • 45. 45 Local Speed of Fine-Grained Protection Algorithms [Chart: transactions per second*, from 100 to 10,000,000, comparing format-preserving encryption, vaultless data tokenization, AES CBC standard encryption, and vault-based data tokenization] *: Speed will depend on the configuration
  • 46. 46 Minimize Cost of PCI Tokenization: Descoping an E-commerce Solution • A PCI SAQ A contains 22 controls, compared to more than 300 for the full PCI DSS • Use a hosted iFrame or payments page provided by a validated service provider to capture and tokenize CHD • Do not transmit, process or store CHD via any other acceptance channel, and utilize the payment services of the tokenization provider to process transactions
  • 47. 47 Cloud can Help Mid-size Business [Chart: the cybercriminal sweet spot] Source: calnet
  • 48. 48 Total Cost and Risk of Tokenization On-premises tokenization • Limited PCI DSS scope reduction - must still maintain a CDE with PCI data • Higher risk - sensitive data still resident in the environment • Associated personnel and hardware costs Cloud-based tokenization • Significant reduction in PCI DSS scope • Reduced risk - sensitive data removed from the environment • Platform-focused security • Lower associated costs - cyber insurance, PCI audit, maintenance
  • 50. 50 Enterprises Losing Ground Against Cyber-attacks • Verizon Data Breach Investigations Report: enterprises are losing ground in the fight against persistent cyber-attacks; we simply cannot catch the bad guys until it is too late, and this picture is not improving • Verizon's reports concluded that less than 14% of breaches are detected by internal monitoring tools • JP Morgan Chase data breach: hackers were in the bank's network for months undetected; network configuration errors are inevitable, even at the largest banks • Capital One data breach: a hacker gained access to 100 million credit card applications and accounts hosted on Amazon Web Services, the cloud provider Capital One was using
  • 51. 51 Cloud and Threat Vector Inheritance
  • 52. 52 Cloud Data Security [Stack diagram: external network, internal network and application server; application framework, source code and data; database; OS file system; operating system security controls] • Public cloud vs. secure cloud, with security separation Source: Armor.com
  • 53. 53 Security Separation in Cloud [Diagram] • Internal network with administrator, remote user and internal user • Public cloud examples: each authorized field is in the clear • A cloud gateway provides data security, including encryption, tokenization or masking of fields or files (in transit and at rest) • Secure cloud with security separation Source: Armor.com
  • 54. 54 Thank You! Ulf Mattsson, TokenEx www.TokenEx.com