INFOSEC FRAMEWORKS FOR
MISINFORMATION
SARA “SJ” TERP AND PABLO BREUER
CANSECWEST 2019
TALK OBJECTIVES
• Describe the problem
• Establish a common language
• Introduce a framework
• Talk about what we can do with the framework
Describing the Problem
Misinformation
SOCIAL ENGINEERING AT SCALE
Facebook Group Shares Interactions
Blacktivists 103,767,792 6,182,835
Txrebels 102,950,151 3,453,143
MuslimAmerica 71,355,895 2,128,875
Patriototus 51,139,860 4,438,745
Secured.Borders 5,600,136 1,592,771
Lgbtun 5,187,494 1,262,386
INTENT TO DECEIVE
Force an adversary to make a decision or take an action based on information that I:
• Hide
• Give
• Change (or change the context on)
• Deny/degrade
• Destroy
Enable my decisions based upon knowing yours
“Operations to convey selected information and indicators to audiences to
influence their emotions, motives, and objective reasoning, and ultimately the
behavior of governments, organizations, groups, and individuals”
ARTEFACTS
Describing the problem
Why misinformation is different now
INSTRUMENTS OF NATIONAL POWER
Resources available in pursuit of national objectives…
Diplomatic | Informational | Military | Economic
…and how to influence other nation-states.
NATION-STATE MISINFORMATION
From → To
Brazil → Brazil
China → China, Taiwan, US
Iran → India, Pakistan
Russia → Armenia, France, Germany, Netherlands, Philippines, Serbia, UK, USA, Ukraine, World
Saudi Arabia → Qatar
Unknown → France, Germany, USA
MISINFORMATION STRATEGIES
Distort
Distract
Divide
Dismay
Dismiss
WHAT’S DIFFERENT NOW?
OTHER ACTORS AND THEIR MOTIVATIONS
• State and non-state actors
• Entrepreneurs
• Grassroots groups
• Private influencers
RESPONSE: NOT JUST ADMIRING THE PROBLEM
MISINFORMATION PYRAMID
MISINFOSEC:
MISINFORMATION +
INFOSEC
All cyberspace operations are
based on influence.
- Pablo Breuer
MISINFORMATION VIEWED AS…
• Information security (Gordon, Grugq, Rogers)
• Information operations / influence operations (Lin)
• A form of conflict (Singer, Gerasimov)
• [A social problem]
• [News source pollution]
ATTACK. DEFEND. NETWORKS. LOOKED FAMILIAR.
MAYBE THERE WERE THINGS WE COULD USE
ADDING MISINFORMATION TO INFOSEC
“Prevention of damage to, protection of, and restoration of computers,
electronic communications systems, electronic communications services, wire
communication, and electronic communication, including information contained
therein, to ensure its availability, integrity, authentication, confidentiality, and
nonrepudiation” - NSPD-54
INFOSEC ALREADY INCLUDES COGNITIVE
PSYOPS AND INFOSEC AREN’T JOINED UP
Information
Operations
PSYOPS
Computer
Network
Operations
INFOSEC SUPPORT TO MISINFORMATION TRACKING
THERE’S NO COMMON LANGUAGE
“We use misinformation attack (and misinformation campaign) to refer to the
deliberate promotion of false, misleading or mis-attributed information. Whilst
these attacks occur in many venues (print, radio, etc), we focus on the creation,
propagation and consumption of misinformation online. We are especially
interested in misinformation designed to change beliefs in a large number of
people.”
MISINFOSEC COMMUNITIES
● Industry
● Academia
● Media
● Community
● Government
FIRST OUTPUT: MISINFOSEC FRAMEWORK STANDARDS
FRAMEWORKS
Underpinning
misinformation
STAGE-BASED MODELS ARE USEFUL
RECON | WEAPONIZE | DELIVER | EXPLOIT | CONTROL | EXECUTE | MAINTAIN (Lockheed Martin Cyber Kill Chain)
Persistence | Privilege Escalation | Defense Evasion | Credential Access | Discovery | Lateral Movement | Execution | Collection | Exfiltration | Command and Control (MITRE ATT&CK tactics)
WE CHOSE THE ATT&CK FRAMEWORK
AND STARTED MAPPING MISINFORMATION ONTO IT
Initial Access: Account takeover, Create fake group, Deep cover
Create Artefacts: Steal existing artefacts, Deepfake
Insert Theme: Create fake emergency, Create fake argument
Amplify Message: Repeat messaging with bots, Parody account
Command and Control: Create fake real-life events, Buy friends
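The stage-to-technique mapping above can be sketched as a simple lookup table. The groupings here are illustrative (reconstructed from the slide, not an official mapping), but they show how a framework turns a named technique into its place in the kill chain:

```python
# Illustrative misinformation kill chain as a lookup table.
# Stage/technique groupings follow the draft mapping above, not a standard.
MISINFO_KILL_CHAIN = {
    "Initial Access": ["Account takeover", "Create fake group", "Deep cover"],
    "Create Artefacts": ["Steal existing artefacts", "Deepfake"],
    "Insert Theme": ["Create fake emergency", "Create fake argument"],
    "Amplify Message": ["Repeat messaging with bots", "Parody account"],
    "Command and Control": ["Create fake real-life events", "Buy friends"],
}

def stage_of(technique):
    """Return the kill-chain stage a technique belongs to, or None if unknown."""
    for stage, techniques in MISINFO_KILL_CHAIN.items():
        if technique in techniques:
            return stage
    return None

print(stage_of("Deepfake"))  # Create Artefacts
```

Once incidents are tagged this way, defenders can compare campaigns stage by stage instead of artefact by artefact.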
POPULATING THE FRAMEWORK
• Campaigns
• e.g. Internet Research Agency, 2016 US elections
• Incidents
• e.g. Columbian Chemicals
• Failed attempts
• e.g. Russia - France campaigns
HISTORICAL CATALOG
HISTORICAL CATALOG: DATASHEET
• Summary: Early Russian (IRA) “fake news”
stories. Completely fabricated; very short lifespan.
• Actor: probably IRA (source: Recorded Future)
• Timeframe: Sept 11 2014 (1 day)
• Presumed goals: test deployment
• Artefacts: text messages, images, video
• Related attacks: These were all well-produced
fake news stories, promoted on Twitter to
influencers through a single dominant hashtag --
#BPoilspilltsunami, #shockingmurderinatlanta,
• Method:
1. Create messages. e.g. “A powerful explosion heard from
miles away happened at a chemical plant in Centerville,
Louisiana #ColumbianChemicals”
2. Post messages from fake Twitter accounts; include handles of local and global influencers (journalists, media, politicians, e.g. @senjeffmerkley)
3. Amplify by repeating messages on Twitter via the same fake accounts
• Result: limited traction
• Counters: None seen. Fake stories were debunked very
quickly.
FEEDS INTO TECHNIQUES LIST
• Behavior: two groups meeting in same place at
same time
• Intended effect: IRL tension / conflict
• Requirements: access to groups, group trust
• Detection:
• Handling:
• Examples:
Title
Description
Short_Description
Intended_Effect
Behavior
Resources
Victim_Targeting
Exploit_Targets
Related_TTPs
Kill_Chain_Phases
Information_Source
Kill_Chains
Handling
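A technique entry like the one above can be captured as a structured record using the STIX-style field names listed on the slide. A minimal sketch (the example values are hypothetical, filled in from the "two groups meeting" behaviour described above):

```python
from dataclasses import dataclass, field

# Sketch of a technique record using STIX-style TTP fields.
# Field choices mirror the slide; values are illustrative.
@dataclass
class MisinfoTechnique:
    title: str
    description: str
    intended_effect: str
    behavior: str
    resources: list = field(default_factory=list)
    victim_targeting: str = ""
    kill_chain_phases: list = field(default_factory=list)
    related_ttps: list = field(default_factory=list)

meet_in_place = MisinfoTechnique(
    title="Create fake real-life events",
    description="Two groups meeting in the same place at the same time",
    intended_effect="IRL tension / conflict",
    behavior="Schedule opposing rallies via controlled groups",
    resources=["access to groups", "group trust"],
    kill_chain_phases=["Command and Control"],
)
print(meet_in_place.title)
```

Using shared field names is what lets these records flow into existing threat-intelligence tooling.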
THIS IS WHAT A FINISHED FRAMEWORK LOOKS LIKE
FINDING
TECHNIQUES
Tracking incidents and
artefacts
INCIDENT ANALYSIS
Top-down (strategic): info ops
❏ What are misinformation creators
likely to do? What, where, when,
how, who, why?
❏ What do we expect to see?
❏ What responses and impediments
to responses were there?
Bottom-up (tactical): data science
❏ Unusual hashtag, trend, topic, platform activity?
❏ Content from ‘known’ trollbots, 8/4chan, r/thedonald, RussiaToday etc
❏ What are trackers getting excited about today?
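"Unusual hashtag activity" can be detected with nothing fancier than a z-score over hourly counts. A minimal sketch, assuming you already have per-hashtag hourly counts (the data and threshold here are illustrative):

```python
from statistics import mean, stdev

def spike_hashtags(hourly_counts, z_threshold=3.0):
    """Flag hashtags whose latest hourly count is far above their own history.

    hourly_counts: {hashtag: [count_hour0, count_hour1, ..., count_latest]}
    """
    flagged = []
    for tag, counts in hourly_counts.items():
        history, latest = counts[:-1], counts[-1]
        if len(history) < 2:
            continue  # not enough history to judge
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            sigma = 1.0  # flat history: avoid division by zero
        if (latest - mu) / sigma > z_threshold:
            flagged.append(tag)
    return flagged

counts = {
    "#localnews": [4, 5, 6, 5, 4, 6],            # normal chatter
    "#ColumbianChemicals": [0, 1, 0, 1, 0, 250],  # sudden coordinated burst
}
print(spike_hashtags(counts))  # ['#ColumbianChemicals']
```

Real deployments would add per-platform baselines and seasonality, but the shape of the tactical signal is the same.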
Top-down analysis
Means of implementing influence strategies
STRATEGIES
Distort
Distract
Divide
Dismay
Dismiss
DISTORTION TECHNIQUES
• Distort facts: match intended outcome
• Exaggerate: rhetoric & misrepresent facts
• Generate: realistic false artifacts
• Mismatch: links, images, and claims to
change context of information
DISTRACTION TECHNIQUES
• String along: respond to anyone who engages to
waste time
• Play dumb: pretend to be naive, gullible, stupid
• Redirect: draw engagement to your thread
• Dilute: add other accounts to dilute threads
• Threadjack: change narrative in existing thread
DIVISION TECHNIQUES
• Provoke: create conflicts and confusion among community
members
• Dehumanize: demean and denigrate target group
• Hate speech: attack protected characteristics or classes
• Play victim: claim victim status
• Dog-whistle: use coded language to indicate insider status
• Hit and run: attack and delete after short time interval
• Call to arms: make open calls for action
DISMAY TECHNIQUES
• Ad hominem: make personal attacks, insults
& accusations
• Assign threats: name and personalize enemy
• Good old-fashioned tradecraft
DISMISSAL TECHNIQUES
• Last word: respond to hostile commenters
then block them so they can’t reply
• Brigading: coordinate mass attacks or
reporting of targeted accounts or tweets
• Shit list: add target account(s) to insultingly
named list(s)
Bottom-up analysis
Collecting Artefacts to find incidents
MISINFORMATION PYRAMID
RESOURCES
Trollbot lists:
• https://botsentinel.com/
Tools:
• APIs / python libraries / Pandas
• https://github.com/IHJpc2V1cCAK/socint
• https://labsblog.f-secure.com/2018/02/16/searching-twitter-with-twarc/
Existing datasets
• https://github.com/bodacea/misinfolinks
ARTEFACTS: ACCOUNTS
ARTEFACTS: IMAGES
ARTEFACTS: TEXT (WORDS, HASHTAGS, URLS ETC)
ARTEFACTS: DOMAINS
MOVING UP: CONTENT AND CONTEXT ANALYSIS
• Metadata analysis
• Social network analysis
• Text analysis (frequency, sentiment
etc)
• Time series analysis
• Visual inspection (Bokeh, Gephi etc)
• Correlation
• Models, e.g. clustering and
classification
• Narrative analysis
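The simplest of these, frequency analysis, needs only the standard library. A sketch over a toy corpus (the posts are invented for illustration):

```python
# Minimal text-frequency analysis over a corpus of post texts,
# keeping hashtags and mentions as distinct tokens.
from collections import Counter
import re

def term_frequencies(posts, top_n=3):
    """Return the top_n most common tokens across all posts."""
    words = []
    for text in posts:
        words += re.findall(r"[#@]?\w+", text.lower())
    return Counter(words).most_common(top_n)

posts = [
    "A powerful explosion at a chemical plant #ColumbianChemicals",
    "Explosion heard for miles #ColumbianChemicals",
    "Anyone confirm the explosion? #ColumbianChemicals",
]
print(term_frequencies(posts))
```

Identical high-frequency phrasing across many "independent" accounts is itself a coordination signal, which is why this feeds the behaviour analysis below.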
ANALYSIS: BEHAVIOURS
ANALYSIS: RELATIONSHIPS
EXPERT TRACKERS
@katestarbird #digitalsherlocks @josh_emerson
@conspirator0 @r0zetta @fs0c131y
WHY BUILD
FRAMEWORKS?
… what do we do with
them?
COMPONENTWISE UNDERSTANDING AND RESPONSE
• Lingua Franca across communities
• Defend/countermove against reused techniques, identify gaps in attacks
• Assess defence tools & techniques
• Plan for large-scale adaptive threats (hello, Machine Learning!)
• Build an alert structure (e.g. ISAC, US-CERT, Interpol)
WE NEED TO DESIGN AND SHARE RESPONSES
WE NEED TO BUILD COMMUNITIES
● Industry
● Academia
● Media
● Community
● Government
WE NEED INTELLIGENCE SHARING AND COORDINATION
WE NEED FRAMEWORKS
SPECIAL THANK YOUS
THANK YOU
Sara “SJ” Terp
Bodacea Light Industries
sarajterp@gmail.com
@bodaceacat
CDR Pablo C. Breuer
U.S. Special Operations Command / SOFWERX
Pablo.Breuer@sofwerx.org
@Ngree_H0bit
Community
• Parody-based counter-campaigns (e.g. riffs on “Q”)
• SEO-hack misinformation sites
• Dogpile onto misinformation hashtags
• Divert followers (typosquat trolls, spoof messaging etc)
• Identify and engage with affected individuals
• Educate, verify, bring into the light
Offense: What’s Next
• Algorithms + humans attack algorithms + humans
• Shift from trolls to ‘nudging’ existing human communities
(‘useful idiots’)
• Subtle attacks, e.g. ‘low-and-slows’, ‘pop-ups’, etc.
• Massively multi-channel attacks
• More commercial targets
• A well-established part of hybrid warfare
Defence: What’s Next
• Strategic and tactical collaboration
• Trusted third-party sharing on fake news sites / botnets
• Misinformation version of ATT&CK, SANS20 frameworks
• Algorithms + humans counter algorithms + humans
• Thinking the unthinkable
• “Countermeasures and self-defense actions”
Non-state Misinformation
Indexing, not Censorship