Games User Research
               Ben Lewis-Evans
An Introduction to Games User Research Methods
>Introduction
  Games User Research
   - Fun & Awareness raising
     - Emotion
     - Awareness
  Methods
   - Production
   - Post-Production
>Games User Research
  Game testing traditionally
   done by the QA/Test
   department
   - QA/Test are (usually)
     experts at gaming
     - The audience may not be
   - QA/Test have an
     investment in the game
     - The audience does not
   - Mainly looking for bugs
>Games User Research
  About the user
   experience & fun
  What do you want to
   know?
   - Is the game fun?
   - Does it raise
     awareness?
>Fun
  What is fun?
  Well…
   - Easy to use
   - Challenging
   - Emotional impact
   - Engaging
   - Compelling
   - Relaxing
  It is subjective!
>Emotions & Feelings
  [Circumplex of affect: vertical axis from Low Activation to High Activation,
   horizontal axis from Unpleasant to Pleasant. Scared = unpleasant/high activation,
   Excited = pleasant/high activation, Bored = unpleasant/low activation,
   Relaxed = pleasant/low activation]
>Emotions & Feelings
  [Same circumplex, annotated for fun: the Excited (pleasant/high activation) quadrant is
   usually fun, the Scared (unpleasant/high activation) quadrant is sometimes fun, and the
   Bored (unpleasant/low activation) quadrant is almost never fun]
>Awareness Raising
  What is this?
>Awareness Raising
>In production testing (aka Playtesting)
  General points:
   - Get representative
     users (kids, 10-14)
   - Make it clear that the
     game is being tested,
     NOT the user
   - Work out what you
     want to know before
     you test
>In production testing (aka Playtesting)
  General points cont.:
   - Test as early as possible, it
     is easier to fix problems
     that way (then test again)
   - Listen to problems, but not
     necessarily solutions
   - Not for balance or bugs,
     but for fun! (& raised
     awareness)
>Methods
  Focus Groups
  Heuristic Evaluation
  Questionnaires, Surveys and
   Interviews
  Observational studies
  Gameplay metrics
  Biometrics/psychophysiology
  Think out loud
>Focus groups
  6-10 people
  Led by a facilitator
   - Specific questions
  Try the game/discuss
   potential ideas
  Talk about it
>Focus Groups
  Pros
   - More people can = more feedback
   - Gets everyone together in one place
   - Follow up questions
   - Good for discussing concepts
  Cons
   - You need a good facilitator
   - Strong voices may take over
   - Too many “helpful” suggestions
   - What people say is not what they do
>Heuristic Evaluation
  Expert evaluation (somewhat like a game
   review)
>Heuristic Evaluation
  List of Heuristics:

   • Are clear goals provided?
   • Are players' rewards meaningful?
   • Does the player feel in control?
   • Is the game balanced?
   • Is the first playthrough and first impression good?
   • Is there a good story?
   • Does the game continue to progress well?
   • Is the game consistent and responsive?
   • Is it clear why a player failed?
   • Are there variable difficulty levels?
   • Is the game and the outcome fair?
   • Is the game replayable?
   • Is the AI visible, consistent, yet somewhat unpredictable?
   • Is the game too frustrating?
   • Is the learning curve too steep or too long?
   • Emotional impact?
   • Not too much boring repetition?
   • Can players recognise important elements on screen?
   • etc…

                                                       From Christina et al., 2009
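For illustration only (not part of the original deck): one way an expert's pass over such a checklist could be recorded. The heuristic wording is taken from the list above; the 1-5 rating scale, the note fields, and the function names are assumptions.

```python
# Hypothetical sketch: recording one expert's heuristic review as a simple checklist.
# The heuristic wording comes from the slide above; the 1-5 rating scale is an assumption.

HEURISTICS = [
    "Are clear goals provided?",
    "Are players' rewards meaningful?",
    "Does the player feel in control?",
    "Is it clear why a player failed?",
]

def collect_review(ratings, notes):
    """Pair each heuristic with a 1-5 rating and a free-text note from the expert."""
    return [
        {"heuristic": h, "rating": r, "note": n}
        for h, r, n in zip(HEURISTICS, ratings, notes)
    ]

review = collect_review(
    ratings=[4, 2, 5, 1],
    notes=["", "Rewards feel arbitrary", "", "Deaths feel unexplained"],
)
# Surface the weakest-scoring heuristics first for the design team.
for item in sorted(review, key=lambda r: r["rating"]):
    print(f'{item["rating"]}/5  {item["heuristic"]}  {item["note"]}')
```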
>Heuristic Evaluation
  Pros
   - Smaller numbers
   - Experts are experts
  Cons
   - You need experts
   - Which heuristics to
     pick?
   - Experts are experts
>Questionnaires, Surveys & Interviews
  During gameplay (at or after
   set moments)
  After gameplay
  Ask specifically for what
   interests you (don’t forget
   about the raising awareness
   part)
   - But also allow for some
     open ended answers
>Questionnaires, Surveys & Interviews
  Some pre-existing questionnaires:
   - Game Experience Questionnaire (GEQ)
      - http://www.gamexplab.nl/
   - The Computer System Usability
     Questionnaire (if modified)
      - http://oldwww.acm.org/perlman/question.cg

    If you use them, modify them to fit your
     particular game & what you want to know
>Questionnaires, Surveys & Interviews
  For Emotion – The Affect Grid
  [Affect Grid image]
                                   Russell, Weiss & Mendelsohn, 1989
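For context (not on the slide itself): the Affect Grid is a single-item 9x9 grid with pleasure-displeasure on one axis and arousal-sleepiness on the other, and respondents mark one cell. A minimal sketch of recording such a response, with assumed field names:

```python
# Hypothetical sketch: storing one Affect Grid response (after Russell, Weiss & Mendelsohn, 1989).
# The grid is 9x9; the attribute names and value conventions here are assumptions.

from dataclasses import dataclass

@dataclass
class AffectGridResponse:
    pleasure: int  # 1 (extremely unpleasant) .. 9 (extremely pleasant)
    arousal: int   # 1 (sleepiness) .. 9 (high arousal)

    def __post_init__(self):
        if not (1 <= self.pleasure <= 9 and 1 <= self.arousal <= 9):
            raise ValueError("Affect Grid responses must fall on the 9x9 grid")

# Example: a player marks a cell in the excited quadrant after an intense level.
after_level = AffectGridResponse(pleasure=8, arousal=7)
print(after_level)
```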
>Questionnaires design
  Order of questions
  Use clear, concise, everyday language
   - Avoid jargon
   - Don’t be vague
  Avoid asking duplicate questions
   - The same question in a different way
  Avoid questions that are phrased negatively
    - Bad: “I don’t like the jumping” agree -> disagree
    - Better: “I like the jumping” agree -> disagree
>Examples of bad questions
  Leading



  ”Now that you have had fun playing our
  game, which was your favorite level?”
                 Level 1

                 Level 2

                 Level 3

                 Level 4
>Examples of bad questions
  Double-barrelled:



  "Should there be a reform of our justice
  system placing greater emphasis on the
  needs of victims, providing restitution and
  compensation for them and imposing
  minimum sentences and hard labour for all
  serious violent offences?”

          Yes                    No
>Examples of bad questions
       Loaded, ill-defined and misleading:



       "Should a smack as part of good parental
        correction be a criminal offence in New
        Zealand?"
                    Yes                            No

[The question] "could have been written by Dr Seuss – this isn't Green
Eggs and Ham, this is yes means no and no means yes, but we're all
meant to understand what the referendum means. I think it's ridiculous
myself.” - PM John Key
>Questionnaires design
  Type of question
   - Open
   - Closed
>Questionnaires design
  Type of question
   - Open
     - “What did you like about level 1?”
     - “Why might a child not be able to come to
       school?”
     - “How old are you?”
   - Closed
>Questionnaires design
  Closed questions
   - Dichotomous scale
     - Yes/No
>Questionnaires design
  Closed questions:
   - Dichotomous scale
   - Continuous scale
>Questionnaires design
  Closed questions:
   - Dichotomous scale
   - Continuous scale
   - Interval scale
>Questionnaires design
  Interval scale

  - Numeric/categorical:
              1   2   3    4    5

  - Likert:
              1   2   3    4    5
       Disagree                Agree

  - Semantic:
              1   2   3    4    5
         Poor                  Good
>Questionnaires design
  Interval scale

  - Unipolar:
           1     2   3   4    5
      Not Exciting       Very Exciting

  - Bipolar:
           1     2   3   4    5
      Very Boring        Very Exciting
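A minimal sketch (the item wording and the decision to reverse-score a negatively keyed item are illustrative assumptions) of how 5-point interval-scale answers are typically coded and summarised:

```python
# Hypothetical sketch: coding 5-point interval-scale (Likert-style) answers.
# Item wording and the reverse-scoring choice are illustrative assumptions.

from statistics import mean

SCALE_POINTS = 5

def reverse_score(value, points=SCALE_POINTS):
    """Flip a negatively keyed item so that higher always means 'more positive'."""
    return points + 1 - value

# One participant's answers, keyed by item (1 = disagree .. 5 = agree).
responses = {
    "The jumping felt responsive": 4,
    "I often felt bored": 2,          # negatively keyed item
}

scored = [
    responses["The jumping felt responsive"],
    reverse_score(responses["I often felt bored"]),
]
print(f"Mean item score: {mean(scored):.1f} / {SCALE_POINTS}")
```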
>Questionnaires, Surveys & Interviews
  Questionnaires & Surveys
   - Pros
     - Consistent
     - Quantifiable
     - Fast
     - Good for testing raised awareness
   - Cons
     - Can lack follow up
     - Not objective
     - Need a large(ish) sample
>Questionnaires, Surveys & Interviews
  Interviews
   - Pros
     - Rich data
     - Can follow up
   - Cons
     - Less quantifiable
     - Time consuming
     - Not objective
>Observation studies
  Watch/Record through
   video
  Either with a facilitator
   or without
   - Facilitator must be as
     hands off as possible
  Watch faces/body for
   emotion
  Only write what you
   actually see!
>Observation studies
  Pros
   - Objective data
     - i.e. You see what players actually DO,
       not what they say they would do
   - Facilitator can help if really needed
  Cons
   - Time consuming to analyse video
   - Training required to get the best out of
     observation (especially for emotion)
     - Avoid Observer Bias
>Gameplay Metrics
  Observation via
   the game data
   - Number of
     incidents
   - Where and when
     they occurred
      and with whom or
      what?
  [Heatmap of Kills image]
  [Heatmap of Deaths image]
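As a rough sketch (not the deck's own tooling): gameplay events can be logged with a type, position, time, and actor, then binned into a grid to produce heatmaps like those pictured. The event fields, the 16x16 grid, and the level size below are assumptions.

```python
# Hypothetical sketch: logging gameplay incidents and binning them into a 2D heatmap.
# Event fields, the 16x16 grid, and the 100-unit level size are illustrative assumptions.

from collections import Counter

GRID = 16  # heatmap resolution (cells per axis)

def log_event(events, kind, x, y, timestamp, actor):
    """Append one incident: what happened, where, when, and with whom."""
    events.append({"kind": kind, "x": x, "y": y, "t": timestamp, "actor": actor})

def heatmap(events, kind, level_size=100.0):
    """Count events of one kind per grid cell."""
    counts = Counter()
    for e in events:
        if e["kind"] == kind:
            cell = (int(e["x"] / level_size * GRID), int(e["y"] / level_size * GRID))
            counts[cell] += 1
    return counts

events = []
log_event(events, "death", x=12.0, y=40.5, timestamp=31.2, actor="player_1")
log_event(events, "kill",  x=55.0, y=60.0, timestamp=45.8, actor="player_1")
print(heatmap(events, "death"))   # e.g. Counter({(1, 6): 1})
```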
>Gameplay Metrics
  Pros
   - Objective data
   - See trends
  Cons
   - Time consuming
   - No subjective feedback/context
   - Needs larger sample sizes
   - Data overload
  [Slide images: Deaths and Kills heatmaps]
>Biometrics/psychophysiology
  Measuring body signals:
   - From the Brain (EEG), the
     Heart (EKG), the muscles
     (EMG), the eyes
     (eyetracking), the skin
     (EDA), etc
  The body gives clues into
   cognition, and emotion
>EDA Game Research
  Dying is fun?
  Ravaja et al (2008)
   - EDA increased for
     - Opponent Killed
     - Player Killed
   - EMG (Zygomatic
     and Orbicularis)
     - Increased for
       longer when
       player killed
>Biometrics/psychophysiology
  Pros
   - Gives objective quantifiable data
   - Allows for continuous data
     recording
  Cons
   - Invasive
   - Costs a lot of time & money to use
     & analyse
   - Problems with specificity, artefacts,
     inference and validity
>Think out loud
  An expansion of observation
  Players play the game and
   talk about what they are
   thinking as they go along
   - Observe gameplay, and
     note down what they say &
     when they say it
     - Do NOT prompt them, and
       do NOT correct them
>Think out loud
  Pros
   - Gain an idea what players
     are thinking & feeling
   - May give unexpected
     insight
   - Could test raised
     awareness too
  Cons
   - Unnatural
   - Subjective
>In production testing (aka Playtesting)
  Many options
  In your case I recommend:
   - Observation
     - Gives objective data, and may
       provide insights (think out loud?)
   - After play (between level) questionnaire
     - Allows you to ask specific questions and
       receive feedback
    - Collect gameplay metrics if you can
      (optional)
>In production testing (aka Playtesting)
  Remember
   - Don’t wait until the game is
     almost finished
     - It is easier to change things
       early in the process
   - Listen to what people say is
     wrong/right, don’t worry too
     much about what they
     suggest to do to fix it
     - You are the game designer
>Post-production testing
  Focus on raised awareness; it is
   too late to be overly concerned
   about fun
  Can’t observe, take
   biometrics, run focus groups,
   collect gameplay metrics
   (game runs offline), etc
   - On the ground observation?
      Costly.
>Questionnaires
  Pen & Paper is one
   option
  Better to build them into
   the game
   - Between levels or
     within levels
     - Multiple choice
       questions
     - Fill in the blanks
>Questionnaires
  In game questions
   - Require completion to continue & provide
     rewards for correct answers
   - Try to make them fun too
      - i.e. playtest them
   - Have to get the data back somehow though
     - Uploaded when connected to the web?
     - Teachers & students can access a report?
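A minimal sketch, assuming hypothetical question IDs, reward values, and a JSON cache that is uploaded (or turned into a teacher/student report) once a connection is available, of a between-level multiple-choice question that must be answered to continue:

```python
# Hypothetical sketch: a between-level multiple-choice question gating progression,
# with answers cached locally and uploaded later. All names, options and formats are assumptions.

import json

QUESTION = {
    "id": "level2_awareness_1",
    "text": "Why might a child not be able to come to school?",
    "options": ["Illness", "Work at home", "Distance", "All of the above"],
    "correct": 3,
    "reward_points": 50,
}

def ask(question, chosen_index, pending_answers):
    """Record the answer; the next level only loads once something has been chosen."""
    pending_answers.append({"question_id": question["id"], "answer": chosen_index})
    return question["reward_points"] if chosen_index == question["correct"] else 0

def flush_when_online(pending_answers, connected):
    """Serialise cached answers for upload when the game is next connected."""
    if connected and pending_answers:
        payload = json.dumps(pending_answers)
        pending_answers.clear()
        return payload
    return None

pending = []
print("Reward:", ask(QUESTION, chosen_index=3, pending_answers=pending))
print("Upload payload:", flush_when_online(pending, connected=True))
```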
>In game behavioural modelling
  Raised awareness is fine
  Actually doing behaviour is
   better
  So, if you can, build it into the
   game
   - Require the awareness you are
     raising to be used to progress
     in the game
     - Not just through answering
        questionnaires
     - But by DOING
http://www.peta.org/interactive/games/default.aspx
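A minimal sketch, with purely hypothetical behaviour names, of gating level progression on the player actually DOING the target behaviour rather than only answering a question about it:

```python
# Hypothetical sketch: the level exit unlocks only once the target behaviour has been DONE.
# The behaviour names and the single-level structure are illustrative assumptions.

REQUIRED_BEHAVIOURS = {"target_behaviour_a", "target_behaviour_b"}

def exit_unlocked(performed):
    """True only once every required behaviour has actually been performed in-game."""
    return REQUIRED_BEHAVIOURS.issubset(performed)

performed = {"target_behaviour_a"}
print(exit_unlocked(performed))   # False: one behaviour still missing
performed.add("target_behaviour_b")
print(exit_unlocked(performed))   # True: the exit opens
```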
>Going further than Awareness?
  Is there any way you can monitor behaviour
   change?
   - New questions on later play throughs?
   - Teacher/adult report system?
     - Tied to in-game achievements/rewards?
>Summary
  Playtest early, playtest as often as you can
   - With yourselves
   - But also with players
     - Behavioural observation allows for good
       data with small numbers for playtesting
  Build in awareness testing
   - Require it to progress
   - Try and make it fun too
Thank you


            b.lewis.evans@rug.nl
All images used in this
 presentation belong to their
respective copyright holders.

 If an image belongs to you
  and you wish it removed
     please notify me at
    b.lewis.evans@rug.nl

Editor's Notes

  • #9: This image from Dead Space could be said to have Low Valence, High Arousal, and therefore could be experienced as being scary by players.
  • #10: Whereas this image from the game Desert Bus, on the other hand, is somewhat low in Valence but also low in Arousal, producing boredom.
  • #11: In contrast, this is a High Valence, High Arousal moment from Halo Reach which, since it is my Spartan pictured, I can say was a moment of excitement in a Firefight game.
  • #12: And finally we have a High Valence, Low Arousal image, from the game Flower, which aims to create a pleasant feeling of relaxation in players.
  • #13: Now, you will notice there is nowhere on this axis marked “fun” – again, fun is a somewhat slippery concept. However, fun is much more likely to be found associated with emotions on the pleasant side of the axis than the unpleasant side, and is hardly ever associated with the bored (unpleasant, low activation) quadrant. The scared quadrant is a tricky one, because being scared can be fun – but it can also be overwhelming if the experience is too unpleasant or intense.
  • #35: Closed questions, on the other hand, let you have much tighter control over the answers your participants give, and they come in several flavours. There is the dichotomous scale, where people give simple yes or no answers.
  • #40: An interval scale uses set positions, and asks the person filling in the questionnaire to select one of those set intervals. This lowers the resolution of the scale, but makes answers easier to compare.
  • #59: Whatever you choose to do, remember it is important that you start doing this research as soon as you think you can manage it. It really is much easier to change things early in the process rather than waiting until the end. And then repeat once changes have been made. And again, do listen to the feedback you get, and observe what people do. But in the end you are the game designers, so you have to decide what is an important issue and what is not.