Qualifications
User Testing &
Design Research

January 24, 2013
Who we are

Analytic Design Group Inc (ADGi) is a visionary user experience
strategy and design firm that specializes in innovating in digital
environments by leveraging in-depth primary research to expose
unexamined assumptions. Our work withstands not only the
complexity of multiple agendas and intricate implementation but
also the scrutiny of the public.

Founded in 2005 on the principle that evidence-based design will
always be more powerful than design driven by best practices, we
have grown from a single practitioner to a vibrant, collaborative
team.

Some of our clients include:
Samsung, Sony, AT&T, Adobe, Nokia, LG, Motorola
Our Services
Key service areas include: design research, user
experience strategy development, interaction design,
communication design, and usability testing. Lately our work
has also included service design considerations.
Our projects can include the full sweep of user
experience services (i.e. user research through strategy and
design) or just one element. Our aim is to always fit the work
required to the need, and we’ll work with you to ensure you
are getting the best value from our efforts.


This presentation focuses on our design research and user
testing services.




/ Design Research /   / Usability Testing /   / Communication Design /   / Interaction Design /   / User Experience Strategy Development /
Design Research
We use a diverse set of design research methodologies:


Surveys — we have used surveys to establish baseline data
(largely attitudinal), help to segment audiences, and in
some cases, help to identify core issues that can be further
explored by other research.


Context-rich group interviews (like marketing focus groups
but much richer) — the focus groups we run are typically
very rich and usually drive out a great deal of contextual as
well as attitudinal data. We usually ask participants to
complete homework prior to the session (this aids in grounding
the user and supports contextual data gathering), and we
include some form of participatory design exercise to allow
participants to tap into their feelings and attitudes quickly.
On-site observation (with or without interviews) — this is useful when we are
looking for issues that are process related.


Task analysis — this is usually both an expert review and then a
walkthrough with participants to identify particular pain points with
certain tasks. This often involves both offline and online elements.

[Chart: TASK 1 completion rates — 50%, 70%, 80%]
Expert review/Heuristic analysis — this can be a quick and cost-
effective means of identifying user experience and usability issues.
We typically rank severity of issues identified and can include an
accessibility review in this process.


Card sorting — we have done card sorting exercises in both one
on one as well as group sessions. We’ve used both open and closed
card sorts and typically use the findings to develop information
architectures.
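As a hypothetical illustration of how open card-sort results can feed an information architecture (the card names and sort data below are invented, not from an actual ADGi study), each participant's piles can be aggregated into a co-occurrence count of how often two cards were grouped together, which can then drive cluster analysis:

```python
# Sketch: aggregate open card-sort results into co-occurrence counts.
# Each participant's sort is a list of groups (piles of card names).
from collections import Counter
from itertools import combinations

def co_occurrence(sorts):
    counts = Counter()
    for groups in sorts:                 # one participant's sort
        for group in groups:             # one pile of cards
            # Count every unordered pair within the pile; sorting the
            # pile keeps the pair key order stable across participants.
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

# Invented example data: two participants, four cards.
sorts = [
    [["billing", "invoices"], ["support", "faq"]],
    [["billing", "invoices", "faq"], ["support"]],
]
matrix = co_occurrence(sorts)
print(matrix[("billing", "invoices")])   # grouped together by both participants
```

Pairs with high counts are candidates for the same category; a clustering step (e.g. hierarchical clustering over the normalized matrix) would typically follow.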


Diary studies — these are useful when we are looking at
processes that occur over a longer period of time, or at the
impact of certain changes over time.
In-Situ & Ethnographic

We have conducted numerous ethnographic or in-situ studies on
a wide range of physical and digital products. These typically are
very data rich and result in in-depth, tactical, near-term findings
as well as robust, strategic, longer-term insights. Our clients
report that the ROI on these studies, beyond solutions to
nagging problems, is a focus for their product management for a
year or more.

For example, last year ADGi conducted an ethnographic study for
a mobile carrier on a device experiencing high returns. We were
able to identify key usability issues and service design issues, and
deliver insights about how their customers currently perceived
these devices and were likely to perceive them for the foreseeable
future.
Sample Report
User Testing
The range of user testing methods we use include:


Metrics-based usability studies — the usability studies we do
are quite rich with quantitative (metrics) data as well as
qualitative data. We typically collect task time, performance,
SUS, satisfaction, and hedonic scores.
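For reference, the SUS (System Usability Scale) score mentioned above follows the standard published formula: ten items rated 1-5, odd items contributing (rating − 1), even items contributing (5 − rating), with the sum scaled by 2.5 to a 0-100 range. A minimal sketch (the sample responses are invented):

```python
# Sketch: standard System Usability Scale (SUS) scoring.
def sus_score(responses):
    """responses: list of 10 ratings (1-5), in questionnaire order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings between 1 and 5")
    total = 0
    for item, rating in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (rating - 1) if item % 2 == 1 else (5 - rating)
    return total * 2.5  # scale 0-40 raw sum to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
print(sus_score([3, 3, 3, 3, 3, 3, 3, 3, 3, 3]))  # neutral: 50.0
```

In practice scores from multiple participants would be averaged and reported with a spread measure.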


Remote-moderated usability studies — using tools such as
WebEx (or other screen-sharing tools), we have
successfully conducted remote moderated testing, collecting
similar (or the same) metrics as we do for in-person tests.
This is particularly useful when testing with participants who
are geographically dispersed, or where the user’s context
heavily influences their interaction and on-site observation is
not possible or feasible.
‘Listening-lab’ style user testing — this is essentially user testing
without a set task list. We draw some hard data out of
these sessions, but the focus is mainly on qualitative data.


Un-moderated usability testing — this is user testing where the
user is in the lab and observed and recorded but completing the
tasks on their own.


ADGi Field Test — this is a web-based tool we developed in house
that automates a field test: participants are asked via email
whether they wish to participate. If they indicate yes, they are sent
a set of instructions or tasks to complete along with an NDA
reminder. After a set period of days, participants are sent a
survey to fill out. From a test-administration point of view, we can
track all participants and where they are in the study, get a
graphical view of how they responded to each question, and
download a CSV of the results for additional analysis.
We’ve used this tool to test devices and apps.
Navigation testing — this is another tool we developed in
house to test navigation structures. Users are asked a series of
questions about the categories and labels under which they would
expect to find certain pieces of information. They are shown
the tree structure for the site and can navigate through it to the
spot where they would expect to find the content. This testing
has been very effective for us in establishing how findable
content on very large sites will be, and in determining the
effectiveness of categorization and labeling schemes.
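One way such a navigation (tree) test can be scored — a hypothetical sketch, not ADGi's actual tooling, with invented path data — is to record the sequence of tree nodes each participant clicked for a task, then compute a success rate (ended at the correct node) and a directness rate (succeeded without backtracking):

```python
# Sketch: score one tree-test task from recorded click paths.
def score_task(paths, correct_node):
    """paths: list of node-name sequences, one per participant."""
    results = []
    for path in paths:
        success = path[-1] == correct_node
        # A revisited node means the participant backtracked.
        direct = success and len(path) == len(set(path))
        results.append((success, direct))
    n = len(results)
    success_rate = sum(s for s, _ in results) / n
    directness = sum(d for _, d in results) / n
    return success_rate, directness

# Invented example: three participants looking for "Phones".
paths = [
    ["Home", "Products", "Phones"],                      # direct success
    ["Home", "Support", "Home", "Products", "Phones"],   # success with backtrack
    ["Home", "Support", "Contact"],                      # failure
]
sr, d = score_task(paths, "Phones")
print(round(sr, 2), round(d, 2))  # 0.67 0.33
```

Low directness with high success tends to point at weak category labels rather than a missing category.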


Concept acceptance testing — this is useful for trying out a
new concept, typically while comparing it to other, more familiar
ones. We’ve used this on devices when a client wants to
evaluate a new way of navigating or a different form factor.
Competitive benchmark testing — this is useful when
comparing a product (interface, device, site) against one or
more others. We have used this to set benchmarks for future
comparison as well as for one-off comparisons.


Iterative testing — we test one or at most two discrete
elements with a very small set of users (2 or 3), make
recommendations based on that testing, the development team
makes those changes, and we test again until we do not see
the need for any more changes. We use this method primarily
for games research looking at a particular interaction. While
other clients have asked about this, after discussing it we
have so far determined that the value of this approach does
not warrant the effort and cost for the project at hand.
Remote user testing — we have the ability and experience to
execute remote usability testing, inclusive of screen sharing
and audio and video recording.


We have experience conducting remote-moderated usability
sessions as well as focus groups. We screen share and
capture (audio and video record) the sessions. We have
found that this type of research can be very cost effective,
and it is especially useful when we are asking participants to
log in to their own accounts, or when participants are
geographically dispersed. On occasion we’ve also found that
with the user located in their own environment, we are able
to glean more contextual information than we typically can
in the lab.
Sample Report
Mobile Test Lab

Mobile test lab — we conduct a great deal of testing
on mobile devices, and our lab setup is both flexible
and powerful:


Our testing equipment is deliberately flexible so that
we can set up in a lab environment, a coffee shop, or a
person’s home or office. We have designed a very
stable yet flexible camera mount that allows us to
capture a variety of interactions. Assuming we can
connect to a stable Wi-Fi network, we can also live stream
(to allow remote viewing) outside of a lab environment.
                                                        For more information view this presentation:
                                                        Mobile Usability: What's Your Strategy
Karyn Zuidinga
Principal & Director of User Experience

604.669.7655
karyn@analyticdesigngroup.com

www.analyticdesigngroup.com
@analytic_design
