From finding to doing

Framing outputs for increasing impact of design research

Visual of ice cream app with surrounding graphics that illustrate the translation of data (as dots) to findings / recommendations (as lines and UI elements).
Illustrations by Misha Cobb 2022

What comes out of a user research study? Many potential outputs can serve different functions at different phases of a design project. Demystifying data, findings, insights, frameworks, opportunities and recommendations is vital for crafting actionable next steps for design and communicating research outcomes. Because if a research study happens and no one understands what to do with it, did it even happen?

Diagram outlining research outputs from data through findings, insights, frameworks, opportunities and recommendations. Below is a map of deliverables that correspond with each output phase, including data collection, synthesis, co-analysis, top-line reports, detailed analysis, framework creation, insights reports, insights reframing, HMW brainstorming and prioritization.
Research outputs and how they can be used in the design process.

What was said

Data is the raw stuff. It’s the exact words of a participant, the ranking values, the pass/fail rates. There is no interpretation applied to the data. It is simply what happened.

Visualization of data as a cloud of dots
Data is the disorganized raw information.

Analysis should always start with data, whether that be verbatim notes or direct transcripts. True analysis should never start from notes based on interpretation. If notes read more like the voice of a notetaker than the voice of a participant, take a step back and look at the transcripts.

It is important to note that a data point is not an insight. When project teams latch onto one quote, one task failure, or one piece of information, they risk losing the forest for a single tree.

To that end, data is typically only useful to the researcher, not the project team or client. When people want “the raw data,” oftentimes what they really want is an interpretation of the research that clearly draws a connection between insights and data. Using data to tell the story of research, via quotes, ranking values and pass/fail rates, lends credibility to insights and recommendations.

Data examples:

  • 7 people failed to put sprinkles on their ice cream.
  • “I can’t find the sprinkles — are they under the chocolate chips?”
  • “This idea for an ice cream truck is great!”

What was heard

Findings are the first layer of synthesis in a review of data. They’re the initial categories, the first pass, the immediate “so what did we hear?” Findings should explain what the team witnessed in the study, meaning they are interpretations of the data based on the project’s needs.

Findings visualized as data dots organized into lines
Findings take data and begin to make sense of them.

Findings are great for top-line reports, initial co-analysis sessions and summary emails. They can give some idea of what was heard without giving away any insights prematurely. Findings are a vital layer of the analysis process and shouldn’t be skipped unless there are intense time constraints on the project.

A note on language: For small-scale qualitative work, I recommend not attaching numbers to findings unless the finding can be quantified at least somewhat objectively (pass/fail rates, rankings, etc.). I find that saying “8 participants want x feature” based on qualitative data can lead research audiences to focus too heavily on quantity of feedback as opposed to overall sentiment, which is a misleading framing for these types of studies. Additionally, there is an element of subjectivity to interpreting qualitative data that can be acknowledged by framing findings as related to One, Some, Most, Nearly All, and All participants.

Findings examples:

  • Most participants who failed tasks couldn’t find items within the mega-menu.
  • Most participants expected the sprinkles to be under the chocolate chips.
  • Nearly all participants had a strong positive reaction to the idea of an ice cream truck.
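The One/Some/Most/Nearly All framing above can even be automated when tallying findings. Here is a minimal sketch: the function name and the ratio thresholds are my own illustrative assumptions, not a standard, and teams should tune the cut-offs to their own reporting conventions.

```python
# Hypothetical sketch: translate a participant count into a qualitative
# quantifier (One / Some / Most / Nearly all / All) instead of a raw
# number. The thresholds below are illustrative assumptions.
def qualify(count: int, total: int) -> str:
    if count <= 0 or total <= 0:
        raise ValueError("count and total must be positive")
    if count == total:
        return "All"
    if count == 1:
        return "One"
    ratio = count / total
    if ratio >= 0.9:
        return "Nearly all"
    if ratio > 0.5:
        return "Most"
    return "Some"

print(qualify(6, 8))   # Most
print(qualify(3, 8))   # Some
```

A small helper like this keeps findings language consistent across a report without inviting readers to over-index on exact counts.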

What it means

Here’s where it gets fun. Insights are collections of findings that can tell a project team how those findings should impact the project. Insights can be many or few, but they are always the “so what” behind the findings.

Insights visualized as findings dots organized into circles
Insights organize findings into ah-ha moments.

Constructing insights looks different for different researchers, but ultimately they are a summation of findings into a consolidated list of important things to know. This is also where you can make the leap from saying ‘participants’ (grounded directly in the data) to ‘users’ (a level extrapolated from the data). Insights are what stakeholders actually want to know when they show up to a research readout.

When communicating insights, ideally a researcher should be able to tie them back to individual findings, and then specific pieces of data. These insights should be able to stand independently and provide clear, effective inspiration to a project team.
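That traceability chain (insight, back to findings, back to data) can be thought of as a simple nested structure. The sketch below is purely illustrative; the class names and fields are hypothetical, not part of any established research tooling.

```python
# Illustrative sketch (all names are hypothetical): an insight that can
# be traced back to the findings it summarizes, and each finding back
# to the raw data points that support it.
from dataclasses import dataclass, field

@dataclass
class DataPoint:
    source: str   # e.g. a participant ID or session label
    content: str  # verbatim quote, ranking value, or pass/fail result

@dataclass
class Finding:
    statement: str
    evidence: list[DataPoint] = field(default_factory=list)

@dataclass
class Insight:
    statement: str
    findings: list[Finding] = field(default_factory=list)

    def trace(self) -> list[DataPoint]:
        """Flatten the chain: insight -> findings -> data points."""
        return [d for f in self.findings for d in f.evidence]

quote = DataPoint("P3", "I can't find the sprinkles")
finding = Finding(
    "Most participants expected the sprinkles to be "
    "under the chocolate chips.", [quote])
insight = Insight(
    "Relying on a mega-menu to build an ice cream cone "
    "is confusing for users.", [finding])
assert insight.trace() == [quote]
```

Even if no one ever writes code like this, keeping the mental model of a traceable tree helps a researcher answer “where did this insight come from?” in a readout.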

Here is where design can pick research up and start to run with it, depending on the project and the designer.

Insights examples:

  • Relying on a mega-menu to build an ice cream cone is confusing for users.
  • Current information architecture doesn’t support an intuitive ice cream cone building experience for users.
  • Users want access to ice cream anytime, anywhere.

How we think about it

Insights are helpful, but how should we think about them? What kind of mental models exist in our problem space? For discovery or generative studies, an output may be a set of frameworks that can be evolved and refined based on further research.

Insight dot translated into a framework visualization of a gradient with a spotlight on it
Frameworks provide us with a way of thinking about insights.

Frameworks can be themes, experience maps, journey maps, user types, personas, service blueprints, illustrative design vision screens, etc. There are countless versions of these types of outputs that are meant to serve as a common language for multi-disciplinary project teams. They should be easy to understand, evolve and manipulate as boundary objects that can bring people together.

It is important to consider the frameworks that will be most useful to a project before going into data collection. The intended frameworks should influence the way methods are chosen, questions are posed and session activities are designed. A framework should be chosen based on the goals of the study and the intended impact of the research. If a stakeholder wants to understand a single user type’s motivations, an experience map may not be the best framework to communicate those types of insights.

Framework development is a common pitfall for research rigor. It’s very easy for stakeholders to generate personas based on what they think users want, but real, human-centered design frameworks should be aggregations of insights, findings and data, traceable back to source material and real primary or secondary research.

Frameworks examples:

  • Themes: Efficiency in ice cream building, Personalization in topping choices, Ease of access to ice cream
  • Personas: The on-the-go ice cream eater, the savor-the-moment ice cream connoisseur
  • Experience map: Want ice cream > go to ice cream truck > access touch screen > find ice cream flavor > add ice cream toppings > buy ice cream > wait > pick up ice cream > eat ice cream

What we could do

Often, research ends at frameworks. Then there’s a lull as design attempts to internalize the blast of information at the end of a research study. Sometimes, designers are so overwhelmed that they glaze over the insights, ignore the frameworks and fixate on the one data point that speaks to them, designing toward that alone. Here’s where opportunities come in.

Visualized question mark with lines and dots representing findings
Opportunities give us a launching point for further ideation off of an insight.

Opportunities are reframed insights that open an insight up for exploration. While insights explain, opportunities invite. They provide project teams with jumping-off points for innovation. While divergent opportunities may seem to live most naturally within a generative or discovery project, they can also have value for more evaluative, usability-based projects.

Opportunities should take into account business goals and design questions. They should bring clarity to the design team around where to focus. Opportunities are awesome workshop kick-offs.

“How might we?” statements are one of the most common ways of framing opportunities, but there are countless ways of phrasing insights as open-ended questions that can help bridge the gap between research and design.

Opportunities examples:

Insight: Relying on a mega-menu to build an ice cream cone is confusing for users.

Opportunity: How might we design a way of navigating the ice cream creation workflow that showcases all of the toppings available to users?

Insight: Current information architecture doesn’t support an intuitive ice cream cone building experience for users.

Opportunity: How should the information architecture support the ice cream creation workflow for maximum topping use?

Insight: Users want access to ice cream anytime, anywhere.

Opportunity: How can we embed ice cream in people’s lives?

What we should do

Opportunities are great for when there is time and space to do some ideation or exploration, but many times the team needs direction and they need direction now.

Series of UI elements symbolizing actionable recommendations
Recommendations are the connection between insight and action.

Recommendations are reframed insights and can be defined either by the researcher or in collaboration with designers. They take into account not just what could be done but what should be done based on time constraints and business needs.

Recommendations can be prioritized and placed on product roadmaps to be explored when the time is right. They can be big strategic research initiatives or small usability tweaks. They are the fullest expression of what the research is telling the project team to do. Good recommendations should be broad enough to be inspirational but specific enough to be actionable. Ideally, they should be collaboratively determined by a multi-disciplinary project team.

Researchers can be afraid to wade into the recommendation space since it may feel too prescriptive. ‘Well, our users didn’t tell us to do this! Who am I to interpret these insights to mean a certain next step?’ When working in the space of applied research, I’d argue that recommendations are simply assertive insights that are necessary to bridge the gap between research and design. They make the leap a little shorter and give a multidisciplinary team a starting point for consensus or disagreement.

Research can get a bad rap as a discipline that holds things up. Recommendations are how researchers can insert themselves into the design process as active participants who serve to push a product forward, not hold it in a state of suspended ambiguity.

Recommendations examples:

Insight: Relying on a mega-menu to build an ice cream cone is confusing for users.

Near-term: Refine mega-menu to be more easily navigable by users

Long-term: Explore alternative workflow navigation experiences

Insight: Current information architecture doesn’t support an intuitive ice cream cone building experience for users.

Near-term: Determine most effective way to group toppings, possibly by layer (whipped cream first, then candy pieces, then drizzles)

Long-term: Conduct card sorting activity to uncover users’ mental models for the site

Insight: Users want access to ice cream anytime, anywhere.

Near-term: Continue development of mobile app

Long-term: Conduct research to define when and where people most often desire ice cream

At the core of it all is the data. Without rigorous research planning and data collection (coupled with a deep understanding of business goals and stakeholder perspectives) there can be no confident recommendations. That said, without confident recommendations, there can be no progress.

A simple rephrase of an insight can make research outputs more actionable, useful and inspiring for a design team. Ultimately, the value of research is not just in how it is done but also how it is communicated. People need to understand insights to act on them. This way of thinking can help researchers pinpoint the right framing of information to ensure that research is seen, heard, and most importantly, acted upon.

10/21/22 — Edited to add clarification on Recommendations section