Issue #6

UX Research: Drawing the Right Kind of Inferences

[Image: A person coloring cogs in a machine next to a cup of coffee.]

Hi there.

This week, spreading design research nerd wings, tools for more effective marketing, and news about an upcoming pizza party.

Thanks for being here.

UX Research: Drawing the Right Kind of Inferences

Two weeks ago, I wrote about one of the more common pitfalls in UX research: collecting more data than you can reasonably take stock of and use productively. Last week, I wrote about another common hazard: collecting data you do not yet have a purpose for. So, I thought I would round out this mini research series with one of my favorite topics: knowing what you can infer from design research. It's a favorite because I can spread my nerd wings and go deep into theory on you!

Part of the reason I adore the research phase so much is that it comes down to superb matchmaking. It’s finding the connection between your question and your method for discovering the answer.

To illustrate this idea, here’s a bad match:

Research Question: What is the relationship between pizza crust texture and our customer satisfaction scores?

Research Method: Interview the pizza chef.

I’m not saying that the pizzaiolo doesn’t know how to make a delectable and doughy pizza crust. I’m not suggesting that our beloved pizza maker doesn’t have some natural intuition about what her customers most often enjoy. The problem is that interviewing a pizza maker produces qualitative answers for a statistical problem. It’s a bad match.

So, how do we make a good match between questions and methods in design research? Should the method be qualitative or quantitative? Let me take you back a step further to… ahem… logic.

In terms of reasoning, many people are familiar with the idea of deductive reasoning from Sherlock Holmes. Deduction is knowing the “what” (a thing) and the “how” (a general rule or principle). From there, you can reason your way to a result.

But we work in academia, where researchers primarily use inductive reasoning (theory building). They know the "what," and they know the "result." They're trying to figure out the "how," the general rule or principle that explains why certain things happen. Once you have a workable theory, you can predict results with deductive reasoning, or theory testing.

Design research is a different logic altogether: abduction. Instead of theory building or theory testing, design researchers are at the stage of theory generating. A specific desired result is imagined, and design researchers ask, "What are the 'things' and 'processes' that will get us to this desired state?"

So, they kind of flow in order:

Abductive (generating): ?? + ?? = Result

Inductive (developing): What + ?? = Result

Deductive (confirming): What + How = ??

Back to the question part. In design research, this means we can't predict; we can only identify. We're generating the 'what' and the 'how' that get us to a specific result. Imagine you're in a usability lab, watching users work through tasks to achieve a result. You are calibrating the 'what' and the 'how' (mainly the 'how'; we don't really calibrate the users unless we're creepy). When something goes wrong for a user and you identify an issue, you get a more specific idea of the 'how' that will get you closer to the desired result. Prediction requires statistically determined sample sizes to support reliable inferences. Issue identification, though, doesn't need a sample size, only observation.

How does this work in practice? Well, as my friend Nick likes to say, “You don’t need to see 7 out of 10 people slip on icy steps before you know you need to put some salt down.” In user experience research, we’re surfacing issues, not statistically validating them. We know if one person encounters an issue, chances are more people will, and that issue is worth fixing.
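The point about sample sizes can be sketched in a few lines of Python. This is a hypothetical illustration, not a real tool: the session log and event labels are invented. Issue identification treats a single observed failure as a signal worth acting on, with no statistics required.

```python
# Hypothetical usability-session log; participants and event labels are
# invented for illustration.
observations = [
    {"participant": 1, "event": "clicked wrong nav item"},
    {"participant": 2, "event": "completed task"},
    {"participant": 3, "event": "text unreadable over photo"},
]

# Issue identification: a single occurrence is enough to flag a problem.
# No statistically determined sample size is needed, only observation.
issues = {o["event"] for o in observations if o["event"] != "completed task"}

print(sorted(issues))
```

Note that the set keeps each issue once, no matter how many participants hit it; counting frequencies (and validating them) is the statistical job this kind of research deliberately doesn't take on.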

This means that in user research, you can ask questions like "What will cause people difficulty?" or "What isn't working for people?" But you cannot draw reliable conclusions from preference data. The trouble is that in a design research session, your participants often give you tons of it: "I love big pictures." "I want to see more videos." "I would never watch this video." "I want all the links on this page so I don't have to search for anything ever again." "I want everything centered." "There's too much white space…." These are preferences. Without statistics, they are only intuitions, like what your pizza chef would tell you about the pizza dough texture most people prefer.

So, when you run user experience research, focus on what you can identify. Tune out the noise (e.g., "I don't like this color palette") and focus on the signal (e.g., "this text is hard to read over this photo"). Then you can change design elements based on identified issues, making the experience easier for your audiences.

Kristin Van Dorn

A Few Tools for More Effective Marketing

No matter how big or small your team is, we can all use a little extra support. This is especially true for Higher Ed Marketers, who are often tasked with the roles and responsibilities of an entire advertising agency. That's why I enjoy spreading the word when I come across resources that make my job easier. Here are a few of my favorites.

Pexels - Pexels is a stock imagery and video site where everything is free to use and attribution isn't required. Their library is much more robust than you might imagine, and their search is quite intuitive. I'll be the first to admit that it's always better to use your own photos or videos for marketing, but when you're in a jam, Pexels is wonderfully convenient.

Pro Tip - Quell any reservations you might have about using Pexels content by sifting through their FAQ page.

Free Music Archive - Their homepage says, "It's not just free music, it's good music," and after using FMA tracks in many video projects over the years, I agree. Each song carries its own Creative Commons license, though, so it's critical that you understand what you can and can't do with it. Check out their License Guide so you know what attribution (if any) is required.

Pro Tip - Use their custom search tool to look at tracks available in the Public Domain.

Premiere Pro - For Spider-Man, it was a radioactive arachnid; for Captain Marvel, an energy blast; my (marketing) superpowers come from video editing software. Creating content in Premiere Pro lets me elevate every marketing campaign I'm part of. The videos look great and can be customized to fit my needs. It is a paid tool, but given what you can do with it, most MarComm decision-makers won't need much convincing.

Pro Tip - Premiere has a speech-to-text feature that lets you caption all your video content.

These tools have made me a more effective marketer, and I hope your experience with them is the same.

Carl Gratiot

Higher Ed Pizza in Little Rock

If you’re heading to Arkansas for HighEdWeb in early October, join us for a FREE Pizza Party on Tuesday, October 11th, from 5 to 7 PM. We’ll be celebrating 10 years of Bravery, chatting about Higher Ed Marketing, and enjoying some of Little Rock’s finest pizza. Register here.