
Garbage In, Garbage Out: Design Process Survey

My name is Brian T. O’Neill, and when I’m not getting paid to strike percussion instruments in musically appealing ways, I run an independent design and UX consultancy called Designing for Analytics that specializes in custom enterprise data products and analytics apps.

IIA recently invited me to write a guest post on their blog about some research I conducted while running my workshop at the 2019 IIA Symposium in Portland. My session centered on empathy, and how it is a fundamental ingredient in designing great user experiences for enterprise data products and analytics services.

I had a great time meeting many of you at the session. For those who did not attend, I introduced the five-part design process I often use with my clients, as applied to the world of analytics. We then “practiced” stage one of that process, UX research, which often involves user/stakeholder interviews. Good interviewing technique is a critical skill for developing the empathy necessary to ensure your analytics software endeavors are centered on the humans who will use them. When we ask the right open-ended, how/why questions, we uncover latent problems, needs, and attitudes of our stakeholders that are rarely apparent on the surface. In short, my goal was to help get to the “good stuff” that is rarely uncovered when we simply ask our stakeholders, “What data do you need? What are the requirements?”

I’m telling you: in my 20+ years of designing software UIs and UXs, I’ve never come across a great product or service that was driven by a requirements doc! We have to involve users in the process if we want to reliably and repeatedly deliver good, usable, useful software. If you’ve built a product or decision support tool that ended up with no user engagement, low user engagement, extreme usability issues, or head-scratching, then you know what I’m talking about.

Anyhow, the workshop itself is beyond the scope of this post, but prior to everyone practicing their UX interviews in pairs, I invited participants to take a brief three-question survey about how frequently their own analytics teams are conducting UX research.

Each of the three questions had the same five response options:

When was the last time you:

  1. ...conducted a UX-style research interview of a customer/user?
    (Note: See one of my past articles on this topic)

  2. ...watched a user use your service, or shadowed them on the job? (Shadowing is "ride-along"-style research where you spend time observing somebody in their work environment.)

  3. ...conducted a formal usability study of your service?

So, what did we learn in this mini-survey?

Garbage In, Garbage Out: First, You Gotta Design the Survey Properly!

Now wait just a second; you thought I was just going to tell you my findings? Of course not!

Since we’re talking design and UX here, first I have to get a little “meta” on you and talk about the design of the survey questions themselves. Pardon me if this seems a little “Mr. Miyagi” of me, but before you sweep the leg, you have to sweep the floor, and get the basics right. In my excitement to capture this survey data, I overlooked a few aspects of doing good quant research, and so first, I’m going to share those thoughts with you:

  • You gotta design the survey right! By talking to "users" (in this case, my IIA respondents were the first to take the survey), I realized I had left out some response options (answer possibilities). For one, there was no choice to say "never," which is relevant. There also wasn't an "n/a" choice for question 3, which could matter for respondents who are just getting started creating a new service. This was actually pretty lame on my part, given that years ago I worked on a redesign of an open source survey creation tool called Lime Survey. The team I was working with at MITRE met with domain experts and academics from Harvard's Institute for Quantitative Social Science on survey design, and I remember them stressing the importance of deliberate vs. implicit response options such as "Skip" or "N/A"; the point is that you often want to distinguish between a null (an accidental skip or pass) and a deliberate pass. I did have a "skip/other" response, but I should also have let people respond with "never," since that is distinctly different. (See the sketch after this list for what that distinction can look like in practice.)

  • Bias! As part of the workshop, I talked about good interview techniques as a way to help research facilitators (that's YOU) stop projecting their own assumptions when trying to develop empathy and create more human-centered solutions. In this case, I made the implicit assumption that all the attendees could have answered "yes" to these three questions at some point in their past. The last response option for each question was "more than 2 months ago." This creates friction: if you have never conducted a usability study before, should you answer "more than 2 months ago" because it was the "maximum" time answer? Or should you respond with "skip/other"? Because I am a product designer and UX consultant, I brought my "UX" bias into the survey design; I forgot to allow the very relevant "never" response option for these questions.

  • The “Other” Response Option Can Enlighten: Oh, how we data people hate the typed-in “other” response on fixed-response data fields, right? Darn those people who type “B” in the “gender” field, because the product was sold to a “person” whose gender is “Business.”* More data to clean up, right? Well, because this survey was taken with pen and paper, people could write extra notes on their responses, and these “outlier” responses were insightful. One of my favorites: at least one person wrote, “we have a user study planned next week!” I loved hearing that one, but the survey didn’t account for this type of response. Which brings me to my point: we do qualitative interview research in part because surveys, by design, don’t facilitate the open-ended conversations we need to uncover the tangents and outliers that may lead us to innovative solutions. While that “Other” field may illuminate us occasionally, it is not a reliable means to have an empathetic conversation with our users and stakeholders.

    *That’s a real story from Mark Madsen that he told me at a tiki bar in Portland during IIA.
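
To make that “null vs. deliberate pass” point concrete, here is a minimal sketch of a response scale that records “never,” “n/a,” and “skip/other” as deliberate answers while keeping an unanswered question as a true null. This is purely illustrative: my survey was pen and paper, the code is Python, and the first time-based option below is an assumption on my part (only “more than 2 months ago” and “skip/other” appeared in the discussion above).

    from collections import Counter
    from enum import Enum
    from typing import Optional

    class Response(Enum):
        # Recency options (the first one is assumed for illustration)
        WITHIN_2_MONTHS = "within the last 2 months"
        OVER_2_MONTHS = "more than 2 months ago"
        # Deliberate passes -- each means something different, so each gets its own value
        NEVER = "never"            # the activity has never happened
        NOT_APPLICABLE = "n/a"     # the question doesn't apply (e.g., a brand-new service)
        SKIP_OTHER = "skip/other"  # a deliberate pass, possibly with free text attached

    def tally(responses: list[Optional[Response]]) -> Counter:
        """Count responses, keeping accidental nulls (None) distinct from
        deliberate passes like NEVER, NOT_APPLICABLE, or SKIP_OTHER."""
        return Counter(
            r.value if r is not None else "null (no answer recorded)"
            for r in responses
        )

    # A null is not a skip, and a skip is not a "never":
    print(tally([Response.NEVER, Response.SKIP_OTHER, None, Response.OVER_2_MONTHS]))

The design point: the survey instrument itself should encode these differences, because once “never” collapses into “more than 2 months ago” or an accidental null, no amount of downstream data cleanup can recover it.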

And now a few early findings and thoughts from the first survey (IIA Symposium session attendees). Remember, this is a small sample size, so take my opinions and findings accordingly:

Click here to view Brian’s preliminary survey results

Anyhow, as more attendees take this survey at my conference talks and the sample size grows, I will update my mailing list with my findings. See below if you’d like to subscribe.

Until next time, remember that designing a data product or analytics service to have an engaging UX doesn’t start with data, pixels, markers, or data viz. If you want your analytics or decision support tools to deliver value and be thought of as innovative, you have to develop empathy for the humans who are going to use and depend on them. Besides, it makes the problem space–and the downstream solutions–a whole lot easier to figure out.

Yours,

Brian T. O’Neill
Founder and Principal, Designing for Analytics
Host of Experiencing Data
Mailing List | Podcast | Free DFA Self-Assessment Guide