Teacher-Approved Tips For Crafting User Interview Questions

As a teacher, I asked high-impact questions every day. Here’s what I learned.

 

Here’s a throwback to my very teacherly desk drawer! It also serves as proof that my obsession with brightly-colored sticky-notes long predates my transition to UX Design.

 

Questions Are Fuel

When I taught ELA to high school students, I quickly learned the value of questions. Questions could make or break a discussion, and discussions had extraordinary potential for engagement and collaborative learning. An excellent question was a conduit that linked students to curiosity, discovery, progress, and reflection. Not-so-excellent questions were discouraging dead ends and discourse dampeners.

Through time, trial, and (much) error, I developed criteria to evaluate questions before I launched them in class. Of course, well-planned questions still missed the mark at times. These criteria did not lead to foolproof questions; instead, they helped me remain mindful about what I was asking, how I was asking it, and why I wanted to ask it in the first place.

If a question flopped during class, all was not lost. By responding to student needs in the moment, I could adjust the question (or, in some cases, scrap it) and get the learning back on track. Plus, dissecting “failed” questions helped me further hone my question-crafting abilities.

 

Criteria for Great Questions

While lesson planning, I strove to write questions that were clear, purposeful, properly scoped, and relevant. In my transition to UX Design, I found myself revisiting these same principles when I constructed interview scripts and usability tests.

 
  1. Clear – Students should spend more energy considering their response and less energy decoding a muddled question. I asked complex questions when relevant, but conveying the ideas clearly was always top priority.

  2. Purposeful – While questions could serve many purposes, I considered why I was asking a question and how it furthered the relevant learning objectives. Identifying each question’s purpose put the focus on quality and helped me cut repetition or fluff. Class time was precious, and busy work was loved by no one.

  3. Properly Scoped – Questions could be wide open or extremely narrow. Narrow questions (I’m thinking of anything that can be answered succinctly, or questions that have clear-cut, concrete answers) served as fruitful warm-ups, as they invited students to “wade” into the discussion. Once students were comfortable, they were equipped to tackle more open-ended questions that required more effort (and sometimes more courage) to answer.

  4. Relevant – The discussion was for my students, not for me. I had spent years immersing myself in content that was unfamiliar to many students. Thus, I was mindful about generating questions that would be accessible to everyone. This didn’t mean avoiding nuance — it was a framework for putting students at the center of my planning. Additionally, my class built knowledge and context together; I asked very different questions at the beginning and end of a unit.

 

UX Research Application

Questions are a bridge between the participant and the desired insights. As a researcher, I strive to build traversable bridges. This means investing time to draft, review, and revise interview scripts before the sessions. Collaboration is a powerful tool for improving scripts. Additionally, testing the questions on a friend helps me catch issues (especially with wording) that are difficult to unearth through proofreading alone.

After I have facilitated the interviews, there’s one more very important step: reflection. When I was a teacher, I had a mentor who added sticky notes to every lesson plan she taught. This was her 5-minute method for distilling successes and growth areas before they got swept away in the current of a busy schedule.

I adapted this method as a teacher, and it transfers seamlessly to my work as a researcher. At the end of an interviewing day, I revisit the script document, and add comments based on participants’ responses to the questions. It’s a bit meta to do research on the research, but it helps me identify best practices and avoid missteps in the future.

 

My go-to reflection prompts for evaluating user interview scripts after the sessions:

  • Clear:

    • Did any questions cause confusion?

    • Did I need to rephrase certain questions in the moment?

    • What questions felt most accessible and easy for participants to understand?

  • Purposeful:

    • How well did the interview questions serve the research goals?

    • Which questions were especially fruitful?

    • Which questions were less fruitful?

  • Properly Scoped:

    • Did any questions feel too wide or too narrow?

    • How did the questions build on each other and help participants “wade in” to the interview?

  • Relevant:

    • Did participants have any difficulty accessing the question content?

    • Was the necessary context provided?

    • How effectively did the questions account for the participants’ perspectives?
