Just Enough Research by Erika Hall
Review
I found this book incredibly useful when I first came across it, although I rarely recommend it now. If you’re seeking an intro to UX research or experimentation, I recommend ‘Talking to Humans’ and ‘Testing with Humans’ instead. This book tries to cover too much ground - and therefore has to stay at the surface level.
The chapter on Organisational Research is the exception; I found it incredibly helpful. Product managers and project managers: get wise to how your organisation works.
The title itself makes a great mental model - ‘Just enough of x’ is a useful lens. It reminds me of one of the seven wastes of lean manufacturing: ‘over-processing’.
Key Takeaways
The 20% that gave me 80% of the value.
- Reminders
- Customer service can be a treasure trove.
- Agile describes only how to build - not what to build
- Prioritise the highest value users.
- Defer less urgent research and complete it while the software is being constructed.
- Research Best Practice:
- 1) Phrase your research goal/question clearly
- 2) Set realistic expectations with stakeholders before you start
- 3) Be prepared. The better you prep, the faster and cleaner the work goes. Get your process and materials in order before you start. Set up so it’s easy to reuse / replicate.
- 4) Take good notes → If you don’t take notes it didn’t happen
- 5) Allow sufficient time for analysis
- How Much Research is Enough?
- Try to avoid unnecessary research.
- Identify your highest priority questions and your assumptions that carry the biggest risk. Don’t investigate assumptions with little risk or impact.
- Research can validate assumptions and reduce risk, but it can also help you discover opportunities you haven’t thought of
- The answer to how much is enough… is the point at which you feel sufficiently informed and inspired. The point at which you have a clear idea of the problem you want to solve and enough information to solve it
Organisational Research: Helps you understand your organisation.
- Research should include anyone without whose support your project will fail. Executives, managers, subject matter experts, staff in various roles, investors and board members
- What to do with your organisational research
- Create a clear statement of what you need to accomplish to make the project a success
- Connect what you’re doing to the goals of the business
User Research: Helps you identify patterns and develop empathy
- Ethnography = “What do people do and why do they do it?” User research = ethnography + “What are the implications for the success of what I am designing?”
- Ethnography is a set of qualitative methods to understand and document the activities and mind-sets of a particular cultural group who are observed going about their ordinary activities in their habitual environment.
- This is very different from gathering opinions. It isn’t just surveying or polling. And it’s definitely not focus groups.
- Talking to, or observing representative users directly in their environment reduces the risk of bad assumptions and bias
- Assumptions are insults. You risk being wrong and alienating people. Designing for yourself can bake discrimination into your product. Be cautious with assumptions about age, gender, ethnicity, and sex - they might not serve your goals or ethics.
- Your challenge as a researcher is to figure out how to get the information you need by asking the right questions and observing the right details.
- Mindset: “How should what I’m designing interact with this person?” Try not to be judgmental.
Competitive Research
- Goal: See how other people are solving similar problems, and identify opportunities to offer something uniquely valuable
- What matters to our customers? How are we better at serving that need than any competitor? How can we show our target customers our product is the superior choice?
- Competitor landscape moves quickly → new options are appearing and product categories are collapsing every day. An accurate, user-centred perspective of your comparative strengths and weaknesses will help you focus your message and hone your image.
Evaluative Research
- Testing how well your design/solution works for your audience. Best done before a large public launch. Helps you assess the merit of your design. The approach will depend on project stage.
- A heuristic analysis is a quick and cheap way to identify potential issues. Two colleagues can do it in an hour. It’s a good way to deal with obvious issues in early prototypes before bringing in users.
- You can test a competitor’s service or product. You can test your early sketches.
- Once live you can look at quantitative data (site analytics) to see how people are actually interacting and if that meets your expectations
- Numbers tell you what’s happening. Talking to individuals helps you understand why it’s happening.
- Not a substitute for usability testing
Usability Testing
- The easier it is for your customers to switch to an alternative, the more important usability is. The more complex the system, the more work is required to ensure it’s usable.
- Frequency of testing should depend on how quickly design decisions are being made
- Do it as you go - not just before launch. Build it into your design / build workflow. Create testing process and checklist that includes all of the information and equipment you need.
- You need a … Plan, prototype or sketch, 4-8 participants, facilitator, observer, a way to document, a timer
- Facilitators should avoid leading the user or helping them when they get lost
- Users will blame themselves if they can’t get it to work - ask them to describe how they expected the system to work and why they had that expectation.
Quantitative Research
- Quantitative research and analysis should mainly be used for optimisation.
- Once you can measure your success in numerical terms, you can start iterating towards it
- Look for trends and patterns BUT be careful, data can deceive - more page views could mean more engagement (or a sign of user frustration)
- Low traffic sites will take weeks to validate something
- The best response to a user interface question is not necessarily a test
- Use for tweaking and knob-twiddling—not a source of high-level strategic guidance
- Can introduce inconsistencies
- Local Maximum Problem: experimentation can lead to a culture of incrementalism and risk aversion. How will you ever make a great leap that might have short-term negative effects?
Deep Summary
Longer form notes, typically condensed, reworded and de-duplicated.
- Being the smart person is more fun than obeying the smart person
- Consider how people feel if they’re just recipients of the analysis
- To select the best research tools… you need to know…
- What decisions are in play (the purpose)
- What you’re asking about (the topic)
- Then you can find the best ways to gather background information, determine project goals and requirements, understand the project’s current context, and evaluate potential solutions.
Types of Research:
- Generative or Exploratory Research is the research you do before you even know what you’re doing.
- Interviews, field observation, reviewing existing literature
- Descriptive & explanatory Research is observing and describing the characteristics of what you are studying.
- Ensures you’re not designing for yourself. You’ve moved past “what is a good problem to solve?” and are now asking “what is the best way to solve this problem?”
- What activities do people engage in which may complement or compete with your solution.
- Evaluative Research is “Are we getting close?”
- Once you have a clear idea of the problem you can define potential solutions and test them to make sure they work and meet requirements. Ongoing and iterative as you move through design / development.
- Causal Research is “Why is this happening?”
- Looking at usage (cause and effect can be tricky).
- Research Roles represent clusters of tasks, not individual people. Often one person will take on multiple roles in a study
- Author: Plans and writes study. Problem statement and questions, and the interview guide or test script
- Interviewer / Moderator: interacts with the test participants
- Coordinator / scheduler: schedules sessions
- Notetaker / recorder: writes notes
- Recruiter: screens participants and selects good subjects
- Analyst: reviews the gathered data to look for patterns and insights. More than one person should have this role
- Documenter: Reports findings once research complete
- Observer: Watch or listen in
- The research process: We’ll cover ways to organise research activities - checklist & RACI
- Objections to research:
- Sometimes viewed as a threat or a nuisance
- Organisational buy-in needed to proceed
- Get ready to advocate for your research project - before you start it
- Objections you will hear:
- We don’t have time - You don’t have time to be wrong about your key assumptions though
- We don’t have the expertise or the budget - We can reduce scope
- The CEO is going to dictate what we do anyway - fight dictatorial culture.
One research methodology is superior (qualitative vs quantitative)
- What you need to find out determines the type of research you need to conduct; it’s that simple.
Why people say it won’t work:
- You need to be a scientist: this isn’t pure science, it’s applied research. You’ll share these qualities with a scientist:
- Desire to find out needs to be stronger than to predict
- Depersonalise the work
- Good communicator and analytical thinker
- You need infrastructure -
- It will take too long -
- You can find out everything you need in beta - You can find out whether features work, but many things are better known before designing / coding
- We know the issue / users / app / problem inside out already - unless research has been done recently a fresh look will probably be useful.
- Research will change the scope - helps stop unexpected complexity later
- Stopping innovation - relevance to the real world separates invention from innovation
Actual Reasons:
- I don’t want to be bothered
- Afraid of being wrong
- Uncomfortable talking to people
Research in any situation:
Design happens in context, research is simply understanding that context
When doing in-house at an organisation:
- Politics are a consideration
- Challenging assumptions of those in power can be scary
- Customer service is a data treasure trove
- You need to understand how product and marketing decisions are made in your company
- Agile describes only how to build not what to build
Read link: Best practices for Adding UX Work to Agile Development by Jeff Patton
- prioritise highest value users
- Analyse and model data quickly and collaboratively
- Defer less urgent research and complete it while the software is being constructed
Just enough rigor:
Cover your bias - note obvious ones
- Design bias: design of the studies themselves, how they are structured and conducted
- Sampling bias: unavoidable in normal companies, careful when drawing conclusions
- Interviewer bias: interviewers should be neutral
- Sponsor bias: conceal who is sponsoring the research to keep responses neutral
- Social desirability bias: you want honesty not ego
- Hawthorne effect: behaviour may change when being observed
Ethics of user research:
- Ethical research charter questions
- Best researchers are like Spock. Just enough humanity and humour to temper their logical thought processes and allow them to roll with imperfect circumstances. Rigorous not rigid.
- How do you know you’re being rigorous enough:
- Discipline and Checklists
Research best practice checklist
- Phrase questions clearly (the big question you’re asking, why you’re doing the research)
- Set realistic expectations
- Do this before starting with stakeholders
- Questions to be answered
- Methods to be used
- Decisions to be informed by the findings
- Ask them what they hope for and tell them what to expect
- Be prepared
- The better you prep, the faster and cleaner the work goes
- Get your process and materials in order before you start. Set up so it’s easy to reuse / replicate
- Allow sufficient time for analysis
- Take good notes
- If you don’t take notes it didn’t happen
How Much Research is Enough?
- Avoiding unnecessary research
- Identify your highest priority questions - your assumptions that carry the biggest risk
- Given our business goals, what potential costs do we incur - what bad thing will happen - if, 6 months from now, we realise:
- We are solving the wrong problem
- We were wrong about how much organisational support we have for this project
- We don’t have the competitive advantage we thought we had
- We were working on features that excited us not the customer
- We failed to reflect on what is actually most important to our users
- Our users don’t understand the labels we’re using
- We missed a key aspect of our users’ environments
- We were wrong about our prospective users’ habits and preferences
- If there is no risk associated with an assumption you don’t need to investigate it
- As well as validating assumptions and minimising risks - you may discover new opportunities that you haven't thought of
- All it takes to turn potential hindsight into foresight is keeping eyes open and asking the right questions
- That satisfying click
- No matter how much research you do, there will always be things you wish you’d known
- Design will still be an iterative process
- The answer to how much is enough… is the point at which you feel sufficiently informed and inspired. Known unknowns.
- The point at which you have a clear idea of the problem you want to solve and enough information to solve it
- Collaborate with the team to see if they see the same patterns as you
The Process
The systematic inquiry. Follow these six steps: define the problem, select the approach, plan and prepare, collect the data, analyse the data, report the results.
Chapter 4: Organisational Research:
Understanding you org:
- You have a goal. You want to design something that millions of people will use
- Design happens in the warm, sweaty proximity of people with a lot on their minds.
- People create and join organisations to accomplish greater things more efficiently.
- Organisations become more complex over time.
- Oral culture trumps documented process for how to get things done. You get politics.
- A design project is a series of decisions; getting the right decisions made in a complex organisation is difficult
- You can have more influence than you think, but you need to understand the inner workings.
- It’s inescapable that the nature of the organisation matters to the design process (budgets, approvals, timing, resources and availability all rely on your ability to navigate the organisation).
- The success of a product or service depends on how well it fits into everything else the organisation is doing and how well the organisation can and will support it.
- Your success depends on how well you understand the organisation
- An organization is just a set of individuals and a set of rules, explicit and implicit. Once you understand that environment, you’ll be able to navigate it and create the best possible product.
Put an MBA out of work:
- Organisational research: Determining what drives a business, how all the pieces work together, and its capacity for change is normally done by Business Analysts.
- However, it’s very similar to user research
- Most user-centered design studios will interview client stakeholders (people who will be directly affected by the outcome of the project) as part of gathering requirements
- You may have to do some role playing to gather information (“talk to me about how you interact with x, or with the y team”). If you suggest a particular role-play example and they refuse to go along, that can often be a sign. Find out why they resist.
- In organisational research, the observer effect can be a force for positive change.
- Ask hard questions and they’ll have to come up with answers - leading to reflection
- Asking the same question of different people will reveal crucial differences in motivation and understanding
- Asking a lot of questions can make you sound smart
- The biggest difference is that you’re talking to current stakeholders not potential future customers
Who are stakeholders:
- “those groups without whose support the organization would cease to exist”
- research should include anyone without whose support your project will fail
- Executives, managers, subject matter experts, staff in various roles, investors and board members
Interviewing stakeholders:
- Stakeholder interviews offer a rich source of insights into the collective mind of an organization. They can help you uncover areas of misalignment between a company’s documented strategy and the attitudes and day-to-day decision-making of stakeholders. They can also highlight issues that deserve special consideration due to their strategic importance to a business
- What stakeholder interviews are for: Hearing the same issues considered by people in different roles relative to your work will give you a much more complete perspective and great insights. Some individual interviews can be valuable on their own, and some higher-level insights are only possible in the aggregate
- Stakeholder interviews will help you understand the essential structure of the organization, how your work fits into the organization as a whole, and the approval process for various aspects of your project. They’ll also provide you with some less obvious opportunities to influence your project’s chances of success
- Neutralise politics - find anyone who is deeply opposed to your work
- Better requirements gathering - Define success, how will you measure success, technical requirements
- Understanding organisational priorities - we have a maxim based on repeated observation: the more important a project is to an organization, the more successful it will be
- Tailoring the design process - be careful with a one size fits all design approach - you may have to bring together cross functional teams for example
- Getting buy-in: Avoid “why wasn’t I consulted?” Humans have a need to be consulted and engaged to exercise their knowledge. Inquiry is flattery; inviting people to participate empowers and disarms them. Find out if your assumptions are true, how much you need to share, and the worst-case scenarios for different stakeholders
- Understanding how your work affects the org - your work affects everyone. Executives will have to defend it as part of the overall strategy. Customer service will have to support it. Salespeople will have to sell it. Production staff will have to maintain it. If your product drives change, find out in what department, how, if they can cope etc… Anticipate changes in workflow!
- Understanding workflow: Understand how your work affects operations. Make your recommendations based on that.
Sharpen your tact:
- Prioritise your list of stakeholders. Meet them. You’ll have to meet with some people for political reasons.
- Find out as much as possible about the people you’re interviewing before you do
- Do individual interviews. Do a group interview if there’s a bunch of people with similar benefits and risks
- In a pinch, you can try email. Send only a few key questions
Interview structure
- 30 mins, calm, natural, ask if you don’t understand, have a note taker if possible, send agenda & key questions ahead; the more complex the topic, the more of a heads-up you should give them.
- Introduce yourself and the purpose of the meeting
- Explain to what extent the information will be shared
- Get something in writing that people can speak freely
- Always ask
- Role, tenure, duties & responsibilities, typical day, teams & people they work with & strength of those relationships, how they define success for this project, concerns for the project, the greatest challenges for the project (internal & external), how will their interactions/workflow change after the project?
- What are the most common tasks within the system, what problems come up, what work-arounds exist, any concerns, anyone else I should talk to?
- Hostile witness: Remain calm, breathe, restate the goal, ask if it’s clear, ask a general open-ended question (what they think is most important given that’s the goal).
- Often caused by being under-informed, making a power move, or being under pressure to perform
- Documenting interviews
- Attitude, goal as they describe it, incentive alignment, influence, who they communicate with, participation in project, are you in harmony or conflict?
- You’ve interviewed enough when you know:
- The stakeholders
- Roles, attitudes and perspectives
- Levels of influence, interest and availability
- How they stand to benefit or suffer with the success or failure of your work
- Likelihood that any of them have power or potential to prevent project success
- How workflow has to change
- Resources to build, launch and support
- Business requirements and constraints
- Do you agree on goals and definition of success
- How people outside the team view your project
- What to do with stakeholder analysis:
- Clear statement of what you need to accomplish to make the project a success
- The requirements ensure stakeholders agree on the purpose and limitations of what you’re doing.
- To increase your chance of success, connect what you’re doing to the goals of the business, increase collaboration, and save costs, particularly those associated with changes.
- Business strategy must remain constant for the duration of a project.
- Requirements must be:
- Cohesive. The requirements all refer to the same thing
- Complete. No missing information. No secret requirements that didn’t make it onto the list.
- Consistent. The requirements don’t contradict each other.
- Current. Nothing obsolete.
- Unambiguous. No jargon. No acronyms. No opinions.
- Feasible. Within the realm of possibility on this project
- Concise. Keeping them short and clear will increase the chances that they are read, understood, remembered, and used. Aim for no more than two to three pages.
- What to include:
- Problem statement and assumptions
- Goals
- Success metrics
- Completion criteria
- Scope
- Risks concerns and contingency plans
- A workflow diagram
- Verbatim quotes
Chapter 5: User Research:
- Every delightful and every frustrating artifact that exists, exists because of a series of design decisions.
- Your work must be accessible, novel, fit into other people's lives and environments. But how?
- User Research to identify patterns and develop empathy
- User Research = Ethnography, study of humans in their culture
- We want to learn about target users as people existing in a cultural context
- We want to understand how they behave and why
- This is very different from gathering opinions. It isn’t just surveying or polling. And it’s definitely not focus groups
- Ethnographic design research allows…
- Understand the true needs and priorities of your customers/ readers/target audience/end users
- Understand the context in which your users will interact with what you’re designing
- Replace assumptions about what people need and why with actual insight.
- Create a mental model of how the users see the world
- Create design targets (personas) to represent the needs of the user in all decision-making.
- Hear the users’ language and develop the voice of the product
- Everything in context
- Talk with or observe representative users directly in their environment - reducing the risk of bad assumptions and bias
- Physical environment
- Mental model - internal concept of and associations with a system or situation.
- Habits - How does the user already solve the problem you are trying to solve for them, if indeed they do?
- Social networks - intersection of human relationships & digital products. People are social animals and every interactive system has an interpersonal component.
- Assumptions are insults:
- Make assumptions about your users and you run the risk of being wrong. Wrong assumptions in your product can alienate people, maybe before they hear what you have to offer.
- If you design for yourself or your team, you could be baking discrimination into your product
- Assumptions about age, gender, ethnicity, and sex might lead to barriers you don’t actually intend - barriers that don’t serve your goals or ethics.
Good data from imperfect sources
- The first rule of user research: never ask anyone what they want!
- People want to be liked more than they want to tell the truth about their preferences
- Many people lie, many lack sufficient self knowledge to give the truth
- “most people are poor reporters and predictors of their own preferences and behavior when presented with speculative or counterfactual scenarios in the company of others.”
- Your challenge as a researcher is to figure out how to get the information you need by asking the right questions and observing the right details.
- Ask direct questions and you’ll run into defenses and come up with potentially useless answers
- To design what your target users need, you have to know relevant habits, behaviors, and environments, then turn that knowledge into insights you can act on so you can design with more confidence and less guesswork.
What is ethnography?
- Ethnography is a set of qualitative methods to understand and document the activities and mind-sets of a particular cultural group who are observed going about their ordinary activities in their habitual environment.
- Ethnography = “What do people do and why do they do it?”
- User research = “What do people do and why do they do it? + what are the implications for the success of what I am designing?”
- Mindset: “How should what I’m designing interact with this person?” Try not to be judgmental.
4 D’s of ethnography:
- Deep dive - Get to know a small but sufficient number of representative users very well (in their shoes)
- Daily life - Get into the field where things are messy and unpredictable. Behaviour changes with context & circumstances
- Data analysis - take time to gain real understanding, and get your team involved in creating useful models.
- Drama - narratives help everyone on your team rally around and act on the same understanding of user behavior - Personas will keep you honest. You design for them, not for you or for your boss
Interviewing humans:
- Goal: Learn about everything that might influence how the users might use what you’re creating
- To be a good interviewer you need to be able to shut up
- People want to be liked and want to demonstrate their smarts
- When you’re interviewing someone you know nothing. You’re learning a new subject: that person
- Preparation
- Create an interview guide
- Brief description and goal of study (you can share this with the participant)
- Basic factual / demographic characteristics of the user - provide context
- Ice breaker / warm up questions
- The questions or topics that are the primary focus of the interview
- Gather some background information on the topic and people you’ll be discussing, particularly if the domain is unfamiliar to you.
- Interview structure - 3 boxes, loosely joined
- Introduction: Smile, gratitude, purpose, topic, how info shared, obtain permission, opportunity for them to ask questions
- Body: Open ended questions, probe (tell me more about that), leave silences. List of questions as a checklist not a script. Keep it natural
- Conclusion: Thank them, give them a chance to ask questions
Conducting the interview:
- You are the host and the student
- Build rapport, put them at ease, make them comfortable
- Absorb everything they have to say
- Only say things to redirect, keep things on track
- Breathe.
- Practice active listening
- Keep an ear out for vague answers. Follow up with… why is that? Tell me more about that?
- Avoid talking about yourself
- Checklist: Ethnography Field Guide from Helsinki Design Lab:
- Welcoming atmosphere / make participants feel at ease
- Listen more than you speak
- Take responsibility to accurately convey the thoughts and behaviors of the people you are studying
- Conduct your research in the natural context of the topic you’re studying
- Start each interview with a general description of the goal, but be careful of focusing responses too narrowly
- Encourage participants to share their thoughts and go about their business
- Avoid leading questions and closed yes/no questions. Ask follow-up questions
- Prepare an outline of your interview questions in advance, but don’t be afraid to stray from it
- Whenever possible, snap photos of interesting things and behaviors
- Also note the exact phrases and vocabulary that participants use
- Pay attention after you stop recording. You might get a valuable revelation.
- Offer enough information to set the scope for the conversation, but not so much that you influence the response.
- Tell me about your job
- Walk me through a typical week in your life
- How often are you online?
- What computers or devices do you use?
- When do you use each of them?
- Do you share any of them?
- What do you typically do online?
- What do you typically do on your days off?
- How do you decide what to do?
- Tell me about how your children use the internet
- How do you decide what to do on your days off with your kids?
- What are your particular non-work interests? What do you read online besides the news?
- How frequently do you visit museums in your town? Which ones?
- What prompts you to go?
What to do with the data you collect
- The interview is the basic unit of ethnographic research
- Analyze them together to find themes, including user needs and priorities, behavior patterns, and mental models
- Note specific language and terms you heard
- If doing generative research, look to the needs and behaviors you discover to point out problems that need solving
- Turn the clusters around user types into personas that you can use for the life of the product or service you’re working on
Contextual Inquiry:
- You enter the participant’s actual environment and observe as they go about the specific activities you’re interested in studying.
- See behaviors in action, learn the small things you won’t hear about in an interview, such as unconscious and habitual work-arounds
- Contextual inquiry is useful for developing accurate scenarios, stories of how users interact with potential features, identifying aspects of the environment that affect how a product is used.
- Settle, interview at a time that's least disruptive, observe and summarise (and repeat back to user)
Focus Groups:
- don’t provide insight into behavior or the user’s habitual context
- Focus groups are supposed to be merely the source of ideas that need to be researched
- Focus groups are the antithesis of ethnography.
- Creates an artificial environment that bears no resemblance to the context in which what you’re designing would actually be used
- Conversation is a performance that invites social desirability bias and gets in the way of finding out what people need and how they behave outside of this peculiar group dynamic
- Participants are more likely to proclaim or conceal behaviors for the benefit of those around them.
The talking (and watching) cure
- Accept no substitute for listening to and observing real people who need to do the things you’re designing a thing to help them do
- A few interviews could change everything about how you approach your work. The information you gather will pay dividends as you gather and examine it, grounding your design decisions in real human needs and behaviors. You will develop powerful empathy that can inspire you to find creative, effective solutions
Chapter 6: Competitive Research:
- Competition is everything else anyone has considered or started using that solves the problem you want to solve or helps people avoid it
- If you aren’t working on something that solves a real problem or fills a real need, then your competition is anything else that anyone does with their time and money.
- The hardest competitor is the one your potential customers are using now
- Switching cost
- Inertia - they have to love you more than they hate change
- Competition is for business but also for attention in the minds of your target users.
- See how other people are solving similar problems, identify opportunities to offer something uniquely valuable
- What matters to our customers? How are we better at serving that need than any competitor? How can we show our target customers our product is the superior choice
User research helps with strengths and weaknesses; opportunities and threats come more from competitive research
Competitive Audit:
- Once you’ve identified a set of competitors and a set of brand attributes, conduct an audit to see how you stack up
- Add in competitors mentioned in your interviews or that appear in web searches
- Identify which bits of their work are most relevant and accessible. For each competitor:
- Positioning? Offer?
- Targeting? Vs your target audience
- Key differentiators. What makes them uniquely valuable to their target customers?
- To what extent do they embody your +ve/-ve attributes?
- User needs / wants
- What are they doing well / badly
- Given all that - opportunities to beat, things to adopt or take into consideration
Brand Audit
- Take a good look at your own brand
- Does it set expectations correctly for the whole experience?
- Brand = reputation - signify your identity & reputation to current and potential customers
- Attributes: characteristics to associate with the product (and characteristics to avoid)
- Value proposition: what value you offer, how you communicate this
- Customer perspective: what associations do they have with your brand
- Competition often drives how important a brand is, and your positioning
- Name: unique, unambiguous, easy to spell and say
- Logo: Wordmark, lockup, app icon, favicon.
- List all the contexts users encounter it
- Consider competitors
- Will it ever appear on its own?
Usability testing the competition
- Don’t just test yours, test the competitors
- Helps you understand their strengths and weaknesses directly from the user’s point of view
- Opportunities
- How users conceptualize the tasks
A niche in time:
- The competitive landscape and how what you’re designing fits into it may be the fastest moving target of all research topics. New options are appearing and product categories are collapsing every day. Just taking a user-eye view at how your company, product, and message measure up will give you some competitive advantage. The accurate, user-centered perspective of your comparative strengths and weaknesses will help you focus your message and hone your image.
Chapter 7: Evaluative Research:
- Once you have an appropriate design solution, it’s a good idea to test how well it works for your audience and its intended purpose before you stage a splashy public launch
- Evaluation is assessing the merit of your design. It’s the research you never stop doing
- Depends on the project:
- early stages, evaluation takes the form of heuristic analysis and usability testing
- test an existing site or application before redesigning. If you have access to a competitor’s service or product, you can test that. You can test even the very earliest sketches
- Once live you can look at quantitative data (site analytics) to see how people are actually interacting and if that meets your expectations
- The numbers will tell you what’s going on, and the individual people will help you understand why it’s happening.
Heuristic Analysis:
- Most casual way of evaluating usability
- Score a product against the following 10 principles:
- System status visibility - appropriate feedback
- Match between system and the real world - familiar language and conventions
- User control & freedom - exits, undo, redo
- Consistency and standards - things that appear the same should behave the same
- Error prevention - Help users avoid errors
- Recognition rather than recall - options should be visible. Instructions findable. Don’t make the user remember information
- Flexibility and efficiency of use - Support shortcuts for expert users
- Aesthetic and minimalist design - avoid providing irrelevant information
- Help users recognise and recover from errors - Error messages should be helpful
- Help and documentation - should be usable without documentation, but help should still be available
- A few of the above focus on error prevention and recovery (often neglected in design)
- Heuristic analysis is a quick and cheap way to identify potential issues. Two colleagues can do it in an hour. It’s a good way to deal with obvious issues in early prototypes before bringing in users (a rough scoring sketch follows below).
- Not a substitute for usability testing
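A minimal sketch of how two reviewers might record a heuristic analysis against the ten principles above. The 0-4 severity scale, the data shape, and the reviewer names are assumptions for illustration, not something the book prescribes.

```python
# Illustrative sketch only: one lightweight way two reviewers might record a
# heuristic analysis. The 0-4 severity scale and data shape are assumptions.
from statistics import mean

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognise and recover from errors",
    "Help and documentation",
]

# Each reviewer scores every heuristic: 0 = no issue ... 4 = usability catastrophe.
reviewer_scores = {
    "reviewer_a": {h: 0 for h in HEURISTICS},
    "reviewer_b": {h: 0 for h in HEURISTICS},
}
reviewer_scores["reviewer_a"]["Error prevention"] = 3
reviewer_scores["reviewer_b"]["Error prevention"] = 2
reviewer_scores["reviewer_a"]["Help and documentation"] = 2

def worst_offenders(scores, top_n=3):
    """Average the reviewers' scores and surface the heuristics to discuss first."""
    averaged = {h: mean(r[h] for r in scores.values()) for h in HEURISTICS}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

for heuristic, score in worst_offenders(reviewer_scores):
    print(f"{score:.1f}  {heuristic}")
```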
Usability testing:
- Minimum standard for anything designed to be used by humans
- The easier it is for your customers to switch to an alternative the more important usability is
- The more complex the system, the more work is required to ensure it’s usable
- Usability is an attribute defined by 5 components:
- Learnability - How easy is first time use
- Efficiency - Once learned, how quickly can they perform tasks
- Memorability - when returning after a period, how quickly can they re-establish proficiency
- Errors - How many errors do users make, how severe are those errors, how easily can they recover from those errors?
- Satisfaction - how pleasant is it to use the design?
Cheap tests first, expensive tests later
- Don’t use expensive tests to find out things you can discover with cheap tests
- Start with paper not prototypes
- Start in the office not in the field
- Test with general audience before a specific audience
- Test a competitor’s product before you put pen to paper
- Frequency of testing should depend on how quickly design decisions are being made
- Do it as you go - not just before launch
- Cheap is early, expensive is late, super expensive is with your customers
Preparing for usability testing:
- Build into design / build workflow
- Create testing process and checklist that includes all of the information and equipment you need
- Maintain a database of potential participants
- Decide who is in charge
- You need a …
- Plan, prototype or sketch, 4-8 participants, facilitator, observer, a way to document, a timer or a watch
Usability test plan:
- Revolves around tasks
- Persona and journey led
- Feature scenarios and tasks led
- Provide a scenario to the user before giving them the task (you want to do x)
- Not all tasks are super important - is it a deal breaker?
- The test plan includes:
- What you’ll do, how you’ll conduct the test. Which metrics you’ll capture, the number of participants, which scenarios you’ll use.
- Objectives
- of the test: what are you testing and what state is it in?
- Methodology
- Participants and recruiting
- Procedure.
- Tasks
- Usability goals
- Completion rate (the percentage of tasks the user was able to complete)
- Error-free rate (the percentage of tasks completed without errors or hiccups); a worked example of both metrics follows this list
- Add templates for Human factors etc!
- Recruiting - Single use. Share key goals of your target users.
- Facilitating - Guided journey of imagination, clear tasks (unclear ones can’t be tested), personable and patient, don’t intervene, avoid leading the user or helping them when they get lost
- Users will blame themselves if they can’t get it to work - ask them to describe how they expected the system to work and why they had that expectation.
- Observing and documenting - Recording is good. Second person note taking is good.
- Note the following:
- Participants’ reactions to the task
- How long it takes
- Whether they were successful or failed
- Any terminology that was a stumbling block
- Non-verbal frustration
- Quotes
- Successful / unsuccessful features
- Use a webcam as well as screen / audio recording
- Only use eye tracking with people who can’t verbalise what’s drawing their attention (otherwise it’s an expensive waste of money)
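A tiny worked example of the two usability goals mentioned above (completion rate and error-free rate), using an invented task log; the data shape is an assumption for illustration.

```python
# Hypothetical task log for one usability test: did each participant complete
# the task, and did they hit any errors along the way? The data shape is assumed.
results = [
    {"participant": "P1", "completed": True,  "errors": 0},
    {"participant": "P2", "completed": True,  "errors": 2},
    {"participant": "P3", "completed": False, "errors": 1},
    {"participant": "P4", "completed": True,  "errors": 0},
]

completion_rate = sum(r["completed"] for r in results) / len(results)
error_free_rate = sum(r["completed"] and r["errors"] == 0 for r in results) / len(results)

print(f"Completion rate: {completion_rate:.0%}")   # 75%
print(f"Error-free rate: {error_free_rate:.0%}")   # 50%
```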
How bad and how often?
- Severity:
- High: an issue that prevents the user from completing the task at all
- Moderate: an issue that causes some difficulty, but the user can ultimately complete the task
- Low: a minor problem that doesn’t affect the user’s ability to complete the task.
- Frequency:
- High: 30% or more participants experience the problem
- Moderate: 11–29% of participants experience the problem
- Low: 10% or fewer of participants experience the problem.
- Tiers (a small triage sketch follows this list):
- Tier 1: high-impact problems that often prevent a user from completing a task. If you don’t resolve these you have a high risk to the success of your product
- Tier 2: either moderate problems with low frequency or low problems with moderate frequency
- Tier 3: low-impact problems that affect a small number of users. There is a low risk to not resolving these.
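As a rough illustration, the frequency bands and tier definitions above can be turned into a first-pass triage. The thresholds follow the notes; how to treat combinations the notes don’t spell out is my own assumption.

```python
# Sketch: turn the severity and frequency bands above into a first-pass triage.
# The thresholds follow the notes; the final fallback is an assumption.
def frequency_band(affected: int, participants: int) -> str:
    share = affected / participants
    if share >= 0.30:
        return "high"       # 30% or more of participants hit the problem
    if share > 0.10:
        return "moderate"   # 11-29%
    return "low"            # 10% or fewer

def tier(severity: str, affected: int, participants: int) -> str:
    freq = frequency_band(affected, participants)
    if severity == "high" and freq in ("high", "moderate"):
        return "Tier 1: high risk to the product if unresolved"
    if {severity, freq} == {"moderate", "low"}:   # moderate/low or low/moderate
        return "Tier 2: schedule a fix"
    if severity == "low" and freq == "low":
        return "Tier 3: low risk if left unresolved"
    return "Not covered by the notes' rules: use judgement"

print(tier("high", affected=4, participants=8))        # Tier 1
print(tier("moderate", affected=1, participants=12))   # Tier 2
print(tier("low", affected=1, participants=10))        # Tier 3
```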
Benchmark against your competitors to make a more compelling argument for change!
Chapter 8: Analysis & Models
- Qualitative analysis can seem mysterious.
- You’ll soon get clarity from the analysis, then the concepts and recommendations follow (a small grouping sketch follows this list)
- Closely Review notes
- Look for interesting behaviours, emotions, actions and quotes
- Write observations on a sticky note (write a code that tags it back to the study)
- Group observations
- Watch patterns emerge
- Rearrange notes as you build patterns
- Gap Analysis - Use mental models to identify gaps, or mismatches between what you offer and what the user needs or expects. Design features that close those gaps.
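A minimal sketch of the grouping step: tag each observation with a code that points back to the session, cluster by tag, and see which patterns are accumulating the most evidence. The session codes, notes, and theme tags here are invented.

```python
# Sketch of the affinity step: group coded observations and watch patterns
# emerge. The session codes ("P1", "P2", ...), notes, and tags are invented.
from collections import defaultdict

observations = [
    ("P1", "re-enters password on every visit", "login friction"),
    ("P2", "keeps the password on a sticky note", "login friction"),
    ("P3", "exports data to a spreadsheet to share it", "work-around"),
    ("P1", "emails herself links to find them later", "work-around"),
]

themes = defaultdict(list)
for session, note, tag in observations:
    themes[tag].append(f"[{session}] {note}")   # the code keeps the trail back to the study

# Most heavily evidenced themes first
for tag, notes in sorted(themes.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(f"{tag} ({len(notes)} observations)")
    for note in notes:
        print("  -", note)
```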
CHAPTER 9: Quantitative Research
- The chief aim of quantitative research and analysis is optimisation
- Define good? What is good? How do you know it’s good? What does it mean to be best? What are you optimizing for? How will you know when you’ve reached it?
- Once you can measure your success in numerical terms, you can start tweaking
Preaching to the converted:
- Conversion (sign up, buy now, make a reservation)
Ease into analytics:
- Look for trends and patterns
- Be careful: more page views could mean more engagement (or more frustrated audience)
- A high bounce rate means people aren’t getting what they’re expecting when they come to you from search
- Split test - Control and variation.
- Select a goal
- Create variations
- Choose an appropriate start date
- Run the experiment until you’ve reached a 95% confidence level
- Review the data
- Decide what to do next, stick with control or switch to variation, or run more tests
- You need a goal, you need to know the current conversion rate and how much you want to change it
- Low-traffic sites will take weeks to validate something (a back-of-the-envelope sketch of why follows below)
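To see why low traffic means long tests, here is a back-of-the-envelope sample-size sketch using the standard two-proportion formula at a 95% confidence level (two-sided) with 80% power. The baseline conversion rate, lift, and daily traffic figures are invented.

```python
# Rough sample-size estimate for a split test: how many visitors does each
# variation need before a given lift is detectable at a 95% confidence level
# (two-sided) with 80% power? Baseline rate, lift, and traffic are invented.
from math import sqrt, ceil

def visitors_per_variation(baseline, lift, z_alpha=1.96, z_beta=0.84):
    p1, p2 = baseline, baseline + lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

n = visitors_per_variation(baseline=0.04, lift=0.01)   # detect a 4% -> 5% conversion lift
daily = 250                                             # visitors per variation per day (low traffic)
print(f"{n} visitors per variation, roughly {ceil(n / daily)} days of testing")
```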
- Cautions and considerations
- Testing can be seductive because it seems to promise mathematical certitude and a set-it-and-forget-it level of automation
- The best response to a user interface question is not necessarily a test
- Good for tweaking and knob-twiddling—not a source of high-level strategic guidance
- Can introduce inconsistencies
- Landing pages for new users - why not
- Navigation around the site - caution!
- Local Maximum Problem: Focusing on small positive changes can lead to a culture of incrementalism and risk aversion. How will you ever make a great leap that might have short-term negative effects?
- The best teams are Spock-like. They embrace data while encouraging and inspiring everyone working on a product to look beyond what can be measured to what might be valued.
- You can optimize everything and still fail, because you have to optimize for the right things. That’s where reflection and qualitative approaches come in. By asking why, we can see the opportunity for something better beyond the bounds of the current best. Even math has its limits.