This appendix lists some additional resources that you may find useful or interesting as you dive deeper into experimentation, or into user experience research around your designs.
We all know one of the best ways to learn about an area is to start with some keyword searches online. Here are some keywords to get you started, listed chapter by chapter:
Keywords: data analytics, data science, data visualization, design thinking, experimental design, introduction to statistics, mixed methods, reasoning biases, user experience research
Keywords: acquiescence bias, attitudinal data, causality, correlation, cohorts, data triangulation, demographics, dependent variable, detectable difference, independent variable, local versus global effects, metric, philosophy of science, population, qualitative data, quantitative data, research methods, samples, segments, social desirability bias, statistical significance, test users
Keywords: design opportunity, experimentation framework, hypothesis statement
Keywords: A/B testing, big data, daily active users, errors, experimentation, lab studies, metric of interest, metric sensitivity, metrics, minimal detectable effect (MDE), multiple hypotheses, negative results, novelty effect, pilot studies, positive results, proxy metrics, sample, seasonal bias, secondary metrics, surveys, test cells, thick data, thin data, usability testing
Keywords: collaborative teams, common language, company culture, distributed teams, hiring, learning culture, organizational learning, project reviews, shared vocabulary
Keywords: ethics, institutional review board (IRB), legal, social media experiments
Following are some books we have regularly referred to over the years:
Abelson, Robert P. Statistics as Principled Argument. New York: Psychology Press, 1995.
Gauch, Hugh G. Scientific Method in Practice. Cambridge: Cambridge University Press, 2002.
Hubbard, Douglas W. How to Measure Anything: Finding the Value of Intangibles in Business. 3rd ed. Hoboken: John Wiley & Sons, 2014.
Levy, Jaime. UX Strategy: How to Devise Innovative Digital Products That People Want. Sebastopol: O’Reilly, 2015.
Pearl, Judea. Causality: Models, Reasoning, and Inference. 2nd ed. Cambridge: Cambridge University Press, 2009.
Rubin, Jeffrey, Dana Chisnell, and Jared Spool. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. 2nd ed. Hoboken: John Wiley & Sons, 2008.
Salmon, Merrilee, John Earman, Clark Glymour, James G. Lennox, Peter Machamer, J.E. McGuire, John D. Norton, Wesley C. Salmon, and Kenneth F. Schaffner. Introduction to the Philosophy of Science. Upper Saddle River, NJ: Prentice-Hall, 1992.
Sauro, Jeff, and James R. Lewis. Quantifying the User Experience: Practical Statistics for User Research. 2nd ed. Cambridge, MA: Morgan Kaufmann, 2016.
For a general introduction to user experience research methods, outside of experimentation at scale, a few good resources are:
Kuniavsky, Mike, Elizabeth Goodman, and Andrea Moed. Observing the User Experience: A Practitioner’s Guide to User Research. 2nd ed. Elsevier, 2012.
Portigal, Steve. Interviewing Users: How to Uncover Compelling Insights. Brooklyn, NY: Rosenfeld Media, 2013.
Sharon, Tomer. Validating Product Ideas: Through Lean User Research. Brooklyn, NY: Rosenfeld Media, 2016.
The following text, popular with both students and professionals, introduces much of the philosophy behind, as well as the interdisciplinary skills needed for, interaction design, human–computer interaction, information design, web design, and ubiquitous computing:
Preece, Jenny, Helen Sharp, and Yvonne Rogers. Interaction Design: Beyond Human-Computer Interaction. 4th ed. Hoboken: John Wiley & Sons, 2015.
For an introduction to qualitative methods that complement our approach in this book, a good text is this one:
Patton, Michael Quinn. Qualitative Research & Evaluation Methods: Integrating Theory and Practice. 4th ed. Thousand Oaks, CA: SAGE Publications, 2014.
For a humorous, accessible discussion of p-values and statistics:
Vickers, Andrew J. What is a p-value anyway? 34 Stories to Help You Actually Understand Statistics. New York: Pearson, 2009.
For more on design thinking, here are some excellent resources:
Brown, Tim. “Design Thinking.” Harvard Business Review, June 2008.
Kelley, Tom, and Jonathan Littman. The Ten Faces of Innovation: IDEO’s Strategies for Defeating the Devil’s Advocate and Driving Creativity Throughout Your Organization. New York: Currency/Doubleday, 2005.
Lawson, Bryan. How Designers Think. Oxford, UK: Architectural Press/Elsevier, 2006.
Patnaik, Dev. “Forget Design Thinking and Try Hybrid Thinking,” Fast Company, August 25, 2009.
Rowe, Peter G. Design Thinking. Cambridge, MA: The MIT Press, 1987.
For more cautionary tales in research:
Hargittai, Eszter, and Christian Sandvig, eds. Digital Research Confidential: The Secrets of Studying Behavior Online. Cambridge, MA: The MIT Press, 2015.
There are a number of online resources, in the form of blogs, articles, checklists, and Q&A sites, that can be a great source of additional information:
The UserTesting blog (https://www.usertesting.com/blog/)
The Interaction Design Foundation (https://www.interaction-design.org/)
The Nielsen Norman Group (https://www.nngroup.com/)
The Design Council (http://www.designcouncil.org.uk): The Double Diamond diagram was developed through in-house research at the Design Council in 2005 as a simple graphical way of describing the design process.
Here are some great articles we’ve found about A/B testing; several lean on statistical power and the minimal detectable effect (MDE), which we sketch in code after the list:
“A/B Testing: a checklist,” by Lisa Qian (http://oreil.ly/2n8lwm7)
“How do you build and maintain an A/B testing roadmap?” answer from Ronny Kohavi on Quora (http://bit.ly/2jwWaA7)
“Design Like You’re Right, Test Like You’re Wrong,” by Colin McFarland (http://bit.ly/2lxUYyd)
“Four Reasons to use the One Metric that Matters,” by Benjamin Yoskovitz and Alistair Croll (http://oreil.ly/2lWdgW4)
“A/B Testing - Concept != Execution,” by Erin Weigel (http://bit.ly/2mySSxQ)
“Overlapping Experiment Infrastructure: More, Better, Faster Experimentation,” by Diane Tang, Ashish Agarwal, Deirdre O’Brien, and Mike Meyer (http://bit.ly/2mySVK2)
“Online Controlled Experiments at Large Scale,” by Ron Kohavi, Alex Deng, Brian Frasca, Toby Walker, Ya Xu, and Nils Pohlmann (http://bit.ly/2mvoILz)
“A/B Testing @ Internet Scale,” by Ya Xu (http://bit.ly/2mfXqbv)
“Implications of use of multiple controls in an A/B test,” by Lucille Lu (http://bit.ly/2myQZBs)
“Why most A/B tests give you bullshit results,” by Justin Megahan (http://bit.ly/2mg4iWd)
“Experiments at Airbnb,” by Jan Overgoor (http://bit.ly/1f0kvIE)
“Common Pitfalls in Experimentation,” by Colin McFarland (http://bit.ly/2lQ7Jzr)
“A/B Testing and the Benefits of an Experimentation Culture,” by Wyatt Jenkins (http://bit.ly/2mZoDgY)
“Power, minimal detectable effect and bucket imbalance in A/B Tests,” by Twitter Engineering (http://bit.ly/2mvt0Cu)
“We need to fail forward if we want to succeed,” by Mary Porter (http://bit.ly/2ly4i5n)
“The Morality Of A/B Testing,” by Josh Constine (http://tcrn.ch/2mvtQPX)
“Consumer Subject Review Boards—A Thought Experiment,” by Ryan Calo (http://stanford.io/2md7DEb)
“Experimentation Jargon Buster,” by Rik Higham (http://bit.ly/2n8G3ag)
“The Difference Between ‘Significant’ and ‘Not Significant’ is not Itself Statistically Significant,” by Andrew Gelman and Hal Stern (http://bit.ly/2mpIDYL)
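As promised above, here is a rough sketch, in Python, of the standard normal-approximation formula relating power and the MDE: how many users each test cell needs in order to detect a given absolute lift in a conversion-style metric. The function name, baseline rate, and lift in the example are hypothetical, not taken from any of the articles:

    import math
    from statistics import NormalDist

    def sample_size_per_cell(baseline, mde, alpha=0.05, power=0.80):
        """Approximate users per cell to detect an absolute lift of `mde`."""
        p1, p2 = baseline, baseline + mde
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance level
        z_power = NormalDist().inv_cdf(power)          # desired statistical power
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return math.ceil((z_alpha + z_power) ** 2 * variance / mde ** 2)

    # Hypothetical example: 10% baseline conversion, 1-point absolute lift.
    print(sample_size_per_cell(0.10, 0.01))  # roughly 15,000 users per cell

The point several of the articles stress falls directly out of this formula: the required sample grows with roughly the square of 1/MDE, so halving the effect you want to detect roughly quadruples the users you need.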
Coursera courses (https://www.coursera.org/) may be helpful for thinking more deeply about the issues in these chapters, as the content of this book is largely focused on our own experiences. Good examples include:
Basic Statistics (https://www.coursera.org/learn/basic-statistics/)
Inferential Statistics (http://bit.ly/2nrDQeb)
Improving your statistical inferences (https://www.coursera.org/learn/statistical-inferences/)
User Research and Design (http://bit.ly/2nJcNqR)
Another take on experiment design (https://www.coursera.org/learn/designexperiments/)
There are a number of tools available for companies to implement A/B testing on their own. The following is a short list to help you get started; after the list, we sketch the kind of significance test such tools typically compute:
Adobe Target (https://www.adobe.io/apis/marketingcloud/target.html)
Google Optimize (https://www.google.com/analytics/optimize/)
Hypothesis Kit from Rik Higham (http://www.experimentationhub.com/hypothesis-kit.html)
Optimizely (https://www.optimizely.com/)
p-value calculator from Rik Higham (http://www.experimentationhub.com/p-value.html)
VWO (Visual Website Optimizer) (https://vwo.com/)
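To take some of the mystery out of what such tools report, here is a minimal sketch of a two-proportion z-test, one common way to compute a p-value for the difference in conversion rate between two test cells. This is an illustration under simplifying assumptions, not how any particular product above implements it, and the function name and counts are hypothetical:

    from math import sqrt
    from statistics import NormalDist

    def ab_test_p_value(conv_a, users_a, conv_b, users_b):
        """Two-sided p-value for the difference between two conversion rates."""
        p_a, p_b = conv_a / users_a, conv_b / users_b
        # Pooled rate under the null hypothesis that A and B do not differ.
        p_pool = (conv_a + conv_b) / (users_a + users_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
        z = (p_b - p_a) / se
        return 2 * (1 - NormalDist().cdf(abs(z)))

    # Hypothetical counts: 1,000 users per cell; 100 versus 120 conversions.
    print(ab_test_p_value(100, 1000, 120, 1000))  # about 0.15, not significant at 0.05

Real tools layer a great deal on top of this basic test (sequential monitoring, variance reduction, corrections for multiple hypotheses), which is a good argument for using them rather than rolling your own.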
If you would like to meet others who are involved in user research, various professional societies and social groups exist, including the following:
The Association for Computing Machinery (ACM)’s Special Interest Group on Computer-Human Interaction (SIGCHI)
IxDA: Interaction Design Association
The User Experience Professionals Association (UXPA)