Readings, Tutorials, & Tools
Assignment Instructions + Submission Template
Assignment Instructions on Google Docs (Preferred)
Assignment Instructions in MS Word File (File-Based Fallback):
Assignment Instructions in MS Word File (File-Based Fallback).docx
Assignment Rubric:
Agile Analytics- Peer-Review Rubric (Coursera).docx
Notes on Course vs. Agile Specialization
If You've Taken Courses 1 & 2: These materials are for review and additional depth or practice if
you want it.
If You're Starting Here (Agile Analytics): I think you'll be fine, but if you want to better understand
some topics that span the specialization, they're organized by topic below.
Hypothesis-Driven Development
Tutorial on Alex's Site: Hypothesis-Driven Development. This page covers the basics. It's a good place to start if you want to understand the general nature of the topic and how it relates to the practice of agile.
Agile in General
Tutorial on Alex's Site: Agile- Just the Basics. This page covers…the basics. It's a good place to start if you want to understand the general nature of the topic and how it relates to the practice of agile.
Tutorial on Alex's Site: Your Best Agile User Story. This page covers the user story: how to prepare to write one, how to use it, and how to link it to implementation and testing.
Agile & IT/Enterprise Software
Tutorial on Alex's Site: The Enterprise Software Playbook. This page introduces product design and lean concepts for use in enterprise software.
Design Thinking/Product Design
Tutorial on Alex's Site: Personas. This page introduces the use of personas, one of the primary design thinking tools we use here in the specialization.
Tutorial on Alex's Site: Problem Scenarios/Jobs-to-be-Done. This page introduces the use of PS/JTBD, which complement personas and are the other primary design thinking tool we use here in the specialization.
Tutorial on Alex's Site: Customer Discovery (Persona and Problem Hypothesis). This section of the handbook steps through methods for talking to users and customers, and for doing design research generally.
Lean Startup & Evidence-Based Innovation
Tutorial on Alex's Site: Your Lean Startup. This tutorial will allow you to review the fundamentals
and a few case studies. It also links to the applicable Venture Design templates.
Slideshare deck from David McClure on 'Pirate Metrics'. While we use the AIDAOR framework to
talk about the customer sales/relationship funnel, this is another popular alternative you might want
to explore.
Book on Lean Analytics: Lean Analytics. If you're interested in how to use analytics to drive your
practice of Lean Startup, this is a great resource. It also has a lot of baselines and case studies I find
useful.
Usability Testing and User Experience Design
Tutorial on Alex's Site: Your Usability Test Plan. The usability section of this tutorial shows you
how to anchor a practical test plan with agile user stories.
Book from Donald Norman on Human-Centered Design: The Design of Everyday Things. This
was the first book I read about the practice of design and I still think it's one of the best (possibly the
best). Prof. Norman recently revised it and the new edition is even better.
Book from Laura Klein on Lean UX: UX for Lean Startups. If you'd like to hear more from Laura
about Lean UX, this is the place.
Book from Edward Tufte on Visual Communication: Visual Explanations. If you're new to communicating visually, this is an enlightening introduction.
Book from Mike Monteiro on Working as a Designer: Design is a Job. If you work with designers or you yourself work in that role, this is a great book not only about the practice of design, but specifically about how designers make themselves effective.
Venture Design Process
Tutorial on Alex's Site: Venture Design. This page provides a comprehensive set of resources.
What I would do for a general overview is just read the intro to each of the areas (Personas, Problem
Scenarios, etc.).
Tutorial on Alex's Site: Venture Design Sprints. This page provides an overview of design
sprints, which are the primary topic of Course 2 (Running Product Design Sprints).
A/B Testing
Grade: 100% (to pass: 80% or higher)
Total points: 4
Question 1
Your team is setting up quantitative measurement of user behavior on Google
Analytics. Which of the following should you make sure you have?
1 / 1 point
Cohort analysis
A goal
Segmentation
A/B testing
Correct
That's right. Working backward from a goal (or goals) is a good way to keep your analytics focused.
Question 2
Which of the following is a good reason to execute an A/B test?
1 / 1 point
You just got a new site online
You're already testing one change
The software will do it automatically
You have an idea you think will improve outcomes
Correct
That's right. If you have an important area of your site or app where you'd like to see a metric improve, and you have a strong idea about what might help, an A/B test is a good fit.
Question 3
When might you want to observe user sessions from a real-time analytics
platform like Mouseflow or Hotjar?
1 / 1 point
You want to look at progression through a funnel.
You want to look at cohort analysis.
You've finished qualitative usability testing and you want to see how those behaviors relate to what
you see users doing out in the wild.
You want to look at the performance of an A/B test.
Correct
That's right. These analytics platforms are a good fit for answering that type of question.
Question 4
What's the potential problem with simply asking "How do you like the site?"
with a pop-up?
1 / 1 point
There's very little context to the question.
The pop-up is a bad fit.
Users may not respond.
Users may not respond to a similar pop-up.
Correct
That's right. For example, what if users love how it looks but can't find what they want?
Qualitative and Quantitative Analytics
Latest submission grade: 70%
Question 1
Which of the following is a focal point of early stage, exploratory qualitative
usability testing?
0 / 1 point
Sticking to your test plan, no matter the results
Making only one prototype
Using low fidelity prototypes
Making sure you're collecting structured, comparable data
Incorrect
You shouldn't be worried about collecting super specific, structured data at this point. See the "Qualitative Usability Testing" video to review.
Question 2
Which of the following is most likely to derail an analytics program and make it
irrelevant?
1 / 1 point
Irrelevant segmentation
Lack of clear actionability for the analytics
Lack of alignment with user stories
Lack of a clear goal
Correct
That's right. Working backward from a goal (or goals) is a good way to keep your analytics focused.
Question 3
Why might it be a bad idea to A/B test on a brand new site?
1 / 1 point
You have an idea you think will improve outcomes
You don't have enough historical traffic to establish a reliable baseline
You want to compare two features
The software will do it automatically
Correct
With a brand new site, it's likely you don't have much traffic. With an A/B test, you need a reliable baseline against which to test any new features and understand whether they're working well or not.
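To see why the baseline matters, here's a rough sketch (plain Python, with made-up conversion counts) of a two-proportion z-test: the same observed lift that is inconclusive at low traffic becomes statistically significant once you have enough volume.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test on conversion counts.

    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Same observed lift (4% vs. 5% conversion) at two traffic levels:
_, p_new_site = ab_test_z(8, 200, 10, 200)             # brand new site: little traffic
_, p_established = ab_test_z(800, 20000, 1000, 20000)  # established site

# On the new site the difference is inconclusive (p well above 0.05);
# with the larger baseline the same lift is clearly significant.
```

In practice your A/B testing tool runs this kind of calculation for you; the point is that low traffic means wide error bars, so an identical lift can't be distinguished from noise.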
Question 4
For the user story below, which of the following do you think is the most
relevant UX analytical question?
As Ivan the Inside Salesperson, I want to have a sales pitch tool that helps me
know where to focus the call with a given prospect.
0 / 1 point
What type of tool does Ivan want?
Should this tool appear side-by-side with the prospect's name and title?
Does Ivan care about knowing where to focus from call to call with customers?
Does Ivan use the sales pitch tool?
Incorrect
This is a good question, but interviews designed around persona and problem/job hypotheses are a
better way to address this question. For more on that, see the material from Week 1.
Question 5
What is the potential problem with going directly from user story to metrics in
structuring your analytics?
0 / 1 point
Metrics are hard to get
Lack of alignment with user stories
Lack of actionability of your analytics
Too many lines of code
Incorrect
You may or may not end up with good alignment, but you are skipping a step that would help ensure alignment and actionability. Review the "Pairing Your User Stories with Analytics: Trent the Technician" video.
Question 6
Which of the following would be a signal that you are ready to move forward
with your user story and into designing?
1 / 1 point
Your team is satisfied with a prototype they developed.
Your users interact with a prototype and achieve a goal you had in mind for the interaction.
You've outlined triggers, actions, and rewards.
You've reviewed your personas, jobs-to-be-done, and value propositions.
Correct
That's right. Every user story should have that kind of testability. Without it, it's hard to do good, focused, and testable work.
Question 7
How would you change this user story to make it more testable?
As Ivan the Inside Salesperson, I want a pre-call checklist to review.
1 / 1 point
As Ivan the Inside Salesperson, I want to know where to focus the call with a given prospect so that I
can increase our sales.
As Ivan the Inside Salesperson, I want to have a sales pitch tool.
As Ivan the Inside Salesperson, I want to have a sales pitch tool that helps me with my sales calls.
As Ivan the Inside Salesperson, I want to review relevant notes before I make a call, so I know
where to focus the call with a given prospect.
Correct
That's right. This version has a testable goal.
Question 8
We have this user story: As Trent the Technician, I want to review a pre-job
checklist so I make sure I have what I need to start work at the customer's site.
and this question about it: Does Trent use the checklist?
Which of the following metrics is the best fit for acquiring observations to
answer the question?
1 / 1 point
[Dispatches to a Job Site Viewed]
[Checklists Viewed]
[Checklists Viewed]/[Dispatches to a Job Site Viewed]
[Checklists Viewed]/[Total Logins by Technicians]
Correct
That's right. First of all, this is a ratio, and ratios are generally better since most metrics need some kind of relative context. Second, the denominator seems like a reasonable proxy for "sessions on the site where a technician was dealing with a dispatch and therefore might be interested in a checklist."
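As a sketch, here's how that ratio metric might be computed from a raw event log. The event names and log format below are made up for illustration; a real analytics platform would define and capture these for you.

```python
from collections import Counter

# Hypothetical raw event log: (user_id, event_name) pairs
events = [
    ("tech_01", "dispatch_viewed"),
    ("tech_01", "checklist_viewed"),
    ("tech_02", "dispatch_viewed"),
    ("tech_03", "dispatch_viewed"),
    ("tech_03", "checklist_viewed"),
    ("tech_02", "login"),
]

counts = Counter(name for _, name in events)

# Ratio metric: [Checklists Viewed] / [Dispatches to a Job Site Viewed].
# The denominator scopes the metric to sessions where a checklist was relevant,
# which is what gives the raw count its context.
checklist_rate = counts["checklist_viewed"] / counts["dispatch_viewed"]
```

Note how a denominator of total logins would dilute the metric with sessions where no dispatch (and hence no checklist) was ever in play.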
Question 9
How can real-time analytics from platforms such as Crazy Egg or Mouseflow
help deepen your understanding of your user?
1 / 1 point
You can look at user retention.
You can look at user acquisition.
You can look at cohort analysis.
You can see how your qualitative usability testing compares to what your users are doing in the real
world.
Correct
That's right. Analytics from these platforms are a good fit for answering that type of question.
Question 10
How would you change this question, displayed within a pop-up, to make it
more useful?
How do you like the site?
1 / 1 point
Were you able to find what you were looking for on the site today?
What would you like to get out of your site visit today?
What do you like or dislike about the site?
How is the site working for you?
Correct
That's right. Asking this question would help you collect more information from a user about what they're looking to do on the site and whether the site is working well for them.