One of the ways we collect feedback is totally automated. Our help widget randomly prompts users to provide an answer to the question, “How likely are you to recommend Edthena to a friend or colleague?”
This question follows the Net Promoter format: along with a rating of 0-10, users can optionally leave us written feedback. Here are the most recent responses, presented in reverse chronological order:
- Technologically empowering blended with immediate, thoughtful, responsive service. (10)
- I love the convenience of having the interns upload their own videos and documents and the ease of being able to provide feedback on their videos. (10)
- Because I find it such a useful tool for communicating with my interns. (10)
- Edthena is a powerful tool for providing feedback on teacher practice. (9)
- We love Edthena! Thanks for everything! (10)
- It’s an ok resource, relatively easy to use. (6)
- It is a very amazing tool and you get help immediately. (10)
- Because it’s a nice place to hold videos. It’s also nice to have the choice of who you want to see the videos. (10)
- None of my friends would have a use for it. (0)
- It’s a great tool to use for having conversations with colleagues and getting feedback. (8)
- Some great aspects/qualities, some others that need some working out. (7)
- Great tool with super-easy-to-use functions that enables coaches to give direct, specific feedback without needing to be in the classroom. (8)
- Easy to upload, share, and comment on video. I like that we can edit comments after posting. I also appreciate space to upload attachments like photos, pdfs, lesson plans, etc. (10)
- Easy to use and does exactly what I want it to do (10)
So what does this written feedback tell us?
Generally, people are really satisfied with what we’re doing. Basically, reading most of these gives me the warm-fuzzy feeling of hugging a giant teddy bear.
Some people potentially have good ideas for us about how to improve the platform (e.g. statements with rating 6 and 7). I make a point to follow up and ask if they have suggestions. The next great feature idea could be one email away.
And some people are confused by the question (e.g. the statement with rating 0). We've found that most people who rate us zero are actually happy with their experience. Unfortunately, this confusion prevents us from accurately calculating our Net Promoter Score.
In the case of the zero, the user is answering honestly: her friends in other majors at her university don’t have a need for a platform that helps teachers get better. But she doesn’t necessarily have any complaints about features, experience, or performance.
In essence, she has answered zero (really bad for Net Promoter Score), but has no actual dissatisfaction with the product.
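To make the effect concrete, NPS is computed as the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6). A quick sketch using the ratings quoted above shows how much a single confused zero drags the number down:

```python
# The 14 ratings from the feedback quoted above, in the order listed
ratings = [10, 10, 10, 9, 10, 6, 10, 10, 0, 8, 7, 8, 10, 10]

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(round(nps(ratings), 1))                          # 50.0
# Exclude the single confused zero, and the score jumps noticeably:
print(round(nps([s for s in ratings if s != 0]), 1))   # 61.5
```

One "wrong" zero moves the score by more than ten points on a sample this small, which is exactly why we can't treat the raw number as gospel.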
That leaves us with strong evidence that Net Promoter Score is not perfect, and this tool can’t be the only way we try to measure satisfaction.
To get a better read on what our users, on average, really think, we've engineered a way to randomly sample our users via a personalized email survey. We track who we've polled previously to ensure we're not emailing some of the same people multiple times (which would otherwise be possible, since the selection is truly randomized).
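The sampling logic itself is simple. Here's a minimal sketch of the idea (the function and variable names are hypothetical, not our actual code): draw randomly from the pool of users who haven't been polled yet, then record the winners so later rounds skip them.

```python
import random

def sample_unpolled(all_users, already_polled, k):
    """Randomly pick up to k users who haven't been surveyed before,
    and record them so future rounds exclude them."""
    eligible = [u for u in all_users if u not in already_polled]
    chosen = random.sample(eligible, min(k, len(eligible)))
    already_polled.update(chosen)
    return chosen

# Hypothetical example: two survey rounds over 100 users
users = [f"user{i}" for i in range(100)]
polled = set()
round1 = sample_unpolled(users, polled, 10)
round2 = sample_unpolled(users, polled, 10)
assert not set(round1) & set(round2)  # no one is asked twice
```

Keeping the "already polled" set persistent between rounds is the whole trick: the draw stays random, but repeats are ruled out by construction.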
The goal is to ensure that we can ask more focused questions of our users around product and usability while saving the Net-Promoter-style questions for our customers—the decision-makers within a program or organization.
We haven't tried it out yet, but I'm hopeful that, in addition to the warm-fuzzy feedback, we'll uncover more of the subtle pain points we can improve for our users.