Surveys Have No Place in Entrepreneurship Classes
Gathering information from customers is the most valuable skill an entrepreneur can practice.
Two common methods for collecting that information are surveys and customer interviews. Customer interviews are, hands down, more valuable for entrepreneurs than surveys because they:
- Provide the depth of insight to validate problem hypotheses
- Provide emotionally driven marketing copy from the customer’s perspective
- Identify high potential marketing channels
- Identify realistic competitors, and competitive advantages
- Provide potential pivot opportunities by eliciting alternative problems to solve if the hypothesized problem is not one customers are seeking a solution to
The qualitative nature of interview-based research gives entrepreneurs the chance to dive deeply into the problems and emotions a potential customer is feeling. It’s those feelings that the entrepreneur will ultimately resolve that will lead to their success.
Surveys in entrepreneurship classes, on the other hand, largely avoid addressing customers’ underlying emotional needs, because few, if any, potential customers will complete a survey about their feelings. Instead, customer surveys in entrepreneurship classes often use leading questions in an attempt to do the impossible – predict future customer behavior:
- Would you use a product that does ______________?
- How often would you use a product that does ________________?
- How much would you pay for a product that does ______________?
The result of these surveys is that students either confirm their bias that there’s high demand for their product without discovering the emotional ways customers describe their problems, or they conclude there isn’t sufficient demand. Either way, they’re left without actionable next steps.
Validation surveys provide no actionable marketing strategy if demand is “confirmed”, and no potential pivots if demand is “invalidated.”
While surveys have the allure of producing statistically significant data, statistically significant data on people’s predictions of their own behavior aren’t worth anything – especially in terms of business model validation. If we really want to answer questions like how much a customer will pay for a product, there are far more effective ways of doing that than surveys – for example, selling pre-orders.
If we believe interviews are a far more powerful tool than surveys for business model validation, the question becomes:
How do we show students interviews are more powerful than surveys?
In our Surveys vs. Interviews Lesson Plan, we provide an experience that will demonstrate to your students just how much more effective interviews are than surveys, by having them complete both and compare the results.
As a part of our Experiential Entrepreneurship Curriculum (ExEC), we recommend that before this lesson, students complete the following lessons:
- Emotionally Intelligent Innovation. Here they learn that customer problems are the most effective place to look for value propositions, and
- Idea Generation. Here they hypothesize the customers for whom they are uniquely suited to solve problems, and they hypothesize the problems they are uniquely suited to solve
With this background, they begin figuring out how to test those hypotheses.
Step 1: Problem Survey
Before class, ask your students to complete a Challenges Survey (find a sample in the lesson plan). Your students will be asked questions about the problems they face and how they have tried solving those problems.
In ExEC, we provide results from thousands of students at the universities using the curriculum so you can highlight how difficult it is to validate hypotheses about problems students face using a survey. What we find, and what your students will likely produce, are:
- Low volume of responses
- Short answers, with little emotional depth
- Some responses that aren’t even comprehensible
Step 2: Surveys vs. Interviews
Start class by discussing with students the pros and cons of asking customers about their problems via surveys versus interviews. Each method of validation has pros and cons, as highlighted below. After the discussion, show this table and highlight any relevant points. Let students know they will now experience these differences firsthand.
| Surveys Pros | Surveys Cons | Customer Interview Pros | Customer Interview Cons |
|---|---|---|---|
| Fast | Difficult to get responses to open-ended questions | Higher quality information | Takes longer to facilitate than surveys |
| Can produce statistically significant results | Don’t provide insights on an emotional level | Significant emotional depth | Results aren’t statistically significant |
| | Difficult to probe/ask follow-up questions | Probe as deeply as necessary by asking follow-up questions | |
| | Often expensive (in time and money) to collect enough quantitative data to be statistically significant | Can explore multiple problems | |
Step 3: Discuss Their Survey Experience
In the lesson plan, we guide you through a conversation with your students about this surveying experience. First, discuss why some students did not complete it, then extend those reasons to the customers from whom your students want to gather information. Next, discuss what it felt like to complete the survey and how much emotional depth their answers provided.
Step 4: Interview Experience
We then guide you through introducing your students to customer interviewing. In groups, students will experience being interviewed, interviewing, and taking notes/observing. In these groups, students will ask and answer the exact same questions from the survey, but in a format that’s much more conducive to problem validation.
Step 5: Compare their Survey vs Interview Experiences
The lesson ends with a discussion, focused on two key points:
- Comparing the quality and depth of information gathered through each method, and
- Comparing the ability to validate problem hypotheses using the information gathered through each method
This is where the magic happens, as you reveal that in both their survey and their interviews, they answered the exact same questions. As Professor Emma Fleck told us after this lesson:
“I genuinely feel that this was a light bulb moment in my class. Students were frustrated and angry about this survey and didn’t see the point. However, 2 days later, when we did this as customer interviews, I was able to illustrate to them how much I could learn from using a different format with customers. They really started to understand as many of them had taken marketing research classes and were convinced that all of their customer learning would come from surveys!! Great exercise.”
This is a powerful lesson for students as they begin their entrepreneurial journey. It engages them in two important methods for gathering information to validate aspects of their business model. But more importantly, it offers two benefits:
- Students feel the benefit of interviewing as a hypothesis validation tool.
- Students practice customer interviewing. They learn how to talk to anyone about their problems, so they can put themselves in a position to solve them.
Below is the complete lesson plan of the Surveys vs. Interviews exercise.
Get the “Surveys vs. Interviews” Lesson Plan
We’ve created a detailed lesson plan for the “Surveys vs. Interviews” exercise to walk you, and your students, through the process, step-by-step.
It’s free for any/all entrepreneurship teachers, so you’re welcome to share it.
Get our Next Free Lesson Plan
We email new experiential entrepreneurship lesson plans regularly.
Subscribe here to get our next lesson plan in your inbox!
Comments on “Surveys Have No Place in Entrepreneurship Classes”
I really liked the post, but you’re missing two major points around surveys vs. interviews. Surveys are impersonal while interviews are not. This is important because the personal nature makes interviews a more uncomfortable situation for most students, many of whom will avoid direct personal contact in favor of text (even over emails, and especially over leaving voicemail messages these days). I make them do interviews before doing surveys, and this is the overarching theme I’ve always heard from students. Sidestepping the issue in the article does a disservice to the faculty who will try the interview approach.
Related to this is the second reason, ease of coverage. If you’re not close to your intended market (you’re in a remote college town and your market is urban, or same college town but you don’t know older people or parents, etc.) it is much easier to find or recruit (or pay for) survey respondents where you need them than find people to interview face-to-face. Yes, Justin has that wonderful hack using Mturk and Google Talk (https://customerdevlabs.com/2012/08/21/using-mturk-to-interview-100-customers-in-4-hours/), but getting students to that level of comfort and chutzpah probably takes more time than any of us have, and again, lots of them are not really comfortable talking to people on phones much less video. In reality, finding interviewees is emotionally more taxing and draining on the student than finding survey respondents.
Be aware of this, be sensitive to this, and think of tweaking the next version (or the posted version) of this exercise to account for these things.
BTW, it is possible to design surveys that are a lot more sensitive and better at hitting follow-up questions, so that part of the table is, I think, a bit of straw-man reasoning. When I was at UMich’s ISR, it was part of the training, but today the info is more widespread. One of the best examples I’ve seen recently is Ryan Levesque’s Ask Method. He makes it look and sound a lot easier than what they taught me 40 years ago, but that is true of so many things, including your stuff.
Thanks for sharing this Jerry. I think you’re bringing up some great points that are shared by a number of professors.
I also love how the crux of both your points is to be sensitive to how daunting and awkward interviews are for students, and I think you’re right on both accounts: students will avoid direct personal contact whenever possible, and it is easier to find survey respondents than interviewees from a remote college town.
At the same time, in my experience, the quality and utility of information gathered through surveys, even ones with well-worded questions, is vastly inferior to the information gathered via customer interviews. The problem lies in two primary factors:
So while I agree with you that students aren’t comfortable doing interviews, I don’t think surveys are the right solution to that problem. Instead, I think we need to do a better job helping students become more comfortable with interviews.
I can’t claim we have the right answer on how to do that, although we’re working on some lessons that we’ll be sharing in the next few months and testing at our USASBE Happy Hour party (hope you’ll be there). But I’d rather see our energy put there than into teaching what I view as an inferior customer discovery tool.
(Baby just woke up so I need to run but…) If you or anyone else has examples of surveys that address the issues above, I’d love to see them and re-evaluate my perspective!
When I have students develop surveys, I follow a best practice I learned at UMichigan’s ISR: pilot testing (Qualtrics and SurveyMonkey both describe the procedures on their sites – or google “pilot test your survey”). My students have to verbally administer the survey in a face-to-face situation with a member of the target audience. I teach them to watch the reactions of the person, their hesitations, the questions they ask for clarification, and their facial expressions. Once the survey is done, they go over it with the respondent to find out what the person thought each question meant. To get them started on question formatting, I actually point them to your customer interview script generator on customerdevlabs.com. Few surveys survive the first pilot, but that’s what piloting is all about. On the emotional element, I agree it is vital, but I really think the leading edge of online marketers have gotten much better at eliciting and parsing emotions, and more to the point, microexpressions of like, dislike, and, more importantly, commitment.
That’s a fascinating idea, usability testing a survey! I’d love to run an experiment and see if there’s any difference in the quality/quantity of responses between a survey administered verbally versus one administered online – if there isn’t, it could be an interesting approach. My assumption is that people will say much more than they’ll type…but we all know how useful assumptions are 😉
ISR and NORC used to do all surveying by mail or door-to-door or by phone. They usually did pilots in person because when they asked a question (even if they were giving the respondent the answer choices) respondents would often ask questions about the choices, or display confusion or some other response, and pilot testing surveyors were trained to pick up on and explore these as a way to get insight on the question and its shortcomings. You’re right about getting more from oral surveys. Think about your example for the Google Talk survey on your own website. Think of all the additional comments the respondent made.