What I Learned From The Podcast Market Fit Survey
Feedback and analysis from the podcast market fit survey.

Hello everyone. I recently did a Podcast Market Fit survey to see what folks thought of the podcast. I have done this sort of thing before with my old podcast, The Story Grid Editor Roundtable.

The idea for a Podcast Market Fit survey came from the email tool Superhuman and the excellent article their CEO, Rahul Vohra, wrote about How Superhuman Built an Engine to Find Product/Market Fit. It’s well worth a read if you’re like me and trying to figure out when to scale products or services.

Rahul learned the method from Sean Ellis, who now has a site dedicated to Product Market Fit.

With that background, here is what I learned from the Podcast Market Fit Survey.

The Only Question That Matters

At the heart of the Podcast Market Fit Survey is the first question -- How would you feel if you could no longer listen to The Entrepreneur Ethos Podcast? At first, this seems a bit odd or off-putting, but it gets at exactly what you want to tap into -- loss aversion.

The three choices for this question are Very Disappointed, Somewhat Disappointed, or Not Disappointed. Framed this way, the question triggers a feeling of loss, and a loss feels a lot stronger than a gain of equal magnitude. What we want is a strong reaction so that we get a clear signal. If we asked the question the other way, as in “How do you feel when you listen to The Entrepreneur Ethos Podcast?”, we’d get a much weaker signal.

The Importance of Sample Size & Type

One of the challenges with podcasting is that you don’t have a direct connection to those who listen -- unlike a product or service you sell to someone directly. That makes it hard to reach your listeners. I got around this by surveying the people who signed up for my newsletter, friends I know listen to the podcast, and my past guests. The total size of that list is 117, which is far fewer than the number of listeners, but it gives me a good starting point for this initial survey.

The thing with statistics is that the results depend heavily on the population you are sampling from and the actual sample size you draw from that population. The best approach to sample size is to use a calculator to determine it, as I did in my first Podcast Market Fit Survey. For surveys like Podcast Market Fit or even Product Market Fit, you don’t need as much precision as some online posts suggest.

As an example, if you use this online sample size calculator with the defaults, then for the 117 folks I sent the survey to, I would have needed 90 responses to be 95% confident that my results had a margin of error of 5%. Frankly, that’s not going to happen.
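If you’re curious what a calculator like that is doing under the hood, here is a minimal Python sketch of the usual formula: the normal approximation for a proportion with a finite population correction. The function name and the worst-case assumption of p = 0.5 are my own choices for illustration, not anything specific to that calculator.

```python
import math

def required_sample_size(population, margin_of_error, z=1.96, p=0.5):
    """Responses needed to estimate a proportion, with finite population correction.

    z = 1.96 corresponds to 95% confidence; p = 0.5 is the most
    conservative (worst-case) guess for the true proportion.
    """
    n_infinite = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n_infinite / (1 + n_infinite / population))

print(required_sample_size(117, 0.05))  # 90 -- the same answer the calculator gives
```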

Instead, it’s important to think about both the confidence level and the margin of error for something like this. In reality, I don’t need high confidence or a low margin of error. What I really want is some indication that I’m on the right track.

So if you replace the 5% margin of error with, say, a 25% margin of error (i.e. the real values will be within plus or minus 25%), then you’ll need only 14 responses. That same calculator page will also let you calculate your margin of error given your population and sample size.
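Going in that other direction is just the same formula rearranged. Here is a sketch of the margin of error for a given population and sample size (again the normal approximation with a finite population correction and worst-case p = 0.5; the function name is mine):

```python
import math

def margin_of_error(population, sample_size, z=1.96, p=0.5):
    """Margin of error for a proportion, with finite population correction.

    z = 1.96 corresponds to 95% confidence; p = 0.5 is the worst case.
    """
    base = z * math.sqrt(p * (1 - p) / sample_size)
    fpc = math.sqrt((population - sample_size) / (population - 1))
    return base * fpc

print(margin_of_error(117, 14))  # ~0.25 -- 14 responses gives roughly a 25% margin
```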

I received 19 responses out of the 117 folks I sent the survey to. That’s a 16.2% response rate, which met my goal, since my minimum was 14 responses (based on the calculation above). If I calculate the margin of error from the number of responses I actually got, it comes out to 20.5%, which means the real “answers” will be within plus or minus 20.5% of my results, with 95% confidence.

It can also go the other way. If I were willing to settle for 80% confidence, with the same sample size from the same population, I’d have a 13.2% margin of error.
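Plugging my actual numbers into the margin_of_error sketch above gets you close to both figures; the small differences come down to the exact correction and rounding the online calculator uses (80% confidence corresponds to z of roughly 1.28).

```python
print(margin_of_error(117, 19))          # ~0.207, close to the 20.5% above
print(margin_of_error(117, 19, z=1.28))  # ~0.135 at 80% confidence, close to the 13.2% above
```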

If you’re trying to predict the outcome of an election or whether a cancer treatment will work, that’s a horrible margin of error. But if you want some feedback on how to make your podcast better, it’s certainly good enough, at least for me.

Of course, the more data you have the better (to a point) but knowing these numbers makes it a lot easier to judge what the data is telling you.

What I Learned

The share of Very Disappointed responses if the podcast went away came in at 26%, which is below the 37% threshold for podcast market fit based on something called optimal stopping theory. I won’t go into the details here, other than to say it has to do with when to stop exploring and start “exploiting” -- using the information you already have to make decisions.
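(For what it’s worth, the 37% comes from the classic optimal stopping rule of thumb: spend the first 1/e of your search exploring, then commit to the next option that beats everything you’ve seen so far, and 1/e works out to roughly 37%.)

```python
import math

# The optimal stopping "37% rule": explore the first 1/e of your options,
# then commit to the next one that beats everything seen so far.
print(1 / math.e)  # ~0.368, i.e. roughly 37%
```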

While I’m a bit disheartened by that result, I’m pretty happy that I scored 8.6 out of 10 on the likelihood of listeners sharing the show with a friend. That’s a good sign, along with the 63.2% of Somewhat Disappointed responses. It tells me that while I’m not there yet, I have room to grow.

Some other important things I learned from the survey include:

  1. Provide some ways to take action or apply what you learned
  2. Have more informative titles
  3. Tell people ahead of time who the guests are so they can ask questions
  4. Ask harder questions
  5. Get better guests
  6. Create a learning group

I did a full write-up on the results over at The Story Funnel if you want to check that out as well.

What I’m Going to Do With What I Learned

So now that I have some feedback on the podcast, what’s next? I have to say that it can be hard to hear honest feedback, especially on something you have worked so hard on. I must admit it took a few days to absorb it all and then figure out what to do.

The good news is that I have some great ideas to try out. Some of them I have already started on, like sharing actionable insights from each interview and asking every guest the same first and last question.

As for the other suggestions, I’ll be looking into them as well, since some will require a little more effort and coordination.

Thanks for Listening & Helping the Show Get Better

Thanks to everyone who took the survey. It was great to hear what you had to say. I hope you continue to listen to the show and follow along as I make it better and better.