Generic interview advice tells you to listen and avoid pitching. That is necessary but not enough. For a SaaS, the interview has to diagnose four specific things: real problem, current solution, willingness to pay, and frequency. Miss any one and you will build the wrong product or price it wrong.
The standard customer interview framework — popularized by The Mom Test and a hundred Lean Startup posts — tells you to focus on past behavior, avoid pitching, ask open-ended questions, and listen more than you talk. That is all true, but it is not enough. It is the floor, not the ceiling.
For a SaaS founder, generic interview advice produces conversations that confirm a vague problem exists without telling you whether your specific product will work as a subscription business. You walk away with thirty pages of notes and no clearer picture of what to build, how to price it, or whether anyone will keep paying month after month.
The SaaS-specific problem is this: a subscription product survives or dies on four numbers — activation rate, monthly conversion, retention curve, and price point. Each one is set in the product before launch. The customer interview is the only chance to set them correctly. Generic questions do not produce data on these specific numbers.
You need an interview structured around the four signals that decide whether a subscription business will work. Ask the right questions, and you exit with the numbers you need to set product scope and pricing. Ask generic questions, and you exit feeling good about the conversations and no closer to a decision.
Every SaaS customer interview should produce data on these four signals. Each maps to a specific decision you have to make before launch.
You assumed a problem when you started thinking about this product. The interview has to test whether the problem you imagined matches the problem the customer actually has. They are almost always different in scope, frequency, or framing.
The decision this informs: what your product actually does. If the real problem is narrower than what you planned to build, you scope down. If broader, you may have a bigger opportunity than you thought.
The customer is already solving the problem somehow. Maybe with a spreadsheet, maybe with a competitor product, maybe by accepting the pain and doing nothing. The current solution is your real competition — not the named SaaS competitors you found on Google.
The decision this informs: positioning and minimum viable scope. Your product has to be meaningfully better than what they do today, not better than what some Y Combinator startup ships. Often the current solution is "nothing" or "a spreadsheet" — which is easier to beat than you think, but only if you build for that displacement specifically.
Existence of a problem is not the same as willingness to pay. People have problems they would not pay to fix. The interview has to test whether this specific person would pay $X per month for this fix, where X is a real number you have in mind, not a placeholder.
The decision this informs: pricing and whether the business model works. If five out of five interviewees flinch at $49 per month for a problem they spend $200 per month working around, you have a pricing problem or a positioning problem. The SaaS pricing guide covers what to do with that data.
This is the signal generic interview advice misses entirely. How often does the customer hit this problem? Daily, weekly, monthly, quarterly, once a year? The answer predicts your retention curve more than any other variable.
The decision this informs: whether subscription is even the right model. A problem someone hits once per quarter does not justify a monthly subscription — they will cancel and resubscribe, killing your retention math. Daily or weekly frequency is the signal that subscription works. Monthly is borderline. Less frequent than that, you may need a different business model.
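The cancel-and-resubscribe problem is easy to see with numbers. Here is a rough sketch with made-up figures (the $49 price, the $60 acquisition cost, and the subscription patterns are all hypothetical), comparing a year of revenue from a daily-frequency subscriber who stays retained against a quarterly-frequency subscriber who subscribes, cancels, and comes back each quarter:

```python
# Rough sketch with hypothetical numbers: why a quarterly-frequency
# problem breaks monthly-subscription math.
PRICE = 49   # hypothetical monthly price
CAC = 60     # hypothetical cost to acquire (or re-acquire) a subscriber

def yearly_profit(months_subscribed: int, acquisitions: int) -> int:
    """Subscription revenue minus acquisition costs over one year."""
    return months_subscribed * PRICE - acquisitions * CAC

# Daily/weekly problem: subscriber stays all 12 months, acquired once.
retained = yearly_profit(months_subscribed=12, acquisitions=1)

# Quarterly problem: subscribes for one month, cancels, comes back next
# quarter -- 4 paid months, 4 separate re-acquisitions.
churn_cycle = yearly_profit(months_subscribed=4, acquisitions=4)

print(retained)     # 528
print(churn_cycle)  # -44: the quarterly subscriber loses money
```

The exact numbers do not matter; the shape does. The cancel-resubscribe customer pays for a fraction of the year while costing you a full acquisition each time they return.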
The interview pool is more important than the interview technique. Talking to the wrong people will produce confident, useless data. Five interviews with the right people beat fifty with the wrong ones.
Interview the specific person who would pay. Not "people in the industry." Not "potential users." The decision-maker who would pay your invoice from their company budget, or the consumer who would put their personal card on the line. If your product is for agency owners, do not interview agency employees — they cannot say yes to a tool. They can only forward you to someone who can.
Skip people who are too friendly. Your friends, your former colleagues who like you, people who already know you are working on this. They will tell you the idea is great. They will not pay. Interviews with friendly audiences are worse than no interviews because they generate false confidence.
Interview strangers if you can. Strangers have no incentive to be polite. They will tell you the truth — that they would not actually pay, that the problem is not really their problem, that they have already tried something similar and quit. Finding strangers is hard, especially without an audience. The getting customers guide covers where to find them in specific communities.
Five interviews is a minimum, not a maximum. Patterns become visible at five. Confidence becomes possible at ten. Beyond ten you are usually procrastinating the decision. If you have done fifteen interviews and still cannot decide, the problem is not data — it is willingness to commit.
These are the questions I recommend asking, organized by the signal they diagnose. None of them are clever. All of them are specific. For the real problem: "Tell me about the last time this happened. Walk me through it." For the current solution: "What do you do about it today?" For willingness to pay: "Would you pay $X per month to make this go away?" (with a real number, not a placeholder). For frequency: "How often does this come up: daily, weekly, monthly?" And last, the ask to act: "Can I send you the pre-order link right now?"
The last question is the most important one. Verbal "I would pay" is not data. Acting on the verbal claim — clicking through to a pre-order page within minutes of the conversation — is data.
Customers will lie to you. Not maliciously — honesty with a stranger about whether they would buy your product is uncomfortable, and most people resolve that discomfort by being supportive. The interview's job is to detect the lies and discount them.
Signals of polite enthusiasm: phrases like "that sounds cool," "I could see myself using that," "definitely something the industry needs." These mean nothing. Discount them entirely. They are the customer being kind, not the customer telling you they will pay.
Signals of real demand: the customer leans forward and tells you a specific story about the problem without prompting. They name a specific tool they currently use and rant about its shortcomings. They ask when the product will be ready. They give you their email and credit card before you ask. They forward your conversation to a colleague.
The activation test: at the end of the conversation, ask them to do something — pre-order, sign up for a waitlist with a card, schedule a follow-up demo. If they agree but then do not follow through within 48 hours, treat that as a no. Real demand acts. Polite demand schedules and disappears.
Interviews can become a procrastination tool. Founders run them indefinitely because each one feels productive without forcing the decision they are avoiding. The decision is always: build or kill. Interviews exist to inform that decision, not replace it.
Stop interviewing when one of three conditions is met. First, you have hit five strong signals across all four diagnostic signals — build, with the scope and pricing the interviews suggested. Second, you have hit five clear no signals on willingness to pay — kill or pivot. Third, you have done ten interviews and the data is mixed — make a decision based on the strongest pattern and accept the risk.
The trap is doing twenty interviews hoping for clarity that will not arrive. Twenty interviews of mixed data is still mixed data. More interviews of the same audience asking the same questions will not produce a different signal. If you are at fifteen and still unsure, the question itself is wrong — not the sample size.
If you are deep into interviews and stuck on what to do with the data, the validate a SaaS idea guide covers the kill-versus-continue thresholds I use, and the AI business coach for solopreneurs page covers the diagnostics Marcus can run on your interview notes.
Paste the five most useful answers from your interviews. Marcus will tell you which of the four signals you have proven and which you still need data on.
Talk to Marcus free →