What Is a Leading Question?
It’s a question with an opinion.
We break down leading questions in this week’s rant.
- Why they are bad for learning what people really think
- How they cause suspicion about your motives and erode trust
- The key questions to help you identify bad leading questions
- We walk through how to fix a real-life leading question
- Examples from: Client intake questionnaires, corporate employee surveys, community surveys and more…
This is a long one folks. (Just treat it like a podcast and you’ll be fine. 😉)
Full research nerd level this week.
But if you are serious about writing surveys, interviewing customers or community members, or doing any research with people, you need to know a leading question when you see one – and know how to avoid writing them. If you want to get better at spotting them in the wild, this is for you too. You’ll be amazed at how many leading questions are out there.
The Transcript
Karyn: Today’s video is not going to be short. I’m just putting it out there. If you were expecting something in the 10-minute range, that’s not going to be this one, because we have big nerdy subjects to talk about today. We’re going to talk about leading questions, and just to give you a quick description: a leading question is a question with an opinion. It has a position. Leading questions are not inherently evil, but they can be very problematic. They show up everywhere: in surveys, in interviews, in regular conversation, in political polls. I’m here with Maggie, of course, my fellow nerd buddy. Why should our viewers or listeners care about leading questions?
Maggie: What a good question. And like Karyn said, this is not going to be a short video because there are lots of reasons to care about leading questions. Number one, because they give you bad data. Leading questions bias responses and biased responses aren’t usable and they’re not useful. So you’ve just asked a question and because it’s crap, you have to throw away all the responses. So that’s a bummer. Number two, you lose trust when people feel like something is being prescribed to them either in the way that the question is asked or in what the potential responses are.
Maggie: For those of us who are maybe a little bit conspiracy-minded, it tips people off a little bit to think, why are they asking this? What is the purpose behind this? Will this be used in some kind of nefarious way that I don’t consent to? And third, you should care because you get asked leading questions all the time. Karyn just shared some great examples of the kinds of places you might be asked leading questions: political polls, poorly written surveys, Twitter polls, the Internet in general. You want to be able to spot these in the wild.
Karyn: So that is why you should care. Because if you’re doing this and you think you’re getting unbiased data, you’re not. So you’re wasting your time. And it makes you look bad to everyone you’re trying to survey. We want to help you be able to recognize leading questions because they can be sneaky. Not all of them are blatant; some slip in emotional words. We’ll get into the nitty-gritty of that, but we want to help you recognize them, so that when you’re writing your own you can give yourself a checklist and ask, am I leading everyone to a certain answer without meaning to? And so that when you see them in the wild, you recognize them.
Karyn: Who are we and where can you find us? I am Karyn Kelbaugh and you can find me at heykaryn.com. I help small businesses with customer research and I help nonprofits with evaluation, which is another fancy way of doing customer research.
Maggie: I’m Maggie Hodge Kwan and like Karyn, I work with nonprofits and philanthropic foundations around their evaluation efforts and other projects as needed and you can find me at www.creativeclarity.ca.
Karyn: So let’s get into the fun stuff. Leading questions, again, are not inherently evil, especially if you’re just designing a conversation. Let’s think of a few examples where they’re fine or innocent. Say you’re a podcast interviewer and you’re interviewing someone who has a really exciting story to tell. So you ask, what was the most exciting thing about that? You’re assuming it was inherently exciting, and you are leading them to tell you something exciting. But it’s really a guiding question that helps along an interview they’ve already agreed to do. Right? It’s less about “I want your unbiased opinion of what happened.” We’re telling a story together and I’m helping guide you along in that story. That’s one example where it’s fine, it’s harmless. You’re not necessarily doing damage by guiding the questions.
Maggie: Yeah, yeah. You might be in the checkout line at the grocery store and the cashier might say, “Are you enjoying this beautiful weather?” when the sun is shining after days of rain. It’s not a harmful question.
Karyn: So it’s not like every time we should say “Hey! That was a leading question. You should ask me how do I feel about the weather today?” You can’t. You can’t call them all out. They’re not all bad because that cashier is likely not trying to collect unbiased data for a report. They’re just making conversation. So there are plenty of situations in our daily lives where we use leading questions in an innocent way and not in a nefarious and sneaky way. Now granted you can use them with your kids. Like “Are you sure…” I’m trying to think of a good one.
Karyn: Something about brushing teeth and eating vegetables. I’m sure there’s a million of them that I use probably every day. Are you sure that you want to say that? You said you brushed your teeth? And they look at you and say “I’m gonna do it again just to be extra sure I got them all.” So they can be totally innocent so we don’t want to just bad mouth anytime a leading question gets used.
Examples of Leading Questions
Karyn: But let’s go through some examples of leading questions that might not be the best use of a question.
Maggie: And we have some fantastic examples that we’ve put together from our own experience and from all the corners of the Internet. Karyn, do you have a favorite you want to share?
Karyn: We were doing a little googling around to see what really good examples we could find. And there was this one I kind of want to save. Let’s start with the “do you believe in yourself” one first. We tried to get examples from different contexts, like business-to-business relationships, corporate and employee relationships, political polls. I don’t know if we actually pulled one from a political poll, but we have a good political one, and then just some other ones from different contexts that we wanted to pull together.
Karyn: So one that we got, and this was a real one pulled from the audience that people had seen on a coaching intake form or something like that. And it was “Do you believe in yourself enough to invest in your business and success?”
Karyn: I have feelings about that.
Maggie: Lots of feelings.
Karyn: This one kind of shows up as a double-barreled question too, doesn’t it? Because they’re asking, do you believe in yourself? And, are you willing to invest in your business? It’s a solid guilt trip.
Maggie: Yes. Right up front. It’s telling you, you joined this coaching program and if you don’t see the results you want or if I as a coach don’t deliver, I have an automatic out. I’m going to say that it’s because you didn’t believe in yourself enough.
Karyn: My first gut reaction to that question is I do believe in myself enough to invest in my business but not with you because you’re being manipulative, right? That’s a solid example of okay, I have more questions for you because this feels slimy.
Karyn: Another example we got is from a corporate environment, an employee survey trying to gauge how employees feel about their contribution to the company. And the question was “I know how my everyday actions contribute to the company’s overall goals.” And you’re thinking, that doesn’t feel too bad, right? I either know, or I don’t. But who’s going to answer this honestly? This is your boss reading it. Who’s going to put “I have no idea how my actions tie in to the company’s overall goals. Why am I here?” There’s a lot of subtext in this question: “I don’t know my purpose in this company. Why should you keep me?” Oh, we should keep the people who know why they’re here.
Maggie: The whole time you’re crossing your fingers hoping that it’s anonymous and there’s no IP address connection or anything like that.
Karyn: Is this linked back to my desk? And that goes into the whole thing called the Hawthorne Effect, which is that people act differently and respond differently to questions when they know they’re being watched. Your boss, who’s in charge of your employment, is reading this survey, so you’re going to have a natural tendency to people-please and go along with it. You put “yes, I absolutely know how my everyday actions tie into my company’s goals. That’s a ridiculous question. Why would you even ask that?” So there’s that people-pleasing element already there, plus questions that are pushing people in a certain direction, and then they get data. And this is the part that kills me: even if everyone in that company answered that question honestly, that data is still not helpful, because the survey givers can’t know that. I mean, there’s always going to be a few people who put C, C, C, right?
Karyn: But there’s no way for the people administering the survey to really know that that’s the truth, because there’s that effect of people pleasing and they just want to look good for their boss.
Karyn: Do you want to read another example?
Maggie: Yes. I have some examples up in front of me. One that I think is great for its simplicity is a survey question that says “You don’t smoke, do you?”
Karyn: You don’t do that, do you? There’s no judgment in that question at all! Not one bit of judgment. OMG, so judgy.
Maggie: When it would be so easy to redesign a question to find out if people smoke or not by just asking them if they smoke. Yes, no, sometimes, blah blah, blah.
Karyn: Have you smoked in the past 30 days? Yes, No. Ok. Thanks. Super easy!
Maggie: Without judgment.
Karyn: And this is on a real survey?
Maggie: Yes. And of course when you say “you don’t smoke, do you,” you’re judging people and shaming people from the get-go. And if for some horrible reason the survey giver is standing right there asking you? Hello, Hawthorne Effect. Like Karyn mentioned, of course you’re going to be like, nope, no, I don’t smoke. “You don’t drink, do you?” Nope.
Karyn: When there’s that bias and judgment and even bullying in the question, and it’s clear what answer they want you to give, then it might be a leading question. What else do you got?
Karyn: I have another good one. Again, this one you should be able to spot from miles away. It’s a question that says, “I assume you would agree that teachers do a heroic job for our children.”
Karyn: And the answers would be something along the lines of strongly agree, agree, also agree, completely agree. Those are hard because you read that and you’re like, of course I agree. It almost turns into this ridiculous, redundant, self-pleasing kind of question where, well, yeah, teachers are heroic, but why are you even asking?
Maggie: Yes. And that’s exactly one of those times where the red flag goes up. Okay, so who is asking this question, for what purpose? What are they doing with the data? And don’t get me wrong, I truly do believe that teachers are heroic and do a wonderful job of teaching children. But even I, who love teachers, would look at that and be thinking, is this from a lobbyist or a union that wants to say that 100 percent of Americans agree that teachers are the nation’s true heroes? How is this going to be used? And why are they asking it that way?
Karyn: Yeah. When there’s inflammatory language in there, one of the things we were talking about earlier was how sometimes people aren’t actually out to collect data at all. You will get a survey or a phone call that’s literally just trying to sell you something, and they’re covering up their sales pitch by pretending to ask you questions about the topic so that they can tell you about their product. So there’s that. And then another example is political parties or community groups trying to fire up their base by sending out questions full of inflammatory language that they know everyone will agree with. Then they’ve got data to feed back to their base, and they’re almost using it as a hype machine, with all that language in there.
Maggie: And do we have a good example of one of those questions?
Karyn: I have to read this in my best southern lawyer voice. I’m sorry. I love lawyers. I love people. I live in Texas, but I just pictured this in a courtroom setting with somebody from a movie. Please forgive me. See if you can spot all the inflammatory language. There’s emotion, there’s extra adverbs and adjectives that don’t need to be in there. Okay. “What should be done about murderous terrorists who threaten the freedom of good citizens and the safety of our children?”
Karyn: Yes, every single adjective builds and builds.
Karyn: And you’re just like, whoa. That was loaded. It’s just a good example of how they’re leading you to a certain answer. Does this question even need to be asked? Of course we don’t like terrorists. They cause terror, okay. We don’t have to build in this question that’s just leading people in one big direction. So those are some big, outrageous examples to show you the gamut of what’s out there.
How to Spot Leading Questions
Karyn: But the next thing that we wanted to talk to you about was how to identify them. So we’ve just given you some good examples of outrageous questions. But they aren’t all outrageous. We came up with a quick two question filter or a checklist. Maggie, would you like to talk about that?
Maggie: Yes, I would. So, super easy. A couple of things to keep in mind when you’re looking, whether you read a question and feel kind of unsure about it, or just as something to ask as you read any question and get more familiar with this.
Maggie: First thing to ask: Is this question seeding a particular response? With the one Karyn just mentioned about terrorism, the question is clearly built to get you to say yes, terrorists threaten the safety of our nation and our citizens and our children. Like she said, does that question even need to be asked, when we’re probably all in agreement that terrorism is not good? And like Karyn said, it was probably designed to elicit a strong response to feed back to the people who feel that way. So, is it seeding a particular response? That’s the first question.
Maggie: And the second: Is the purpose of this question to gather unbiased data? By saying “I assume you agree that teachers are heroic,” or something along those lines, we know just by looking at that question statement that it is not designed to collect unbiased data. A question that was designed to collect unbiased data would lose the word heroic, would ask for your perspective or feelings, would give lots of options, or would leave an open-ended question where you could share your own feelings and perspectives about the role of teachers.
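If you like to think in code, Maggie’s two-question filter can be roughed out as a toy checker. This is a minimal sketch only: the word list, phrasing patterns, and function name below are illustrative assumptions rather than anything from the video, and no script replaces a human actually asking “is this seeding a response?” and “is it designed to gather unbiased data?”

```python
# Toy "leading-question" checker sketching the two-question filter above.
# The word list and heuristics are illustrative assumptions only.

LOADED_WORDS = {  # judgment / emotional words (assumed examples)
    "wrong", "heroic", "murderous", "terrorists",
    "believe in yourself", "obviously", "of course",
}

TAG_QUESTION_ENDINGS = ("do you?", "don't you?", "wouldn't you agree?")

def flag_leading_question(question: str) -> list[str]:
    """Return reasons this question *might* be seeding a particular response."""
    q = question.lower().strip()
    reasons = [f"loaded wording: '{w}'" for w in LOADED_WORDS if w in q]
    if q.endswith(TAG_QUESTION_ENDINGS):
        reasons.append("tag-question phrasing presumes the answer")
    if q.startswith(("i assume you", "you don't", "are you sure")):
        reasons.append("opens by stating the expected answer")
    return reasons

if __name__ == "__main__":
    for q in [
        "You don't smoke, do you?",
        "I assume you would agree that teachers do a heroic job for our children.",
        "Have you smoked in the past 30 days?",
    ]:
        print(q, "->", flag_leading_question(q) or "no obvious red flags")
```

A checker like this only catches the blatant stuff; the second filter question, whether the purpose is really to gather unbiased data, still has to be answered by a person.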
Karyn: I was thinking the other day about another way of approaching whether or not something is a leading question, especially if you’re writing your own. Most of us, I hope, are not going to have these dramatic goals of really wanting to get one particular response out of people, because we want to actually learn from our clients and our community members.
Karyn: Would the person responding feel comfortable giving any answer to this? If they gave an answer that they knew wasn’t a mainstream response, would they still feel comfortable, based on your question, giving whatever answer they felt moved to give, without having to work up the courage to say, well, actually, no? Do they have to give qualifiers before they answer it? In general, do they feel comfortable giving any answer to that question? So look at it from their perspective, on the answering side, not just the question-asker side.
Maggie: So if we think about that heroic teachers example, if you were given a statement that says, “I assume you would agree that teachers do a heroic job for our children” and your response options were to agree or disagree, you’re kind of bullied into agreeing no matter how you feel. Versus, you know, if you’re asked a non-leading, open question about teachers, where you can say, “The public school teachers I had growing up were wonderful influences on me. I appreciated their energy and enthusiasm. I have some concerns about my children’s teachers because blah, blah, blah. But overall I’m grateful for them and thankful that my kids are happy at school,” or something. You’ve got the full nuance of a person’s own experience rather than a person having reservations but being goaded into saying “I agree” because that’s the road they’ve been led down.
Karyn: Because they’ll look bad if they don’t. Especially if any of these are in a public setting where you have to answer in front of other people. Social pressure is real. I clearly have to give this answer because I don’t want the backlash of not. And even though I have all these other things I would like to say about the topic, there’s no spot for that.
How to Fix a Leading Question: Example Walk-through
Karyn: So how do we improve? We’re going to go ahead and give you another example that is a real example from real surveys that currently exist in the world. And that’s all you need to know. This one is an example of a community survey for youth about alcohol and drug use. These are really common; communities all over the country use these types of surveys to try to see how young people feel about alcohol use and all that kind of stuff.
Karyn: So one example of a question that is absolutely a leading question: “How wrong do you think it is for people your age to drink alcohol?” The response choices for this question are “not wrong at all”, “a little bit wrong”, the neutral option, “wrong” and “very wrong”. See if you can spot the bias in that question and those responses. Do you see anything wrong with that?
Maggie: I can’t even believe there’s a neutral option. I thought you were going to say medium wrong.
Karyn: Maybe there’s just like wrong, very wrong, super wrong.
Maggie: Completely wrong.
Karyn: An undecided. I’m sure it’s undecided, the good old standby of undecided.
Maggie: So right out of the gate, it has a judgment, right? How wrong do you think this is? Well, I guess I know it’s some kind of wrong.
Karyn: And then you’re the kid and this question is being asked of teenagers. And so, I feel for the people who wrote the question. These are people who are trying to honestly get young adults and teenagers to not drink alcohol. Like that is their goal and their mission. So to them it feels completely wrong to ask a question that might give the perception at all that they’re okay with alcohol consumption. I understand where they’re coming from when they wrote this. However, it is a totally biased question because it is putting judgment on the kid immediately.
Karyn: Even if they have a different opinion, the furthest away they can get is “not wrong at all”. That’s the furthest they can go from wrongness.
Karyn: So now we’ve got a nice solid example of a leading question. How do we fix it? How would we go about improving it? I was just going to go through some of our ways of looking at it. And then we’ll go ahead and apply those to the question and we did this ahead of time because it took some thinking.
Maggie: Well, before you even do that, Karyn, can I say I appreciate you talking about the intent of the people who wrote this survey. I just wanted to point out, it goes back to what you were saying: sometimes it really is egregious. Sometimes the person writes a leading question with the intent to get a politicized response, or a response that leans heavily in one direction or the other.
Maggie: In this case, it’s a leading question without that intent, but with some blinders on around potential responses and actual youth perspectives. So not villainizing this question or the people who wrote it, but just noting as you are working through your own surveys, be aware of blind spots in your own thinking.
Karyn: And it goes back to one of our little tests: would a teenager feel comfortable? I mean, they’re teenagers, so they do what they want. But for the ones that might care what you think, are they going to think, well, everyone else is going to put this? They have that extra social pressure, and not that anyone is going to know what their response is, but they might think their teachers are going to read it. So that all plays a part, anonymous or not. Would they feel comfortable giving whatever response, based on the question alone?
Karyn: So, how to improve these. That is an excellent point about the purpose. Most leading questions, I feel, happen because we’re trying to get a certain answer and we just ask the question that way without thinking about it. We don’t stop to think about it; we’re not all malicious people.
Karyn: First, identify the words or phrases that are biased: adverbs, adjectives, one-sided adjectives, like only using “wrong” and all its cousins, wrongish and wrongness, or “heroic,” “super,” all those extra adjectives. Emotional language, words that imply judgment. And make sure the response options are also not leading, because you could have a totally fine question that still has leading responses. Question and responses can work together, or they can work against each other and mess up your question. So going through our question, right off the bat: WRONG. I would circle the word “wrong” because it implies a judgment, and that’s our first problem with that question.
Karyn: Identify the purpose of the question. This is the point of all of these videos that we’re doing.
Maggie: I’m fully with you on the purpose of the question. And I think that people go so sideways in survey writing here that they lose the thread on the purpose or they try and dress it up or make it academic. But what are you actually asking people? What information are you trying to get out of them? And then how can you ask that in a way that makes sense, doesn’t lead, isn’t loaded and still provides you useful information? And yeah, this is a rant for a whole other video. But if you can’t justify why you’re asking a question or what you’re planning to do with the responses, please don’t ask it.
Karyn: I have a budgeting wizard friend who says everything you spend money on needs to earn a spot on the budget. In the same way, every question needs to earn a spot on the survey or in your group of questions, whether it’s a survey or an interview or whatever it is. It needs to have a purpose, and you need to be able to say what you are going to do with that information. So yes, we could soapbox this all day, but that can happen in another video. “It’s nice to know” is not a reason to ask a question, because that just means no one’s ever going to do anything with it and now you’ve wasted everyone’s time. Knowing the purpose of the question can help you. We had to drill down to the purpose of this “how wrong” question. Now, I don’t know what the original intent was, but based on my experience with this question, the main purpose is to know what young people think about alcohol use by people their age.
Karyn: So why don’t we just ask that question? I have my handy-dandy notes here because we wrote them down. We thought, okay, what if we rewrote it so that’s literally the question we ask: What do you think about alcohol use by people your age? It feels like it could go either way. I don’t feel immediately pushed. It’s hard to eliminate every ounce of bias, but you can make it as neutral as possible.
Maggie: Yes. You are leaving room for someone to say, I think it’s perfectly okay, or for someone to say, oh my goodness, that’s awful. Or really anywhere, any of the gradients in between.
Karyn: This next part was tricky, because you could leave it open-ended, but some of these surveys go out to thousands and thousands of kids. That’s a lot of data. So if we were to replace the response options that all said some version of wrong, how would we do that? Another way to improve your question is removing and replacing the biased words and phrases with neutral language to create more balance. We were able to totally eliminate the word “wrong” from the question, which was awesome. We just took it away; we didn’t have to replace it with anything other than “think”. But if you’re giving a range of options, you have to put something there. So it’s about creating balance, and not just “wrong” versus “not wrong”. What are words that are less judgy than “wrong”? There could be a lot of different ways to do this, but the one we came up with was “harmful” on the one end, and for the other end we thought it can’t just be “safe”.
Karyn: What’s an actual antonym, a balance word, for harmful? And the one we came up with was “beneficial”. I don’t know how many times I’ve seen a study talking about the benefits of alcohol. There’s data out there (good or not). For the record, I’m not arguing for the benefits of alcohol at all; I’m just saying this is a way to approach it. There’s stuff out there saying it’s beneficial, and the opposite of harmful would be beneficial. Now, I know the folks doing the survey are probably not going to put “beneficial” on a survey for kids, because then the parents would get mad. But anyhow, we were trying to fix the question.
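As a rough illustration of that balance point, here is the before and after sketched as data, with a naive symmetry check on the response scale. The exact five-point wording for the revised options is our own assumption built from the “harmful” and “beneficial” endpoints described above, not a tested scale.

```python
# Before/after sketch of the youth alcohol question, with a crude check
# that a response scale is balanced. The revised five-point wording is an
# assumption built from the "harmful"/"beneficial" endpoints, not a
# validated scale.

original = {
    "question": "How wrong do you think it is for people your age to drink alcohol?",
    "options": ["not wrong at all", "a little bit wrong", "neutral", "wrong", "very wrong"],
}

revised = {
    "question": "What do you think about alcohol use by people your age?",
    "options": ["very harmful", "somewhat harmful", "neither harmful nor beneficial",
                "somewhat beneficial", "very beneficial"],
}

def is_balanced(options: list[str], negative: str, positive: str) -> bool:
    """Crude symmetry check: same count of negative-only and positive-only labels."""
    neg = sum(negative in o and positive not in o for o in options)
    pos = sum(positive in o and negative not in o for o in options)
    return neg == pos

# Every original label is some flavor of "wrong", so the scale is lopsided.
print(is_balanced(original["options"], "wrong", "right"))        # False
# The rewritten scale has matching "harmful" and "beneficial" ends.
print(is_balanced(revised["options"], "harmful", "beneficial"))  # True
```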
Maggie: When we talked through it, we were looking at what response the survey writers were looking for and what kinds of information they could harness from these potential responses. It’s pretty obvious that if a large contingent of youth say harmful, you know where they’re sitting. But what about kids who might say beneficial and who are thinking, “I’m super stressed from trying to keep up a high GPA and get into a good school and whatever. And so I sneak one of my dad’s beers and drink it to fall asleep at night. And to me it’s beneficial because I’m anxious and it helps me be less anxious and it helps me sleep.” And again, we’re not endorsing this as a public health practice. But we’re just saying that if you had a high number of kids, or an interesting number of respondents, choose that, that would be a spot to do some more digging and figure out: what are the attitudes and behaviors and perspectives there? Because that would inform the work of this entire coalition that wants to see teens and young adults drinking less or not at all.
Karyn: Right. And so it goes back to the purpose of the question. What are they trying to learn? Are they trying to learn the actual beliefs? Or are they trying to learn the forced beliefs within this certain window of allowed beliefs? Well, you’re allowed to think these things; if you’re out here, we don’t want to worry about you; you must respond in these ways. Versus, okay, we’ve got a nice solid group of kids who said harmful, but we’ve got these other kids, say 25 percent, who were on the zero-to-beneficial end. Let’s find out WHY they think it’s beneficial. Let’s dig, let’s do more research.
Karyn: So it all comes down to are you just trying to get data to fill in a report and meet your grant requirements or are you actually trying to learn? Don’t get me started.
Maggie: We can do a whole other video about that!
Karyn: Are you just trying to meet your quota or do you actually want to know what people think about it so that you can learn and help them and learn from them.
Karyn: I could come up with a million scenarios we can dig into that, but we are just trying to talk about the question.
Maggie: I know, I’m going down a rabbit hole of, okay, why else would you say it’s beneficial? Does it have to do with parental attitudes or coping mechanisms?
Karyn: But again, it goes to the integrity of the question and the integrity of the question asker. Think about it from the kids’ perspective. Okay, you’re asking me to answer this? Yeah, yeah. I’ve seen this survey a million times. And then they see one that literally gives them real choices. Now, granted there would still be that suspicion of my teacher’s going to look at this. So there’s still always that Hawthorne effect. It’s just there when they know they’re being watched. But if you gave people the leeway to answer how they really felt, you would learn so much more. I could go on for days.
Karyn: Moving on to other ways to think about improving the question. Consider respondent comfort. This is the one I talked about already: would the respondent feel comfortable providing any answer? I can answer what I actually think, and I don’t feel like I’m going to be penalized for that.
Karyn: And also consider using open-ended questions. We wouldn’t have to put beneficial or harmful if it was open-ended, but there would be a lot more data to look through. But that’s why you hire people.
Karyn: Those were our main things. If you see a question and you think, okay, this is probably a leading question, how do I fix it? Find the bias, find the judgment words, do your best to eliminate or neutralize them, and/or create more balance in the question. If there has to be a range, balance it: do you like this, or do you not like this? Actually, we looked at poll questions and I was pleasantly surprised that they were a lot less inflammatory than I expected. I could tell for a lot of them that the polling people did their work. The reporters… ehhh, we’ll save that for another topic.
Maggie: They must have needed a hook. We could do a whole other video on presenting data with integrity. That’s going to be a lot of videos.
Karyn: That’s a lot of soap boxes. Oh my gosh. Okay. So wrapping this up, I feel like we’ve covered it pretty well. Is there anything that you think we’re missing or anything we need to go over again and again and again and again?
Maggie: I think we’ve provided some great, obvious examples, where the average non-survey-expert would read them and think, oh my goodness, this is strong phrasing. We’ve provided some questions to use as a framework to figure out if you’re being asked a leading question. And we’ve provided some ideas on how to de-lead your questions, how to make them not leading questions. I think we’ve left people in good hands.
Karyn: Excellent. So yeah, the benefit of all this is that if you are trying to get unbiased data and you keep your questions as neutral as possible, then your data will be useful. You’ll be able to feel confident that people gave you real answers and didn’t just try to make you happy. Which is good, because it’s not wasting your time or theirs, and you’ve got quality data you can actually use, for whatever purpose, that gives you real information. So that was our fun rant on leading questions.
Wrap Up
Maggie: Thank you for sticking with us.
Karyn: And I think it was less than an hour! We were fully prepared. We were like, okay, this is going to go forever. Oh my gosh. I was fully ready to just ditch it and go off on like 10 tangents.
Maggie: Me too but I just mentally bookmarked them for future conversations.
Karyn: If you have questions about our questions, feel free to comment anywhere you see this video. We will do our best to answer and help you out. And if you found this content valuable, please like and share it with anyone you think needs to see it. You can send it to somebody who really needs to see it who is on a question writing team or something. You’re like, “hey, they talked about your questions in this video, maybe you should watch this.” You can totally do that and just subtly leave it open on somebody’s computer.
Maggie: And if you are an educated person, you will agree that this needs to be shared. (Maniacal laughter)
Karyn: Are you leading them? Shame on you! You’re going to make them disagree just out of spite. Anyways, thank you so much for watching. If you would like to keep up with the rest of our video series, please go ahead and subscribe to this channel and we will see you in the next video. Thank you.