AI Can Give the Answer — But It Can’t Give the Transformation

AI is accelerating faster than anyone predicted.
It can write. Research. Design. Strategize. Coach you through decisions. Even mirror your thinking back to you without judgment.
So what happens to coaching?
In this powerful Thursday coaching edition, Kellan is joined by Dr. Matt Markel and Lucia Rester to explore the uncomfortable truth: AI can replicate frameworks, tools, and even insight — but it cannot embody lived transformation.
This conversation cuts through the hype and exposes the illusion of competence, the danger of polished but inaccurate output, and the difference between giving answers and facilitating real change.
If you're a coach, consultant, therapist, or leader — this episode isn’t optional.
Key Takeaways:
- The real difference between coaching and consulting
- Why AI gives the illusion of insight
- Hallucinations and the danger of polished misinformation
- The 80/20 rule in AI accuracy
- Why “information” is not the same as transformation
- Embodiment vs. borrowed expertise
- The future of coaching in an AI-driven world
- Consciousness, energetics, and what machines cannot replicate
- The illusion of competence in low-tier coaching
- What will remain when 95% of coaching disappears
🔥 Ready to turn your truth into impact? Join the Dream • Build • Write It Webinar — where bold creators transform ideas into movements.
👉 Reserve your free seat now at dreambuildwriteit.com
Mentioned in this episode:
Visit www.dreambuildwriteit.com
Kellan Fluckiger: [00:00:00] Welcome to the show. Tired of the hype about living the dream? It's time for truth. This is the place for tools, power, and real talk so you can create the life you dream of and deserve: your ultimate life.
Kellan Fluckiger: Subscribe. Share, create. You have infinite power.
Kellan Fluckiger: Hello, this is your Ultimate Life, the podcast I created to help you live a life of purpose, prosperity, and joy by serving and creating with your gifts, your life experience, and all the good stuff that you have inside. This is one of the special Thursday coaching editions.
Kellan Fluckiger: You'll know that already 'cause I've got two guests, which I only do on Thursday. We're talking about AI and its development in the context of coaching. We're not talking Skynet and taking over the world here, [00:01:00] just that. And I've got a couple of good guests here, Matt Markel and Lucia Rester.
Kellan Fluckiger: Welcome to the show, Matt.
Dr. Matt Markel: Hey, Kellan, it's great to be here again. It's great to see you. And Lucia, great to see you as well. It's fun being on a show where we're friends talking about fun topics.
Lucia Rester: You bet,
Kellan Fluckiger: Lucia, welcome to the show.
Lucia Rester: Thank you so much, Kellan. It's really fun, and I'm delighted to be here with one of my favorite persons, Matt, as well as you. What a fun idea.
Kellan Fluckiger: All right, cool. So I've set a timer to kind of keep track of where we're at, and let's just get started. Our topic today is coaching in the context of AI, and I wrote a book about that and all that jazz. We'll explore that as we go along. I guess the first question I have: both of you do some kind of coaching.
Kellan Fluckiger: Everyone does it differently and in different contexts, and that's wonderful. [00:02:00] My first question's easy. Are you using AI in the context of your teaching and coaching? And if you are, how? Matt, you can start.
Dr. Matt Markel: Lucia, go ahead.
Lucia Rester: I use AI every day, for several hours a day actually. I rely on it quite a bit, but I rely on it in a very particular way. I like to call it the 80/20 rule applied to AI.
Lucia Rester: I think the bigger issue here is that as subject matter experts, we can really use AI, and then crosscheck it, to accelerate our tasks. It has amazing capability in certain [00:03:00] ways, but it also still has a lot of weaknesses, and one of them is fallibility over multiple sessions.
Lucia Rester: Generally speaking, you can expect about 80% of it to be somewhat accurate. For me, that's how I use it. I use it as a very faithful 24/7 assistant, but I don't rely on it as the subject matter expert.
Kellan Fluckiger: Matt, I'll get to you in a sec about how you use it. When you say you don't rely on it as the subject matter expert, tell me a little bit more about that.
Lucia Rester: Sure. So, you know, AI does a great job of pulling things together. However, as we all know, one of its tenets is really to please us. And [00:04:00] in an effort to please us, it can make stuff up, it can give erroneous information, and sometimes it just goes off the rails. So when I use it, I use it within my own expertise, so that I always have a crosscheck: is this valid, is this accurate, based on my experience?
Lucia Rester: To me, that 80/20 is also the gap, going back to your topic, Kellan, within which people who are experts in their field can really rely on it. Because at this point, AI isn't something we can depend on the way we can a subject matter expert, somebody who truly understands whatever that field or discipline or topic is. [00:05:00] That's not where it is right now. It may be later, but to me, that's the gap I take solace in, because I think that's where true coaching comes in: the difference between information and transformation, those two things.
Kellan Fluckiger: We'll talk more about that later. Yeah, that's another place to go. Matt, I wanna give you a chance now to jump in and tell me how you use it, and whatever makes sense there.
Dr. Matt Markel: Sure. So I think one distinction we can make here is the difference between coaching and consulting, because we may broadly lump those two together, but they ultimately serve different roles, and I think each one will use AI, just like they use tools, differently depending on the application. When I think of consulting, that is perhaps strategic business advising, where you're imparting certain types of knowledge and experience to someone to achieve a goal.
Dr. Matt Markel: Whereas with coaching, it's more that [00:06:00] you're working with them to help them develop. Lucia used the word transformation just a minute ago: you're helping them transform into some version of themselves that may not be known by either party at the onset. So there's a little bit of a difference, and I think the two will approach the use of AI, just like the use of tools, slightly differently.
Dr. Matt Markel: But specifically for your question, I use AI in a variety of ways. I like the idea of an assistant that can help me with tasks where it can do things either quicker, or better, or without the cognitive drain for me. In most cases, and I've got an exception I'll show you in a second, these are things I could do myself, but I don't want to, because I can go much faster with it.
Dr. Matt Markel: For example, how did we used to do research 50 years ago? It was going to a library, right? [00:07:00] Searching things, maybe doing interviews with people directly, trying to get them on the phone, looking up their names in the phone book, et cetera. A lot of very slow things. Then we moved to the internet, where we can type things into Google and Google will provide information. Most AI is now integrated into the modern search engines, so when you type into Google, you're getting AI helping you there as well.
Dr. Matt Markel: So whether you type it into ChatGPT, or Claude, or Google itself, it's using some aspect of AI in the research. That's one way we use it. But specifically for me, the way I use it the most right now is in writing, where it will help me edit and make changes on things much faster than I can type them.
Dr. Matt Markel: And I'm even a pretty fast typist. Whenever I'm using it, I use it in voice mode, where I'm talking to it. I'm from the Midwest, I can talk pretty fast, [00:08:00] and it can iterate a lot quicker on the things I'm asking it to do. So a lot of things on iteration, whether that's in documents or in content for social media, et cetera. That's how I use it a lot. And I also use it as a graphic designer. Now, this is the area where it is better than me. I can envision what I want from a product a lot easier than I can make it in Canva or PowerPoint or Harvard Graphics or whatever, so I can describe what I want to see and have it iterate on the drawing a lot quicker, and actually with higher quality, than I can, 'cause I'm the worst artist ever. So those are a lot of the ways I'm using it. One more thing that is an interesting aside on that is when you think of a
Dr. Matt Markel: business, whether it's coaching or strategic advising [00:09:00] or whatever, having someone you can bounce ideas off of with zero aspect of them judging you is a very, very useful tool, right? It's hard for us to have that sort of interaction with a human. We will show up differently for a human than we will in front of the computer.
Dr. Matt Markel: That's just a fact. It's not better, it's not worse, but it is different. But you can tell things to the AI, like, hey, I'm struggling with this, or I need help with this, and basically use it as a way to talk yourself through something. It's like a really, really good friend that wouldn't be judging you,
Dr. Matt Markel: but can also, you know, forget exactly what you said at a moment's notice, so they don't think the worst of you. That is a really useful tool. I've used it for myself. For example, during the Christmas break, I was struggling with some things about how to set up one facet of my business,
Dr. Matt Markel: and I had been burned by some [00:10:00] advisors in the past that charged a lot of money and produced very little. And I was trying to figure out, well, how do we really get to the zen of what we're trying to do to help our customers, and is this gonna work, and so forth.
Dr. Matt Markel: So just hours of pouring myself out into that, and really getting the feedback back in a way where no one's gonna judge me. No one's going to think, wow, Matt looks like he's got it all together; I thought Dr. Markel always knew everything. No, he doesn't. He's human like us. No one's gonna judge you with that,
Dr. Matt Markel: and I found that to be very, very therapeutic. Now, we don't do that every day, but that is one area where I personally used it in the business.
Kellan Fluckiger: Cool. So I liked that, and thank you both for sharing. I think you're both right. You said something, Matt, that I wanna hone in on.
Kellan Fluckiger: You said there's a slight difference between coaching and consulting, and I think that difference is as wide as the Grand Canyon. And you know, I've been a consultant. I was a consultant for [00:11:00] 12 years in a very contentious, bunch-of-angry-people, many-billions-of-dollars-at-stake situation.
Kellan Fluckiger: And as a consultant expert, you're paid a lot of money and you frigging better know the answer. You're supposed to come in and know stuff; at least that's how it was in the work that I did. As a coach, knowing the answer isn't part of the skillset. It is the discovery process that makes it valuable.
Dr. Matt Markel: Yeah, just so you know, I agree. There's a substantial difference between them. Sometimes the lines get blurred, especially if you're doing advising or consulting and the CEO has personality issues that you need to address before the company can move on.
Dr. Matt Markel: Then the lines get blurred a little bit.
Kellan Fluckiger: Absolutely. Absolutely.
Dr. Matt Markel: But the one difference I would say is: yes, as a consultant you're [00:12:00] expected to know things, but you're also supposed to deliver results to the client on some aspect of a timetable,
Dr. Matt Markel: and that timetable is pre-negotiated. That's usually not the case with coaches. It might take x months to get through your program, but the results are a little more loosey-goosey in terms of what that's going to give you.
Dr. Matt Markel: Coaches are supposed to know things too, but they're supposed to know how to get you on the path, as opposed to, okay, we need to turn this around, we need to be profitable by this quarter, we need to have free cash flow of whatever metrics. So.
Kellan Fluckiger: Yep. Agreed, a hundred percent.
Kellan Fluckiger: So, great answers. Lucia, or maybe either one of you, I don't care. You described something, Matt, that I've done also. Right now I'm not doing it, but I can talk [00:13:00] really fast, and I use the chat the same way you do, especially when I was writing the book. I'm preparing a concert that I'm doing on the 18th,
Kellan Fluckiger: and I'm gonna invite both of you to it. It'll be a Zoom concert with some music and stories and stuff, and I put the lyrics into the chat, told the stories, talked to it, and asked it to give me some stuff to say in between songs, all that kind of stuff. And I did it in the way you described, sort of just
Kellan Fluckiger: talking to it. What I really appreciate about it is I could say, no, no, no, forget that, do it in this order, and I don't have to stop or redo any of it. And it untangles it all really well. Now, I hear lots of people talking about hallucinations and all these mistakes that it makes,
Kellan Fluckiger: and I don't know if it's because of how I'm using it, in a [00:14:00] particular way, or because I've put literally four and a half million words into the threads that I have with it. I don't find that. I find that it's 95 or 97 to 3 in terms of understanding what I meant and creating for me accurately. So I guess I'm interested in what are the problems that you see, the hallucinations or the difficulty or the craziness?
Kellan Fluckiger: Because when you described its ability to talk you through something, that's the function, or at least one of the functions, of coaching. And so when I say that sort of lower and mid tier of what passed for coaching is gonna be gone, that's an example of what I'm talking about. So tell me some of the problems that you've seen.
Dr. Matt Markel: So I think hallucination gets into areas that are [00:15:00] slightly different than the RAG case you're talking about there. I think what you've done is you've created a retrieval-augmented generation system: you put in a lot of stuff, and your input phrases are tokenized. It does a probabilistic matching of those, and those go into the standard LLM, right? That's probably what most people are doing when they make their own custom thing:
Dr. Matt Markel: they're loading documents and instructions into a custom LLM in Claude or ChatGPT or whatever. Because that is probabilistic in its matching, it will do a matching, but,
Dr. Matt Markel: let's see, how do I not go completely off the rails geeking it up here? There's not a good way to provide what we call confidence intervals, or confidence estimation, on the quality of the answer. You are [00:16:00] assessing that based on, hey, I've told it all these things that I'd like to say, and now it's using those, and that sort of sounds like me,
Dr. Matt Markel: and you're assessing that quality. But when it's more factual, and it's making an assignment where there is more of a right or wrong answer, then it can get it wrong, and it is currently struggling with the ability to accurately say, hey, I am not very sure about this.
Dr. Matt Markel: That is one of the differences between AI and a human coach that I think we will continue to see give the humans an advantage. There are a lot of advantages to AI-based assistants or tools or whatever, but for the humans, I think number one is that a good human will confidently say,
Dr. Matt Markel: you know, I don't know, I need to do a little bit more research on that. Or: I know this, I've had experience with this, but I'm not sure if that's entirely applicable to this situation. [00:17:00] A good coach, or a good consultant, or a good friend, whatever, will provide that level of confidence on their answers to you.
Dr. Matt Markel: We don't get that with the AI, and that is one of the issues. So when we talk about hallucinations, a hallucination is basically providing something that was a probabilistic match, but not really correct at all.
Kellan Fluckiger: So, Lucia, what do you think about that? Where have you seen things in that 20% that bother you, that drag on you, like, ah, whatever? What have you seen?
Lucia Rester: Yeah. I love how Matt articulated that, because there are different kinds of tasks we can be helped with by AI. And let me just say, I do some videos and imagery, and that to me is less problematic, because we all have a [00:18:00] sense of accuracy. If we see an image
Lucia Rester: of a person with six fingers, even if we're not a graphic artist, we'll be able to say, what are you doing with those six fingers? Like, stop. Where it gets tricky is actually at the simpler levels, the text-based things, when we aren't the subject matter experts. So here's a case in point to answer your question, Kellan. I was doing some research for the curriculum for Blue University, and I really wanted to extend some additional resources to students
Lucia Rester: along a particular topic if they were interested in diving deeper. So I asked ChatGPT, and keep in mind I have multiple custom GPTs, so this isn't just me kind of throwing it at a brand-new chat. This is a lot of custom prompts and stuff like that. But I asked it, give [00:19:00] me three different references, and it did.
Lucia Rester: And the links were broken, and the references it cited were not actual references that were accurate on the internet. So then my response was: only give me links that work for the general public and that are not broken. It did it again, still broken. So that's a very specific example that shows
Lucia Rester: what I'm talking about. If I had just assumed it was correct, I could have put those links into a book, or some ancillary materials, and they would've been inaccurate. And I'll just say one more thing. The issue that concerns me, and I love AI, I'm a real proponent of using it, but using it intelligently,
Lucia Rester: [00:20:00] the thing that does concern me is that a lot of people, because it sounds so polished, are seduced by the polished quality of the response, as opposed to the accuracy and quality of the content or the message. To me that's really where the rubber meets the road on this, because if somebody isn't a subject matter expert, they don't know what's accurate, and it all sounds so good.
Lucia Rester: It all sounds so polished.
Kellan Fluckiger: You know, that strikes me as not a small thing. That's a catastrophic failure as far as I'm concerned, for anything to come back with things that simply aren't there or don't work. And, you know, the sky isn't falling, I don't mean that, but it does absolutely illustrate it.
Kellan Fluckiger: And that kind of failure could go across many things in terms of someone using [00:21:00] it. Matt, you look like you're dying to jump in, you're exploding. Finish.
Dr. Matt Markel: Well, no, I think that's because we are still nascent in our ability to understand the relationship between us and the computer on this, us and the AI.
Dr. Matt Markel: Right? You know, Kellan, if you and I are having a conversation and I tell you something about finance, or I tell you something about engineering, the things that I'm either a world-class expert in or pretty darn good in, then you'll pretty much trust me, right? If I tell you something about,
Dr. Matt Markel: I don't know, bobsledding, or about Taylor Swift, and I sound very authoritative in that, you're probably gonna say, Matt is probably not an expert on Taylor Swift. By the way, my wife is; you should ask her. But me? No, I don't know.
Dr. Matt Markel: But I look the same. [00:22:00] Whatever level of good looks I've got, they're the same whether I'm talking about radar or engineering or defense or self-driving cars or any of the things that I'm really good at, or whether I'm talking about Taylor Swift.
Dr. Matt Markel: So we haven't yet learned, because again, the relationship is nascent, how to trust, or how not to trust, the outputs of a conversation we have with AI the way we have with humans.
Kellan Fluckiger: And I wanna make two observations: one's funny, and one's about what you just said. I have a friend who uses a lot of AI tools in creating paintings, and it did create a picture of someone playing a saxophone that had six fingers.
Kellan Fluckiger: Just so that you know. That was funny; when you said that, I thought, oh yeah, I've seen that.
Dr. Matt Markel: It was on the cover of Vogue recently, I believe. I wanna say it was Ariana Grande, and I wanna say it was Vogue. I saw this because some of the LinkedIn groups I'm [00:23:00] in
Dr. Matt Markel: are very much into AI and AI regulation and AI's secondary and tertiary effects on the environment and so forth, and they posted one which had some AI thing with, you know, six fingers. And yesterday, my wife, well, there's a lot going on.
Dr. Matt Markel: We're taping this at the beginning of February, and there's a big thing going on on social right now about how you can make an AI sort of version of yourself, which is kind of like, remember when you used to go to a carnival or something, and there's the guy doing caricatures of you?
Dr. Matt Markel: Yeah, and they put some of the stuff around you: are you into music, are you into sports, or finance, whatever, and they put some of those things in the background. Those are very popular. Well, they did one of my wife and two of her friends at the gym after a workout, and in the original picture one of the friends had her arm around my wife.
Dr. Matt Markel: They all kind of had their arms around each other and were doing the peace sign, and my wife had her hand up there [00:24:00] too, but they were separated by this much. The AI merged them, and it looked like my wife had six fingers: she had a normal hand, and these two fingers popping up as well.
Dr. Matt Markel: We're still seeing things like that. So the six fingers is an easy tell, but there are other things as well that are a little bit less on the just-ridiculous-humor side.
Kellan Fluckiger: So that drives me back to what I said. You gave a very articulate and specific example of people talking about their area of expertise and being believable, and then talking about other things that are not related,
Kellan Fluckiger: and us as the receivers having the good sense, and/or Spidey sense, to question the viability because they're not experts. But we don't do that, because our whole world is full of musicians and sports figures busy opining on world affairs and politics and everything else about which they know exactly jack, and yet are [00:25:00] accepted as authoritative.
Kellan Fluckiger: And yeah, that's what we ought to do. I question the ability of humans to effectively differentiate between areas of expertise and wild-ass opinions.
Dr. Matt Markel: Yeah, there's definitely that. We are always subject to the halo effect, right?
Yeah,
Dr. Matt Markel: that's what I was getting at. We always fall prey to that.
Dr. Matt Markel: Right? But I think in general we have a pretty good sense, and I like how you said it, the Spidey sense of it.
Kellan Fluckiger: We do, we do. And we ought to know it, but I just noticed that, and that's as far as I'm gonna go on social commentary right now. All right, so let's get into a couple of other things,
Kellan Fluckiger: 'cause I want us to cover some other things. In terms of that ability of a coach to create the space needed for a person to change, to see what needs to happen, to see a new possibility and change: [00:26:00] lots of what passes for coaching today is formulaic. In the process of writing the book, I did an analysis of 11 different existing coaching methodologies, and I asked, how good is it?
Kellan Fluckiger: I asked in the research, and this was using the AI tools, how good is this method at achieving its results? And obviously it's gonna go look for whatever it looks for and give an opinion, but it gave me a quite detailed analysis that, as I read it, felt accurate in terms of what I know about each of these systems.
Kellan Fluckiger: Then I asked it how vulnerable each is to AI and its advances. And, you know, AI doesn't even know its own course of development, but watching it over the six months, I saw staggering growth: it doubled, and doubled again. So I made some projections on that, recognizing that it might be 451 degrees next Friday, but I did that, and asked it for [00:27:00] all of it, and the results were frightening.
Kellan Fluckiger: It makes me ask the question: okay, what is left? If AI can write better and faster, research more, give you frameworks, tools, prompts, accountability questions, call you out on pattern deviation, and all those kinds of things it does really well, what do you think is left of the essence of what coaching is, which is causing people to have a new insight and make long-term sustainable behavioral change?
Kellan Fluckiger: What do you think is left?
Lucia Rester: Well, from my point of view, the most important thing...
Kellan Fluckiger: Good. What is it? Talk about it.
Lucia Rester: To me, I look at coaching, and I love us identifying consulting versus coaching, but there's another angle of this, [00:28:00] another layer, that for me belongs in the conversation, which is therapy.
Lucia Rester: A lot of coaches mistake themselves for therapists. I am both a licensed therapist as well as a coach and a consultant. Why I bring that into this conversation is because, from my point of view, there are four different levels of consciousness and levels of transformation:
Lucia Rester: the physical-behavioral, the mental, the emotional, and then the transcendent, which some people would call spiritual, the otherness of it. To me, that fourth level can be simulated, but it cannot be replaced; it takes humans who have a level of consciousness and who can facilitate deeper [00:29:00] levels of transformation.
Lucia Rester: So certainly on the behavioral level, and even the mental level, there may be equivalencies here. I would say, though, that the ability to facilitate, and to create bespoke responses based on what is happening in that moment, the energetics happening in that moment, the relationship between the coach and their client, whether it's one-on-one or group,
Lucia Rester: that transcends. It's a very different level of the game, and I don't see that being replaced. I don't have a concern about this whole topic for myself, or for those who are really at [00:30:00] the top of their game. Where I see it is with people who are maybe at the beginning
Lucia Rester: of their coaching career, who haven't created that level of fluency and facility, and, more importantly, with the general public, who don't even know that that's a possibility. This area of transformation is one that has a lot of misunderstanding in it, and a lot of hype, from my point of view.
Kellan Fluckiger: Yeah, I agree with that. So what do you think's left?
Dr. Matt Markel: So, just a couple of observations before we jump into this one. Many coaches who are coaching today really don't have the expertise to truly be a coach, and I think it's important that we recognize that. Lucia, you were hinting at that towards the end of your answer: if they're early in their journey, most likely they're not qualified.
Dr. Matt Markel: And I've seen many, many people who think that because they had a job for 18 months and [00:31:00] didn't really like it, now they want to be a coach, because they've seen other people be successful at that. They're gonna be a coach, and they're gonna be a good one, and they just need to get a few clients, set up a funnel, do some ads on Meta, and they're gonna be making $10,000 a month in their pajamas.
Dr. Matt Markel: Something along those lines. And
Kellan Fluckiger: Ads on Meta, specifically, with some dude in a beach chair on a laptop and the ad saying, "Make a good life, make a difference for people, and make a living being a life coach." I'm still seeing that nonsense floating around.
Dr. Matt Markel: Yeah, exactly.
Dr. Matt Markel: That feeling we get from those ads, by the way, is why I built the Anti-Entrepreneur concept: no, that's some moron trying to sell you on something so he can hopefully afford a vacation this year. Don't fall prey to it. But let's get back to your [00:32:00] question. Five or six hundred years ago, the concept of books being widely proliferated just didn't exist. Before the advent of the printing press, before affordable paper, before the ability to read was itself widespread, we didn't have books like we have today. These are real books behind me, and I'm a prolific reader; I think both of you are as well. But books didn't solve the problem, right?
Dr. Matt Markel: YouTube videos are very cool, too. If I want to learn how to do something, I look it up. I had to fix a lock on my door recently, so I watched videos and did everything, and I would have fixed that lock entirely from [00:33:00] watching YouTube, except for one little part that was broken and couldn't actually be fixed. But videos don't solve the ultimate problem, because people don't just want the information. They want somebody to do it with them. Full stop. The part that remains is those who can actually, effectively do it with someone, with the lived experience that provides the context, and with the ability to articulate things well, and in a variety of ways, to connect with the client.
Dr. Matt Markel: That will be what remains, because people want someone to go down that journey with them. Someone who's qualified, someone who can guide them, someone they can look to for advice and inspiration, and see in their eyes that yes, they understand me, they get me, and they have my best interests at heart.
Kellan Fluckiger: All right, I just love that. I couldn't have planted a better [00:34:00] stooge in the audience, not a stooge, but a shill, to say all the right things, which both of you have done.
Dr. Matt Markel: Did he just call us a stooge and a shill?
Lucia Rester: What is a shill? Honestly, I think that was the 80/20 issue. A plant in the audience, right? It was the Kellan-bot going a little off.
Kellan Fluckiger: Well, no, I just meant a plant in the audience to say the right thing. The volunteer, right? The magician calls for a volunteer, and exactly the right person steps up.
Kellan Fluckiger: So you said a very leading thing, and that is the conclusion I came to after all that analysis. In the book I also propose a coaching model based specifically on what you've talked about, and I ran it through the same analysis to see how vulnerable it came out to be, in order to validate my own thinking, at least to a degree.
Kellan Fluckiger: It seems to me that what is left is the 5%, if you will, because I said [00:35:00] 95% will be gone. And that prediction is just that. As I watch and experience coaches, I don't see very many who operate in the realm you just described. The way I describe it is that they are the embodiment of what they teach. In other words, without even speaking, it is obvious from their energetic presence that they have the truth they espouse, because they live it: the look in the eyes, the feeling, the essence, the energy. And there are so many people who talk about things they have heard or read will work or help, but they have not lived the truth of those things. To me, that's all that's left; everything else is gonna be gone. I may be wrong, but it feels like that, both with your example of coaching yourself through interaction, and with AI's ability to do everything less than that, the information, the frameworks, the questions, the tools, [00:36:00] ad infinitum, faster and better than anything. The only thing left for us is the most important thing, as you said at the start: the truth of the lived experience. That's fundamentally the conclusion I came to, and then I also proposed a coaching process and methodology that adheres to it. So when you look at coaching and AI, what scares you, if anything?
Dr. Matt Markel: I think it's...
Kellan Fluckiger: Or "scare" may be the wrong word.
Dr. Matt Markel: Do I have any concerns? It's one more way to look impressive without actually being able to back up what you're talking about. Maybe a long time ago, if you [00:37:00] wrote something up on a typewriter and dressed well, you could look like you knew what you were doing without really knowing. AI gives the illusion of competence, the illusion of quality, the illusion of insight, without being able to back it up. So it's one more tool that supports that level of, call it aspiration, for someone who's not very good or is very early in their career. Or deception, if you want to be a little more negative about it. You can definitely sound a lot smarter with AI, and, what's the phrase, there's one born every minute, so there are people you'll be able to make some money off of. But I don't think it's going to change the proportion of things. [00:38:00] Just to clarify: I don't think you're going to find any more unqualified or disingenuous coaches after AI than you did before. It's going to be about the same, because the people who want to do that just have another tool to do it with.
Lucia Rester: That's, I think, very insightful, Matt. I don't have much concern about this because, in a way, it's kind of a good thing. It's kind of like the Hogwarts Sorting Hat, you know?
Kellan Fluckiger: I'm with you. I think it's gonna have a big impact, and I'm grateful.
Lucia Rester: Yeah. I think what's gonna happen is that, more and more, it will come down to the true quality we're speaking of, however that's defined within the discipline. For me, it's the consciousness, the authentic service, the training, the intelligence, the [00:39:00] energy, the skill to be able to see somebody and really hold space for their magnificence, whatever lane they're in. All of those things take a lifetime to develop, and one has to be devoted to that as their path if they're really going to be in that higher level of space. So this kind of instantaneous "I've read three books and now I'm gonna hang up my shingle as a coach"? That's fine. Because the way I see this is that we are always the teacher to those who are coming behind us on their journey, and the student to those who have preceded us. And I don't mean in terms of chronology; I mean in terms of skill and consciousness. So for those people [00:40:00] who are at the very beginning of their journey of working with coaches, that level of coaching may suffice.
Lucia Rester: I would say that Matt's story about working with AI over Christmas worked because Matt brought his consciousness and AI reflected it, as opposed to AI bringing the consciousness and Matt gleaning from it. You know what I mean? It's like this amazing mirror. So people are going to find whatever matches wherever they are in their journey. And for all of us, I think we're at the place in our career and our life where we say, yeah, we're really serving. We want to be part of the 5%, as you said, Kellan, and we want to serve from that level.
Kellan Fluckiger: I love what you said, and I want to ask a question, or [00:41:00] describe something. When I said the whole middle of all of that is gonna be gone, I mean: how do you make a living? I define making a living as a hundred K, just because that's an arbitrary number, but if you can't make a hundred K, you can't make a living, or else you'd better have another job or a partner who works. And I think AI is going to be so available and so good that everybody who did something at less than that level will be replaced by a $49-a-month bot. Tony Robbins just released one for 99 bucks a month, and everybody's doing it. I'm building a Coach-by-Kellan bot myself, full of all the material I have, and I'm going to make it available to clients for in between sessions.
Kellan Fluckiger: But I see three things, and you two can argue with me if you want, that are in the way, or can be in the way, of someone who is not truly devoted to this in the way you described. [00:42:00] The first is what I call the head-in-the-sand problem: people pretending it's not happening. "It's not that bad. I've heard this, believe me, I'm not worried. It's about human connection, and I've already got that." In my mind, a person who talks that way doesn't even get what they're talking about. The second is what I call the ante problem. Imagine a casino full of $10 blackjack tables. You go in to play, all the tables are full, and they're all robots. The only place for you or me to sit down is the high-roller room, where the ante is 10,000 bucks. That's what it feels like to me: the ante to do well, to truly perform in this game, has gone way up, because you can't perform in that mediocre way anymore. The third thing is exactly what you described, Lucia: this is a mountain without a top, and if you're not consistently devoted to that practice [00:43:00] because you decide to be, you're not going to be able to hold that kind of space. I see those three things as the barriers of the transition. People are going to have to make a choice about whether they want to stay in this business. Maybe I'm pessimistic, but I don't think there's going to be a lot of room for that other stuff. As this improves, I think it will be replaced by these apps and bots, and I don't necessarily think that's a bad thing, because it'll get rid of a lot of the nonsense we see.
Dr. Matt Markel: But someone could make a business out of the apps and bots, right? Mm-hmm. You get good digital marketing set up, you build a good funnel, and to make a hundred grand you've got to sell, what, two four-grand packages a month, and then you're there. So that's not a huge bar. This is an aside from the answer, but I think what happens is you find people [00:44:00] who believe that's good enough, because they're not really thinking about building a business. They're basically building themselves a job, and they figure that job can pay a hundred grand a year and cover their bills: "I've got a four-grand product, I need to sell two a month, and I'm there." They're not building a business that's scalable and can operate without them; they've basically bought themselves a job. But the concept of marketing out these little Kellan-bots or Tony Robbins bots or whatever? There may still be a market for that as well.
Dr. Matt Markel: And there's nothing intrinsically wrong with that. One may decide, hey, I don't need a financial planner; I'm going to read a book on finance and do it myself. That's the same [00:45:00] model. You could make the comparison that maybe it's like socialized medicine: pretty cheap, at least in terms of what you pay if you exclude your taxes, but usually better than nothing. That may be adequate for many people, especially to get started and to give them the level they want. So yeah, spending 50 bucks a month or a hundred bucks a month on some product, maybe that's great. A lower price point is an easier path to entry, and an easier path to exit if you don't want to continue. There's nothing necessarily wrong with that, and I don't think it's necessarily going to replace coaches, either, because coaches are going to put advertising and marketing out there and try to fill that gap with their own little bots.
Lucia Rester: Yeah, I love that. I think what you're talking about, Matt, is actually a really positive thing, because people [00:46:00] who will never spend $10,000 or $20,000 or $30,000 on a high-end coach now have the ability to get some level of service, albeit from AI bots. Because I can only assume, Kellan, that you and Tony Robbins and everyone else who's doing this, and I'm doing one myself, are going to cross-check it before you send it out. It's not going to be, "Oh, I just trust that this bot is going to sound like me and be as wise as me." You're going to be cross-checking that thing and making sure it represents you in a fairly accurate way. So how wonderful is it that somebody who could never reach you at the five-figure level, who suddenly found out about you, gets your wisdom in a way they might never otherwise be able to?
Lucia Rester: The other thing is [00:47:00] scalability for the coach. Again, I'll use Blue University. Blue University is really based on the idea that Dan Vega, who we all know is an incredible master of finance and business, simply cannot coach thousands of people in his busy schedule. He doesn't even coach a couple of people right now; he's beyond that. So we've found ways to bring that kind of insight and training to people, and with AI, with good prompting and good engineering, it can actually walk people through a sequence that could really serve them.
Dr. Matt Markel: Mm-hmm. And it may be a form of content that is more consumable by today's market than getting a book, right? Which is the way we would have done it for a long, long time [00:48:00] before: "Hey, I got Dr. Markel's Anti-Entrepreneur book. I got one of the, I don't know, 75 or 250 books Kellan's written." How many have you written now? Come on, 23? I lose track. I don't talk to you for a week and it goes up by seven.
Lucia Rester: Exactly,
Dr. Matt Markel: exactly.
Lucia Rester: I think he just wrote a book while we were on the show.
Dr. Matt Markel: He did. I think he was off typing it, or dictating it to AI. He went on mute for a second.
Dr. Matt Markel: Anyway, how we consume information is always changing. When I write on LinkedIn, that's a different way of writing than when I write for a journal, or for HBR, or for a corporate environment, because those are different ways people consume content. Just as an example: nobody used PowerPoint in 1974. [00:49:00] We did a lot of stuff in 1974 (I was really young at the time), and we accomplished great things. We were sending people to space and all sorts of stuff without using PowerPoint at all, because we communicated differently. So how we communicate and consume information continues to evolve, and a bot may be a way of consuming information that people are more comfortable and efficient with today than reading a book would have been ten years ago.
Kellan Fluckiger: So we've come to the end of our time. I really want to express appreciation. It's been a very textured and rich discussion, and, like always, completely different from the other ones. You know what would be fun? You might want to go back and listen to some of the other Thursday episodes, just to hear some of the other things that have been talked about. I don't know if you do that sort of thing, but you ought to.
Dr. Matt Markel: Just send [00:50:00] me the transcripts so I can put 'em in a bot and have it tell me what y'all talked about.
Lucia Rester: Yes. I think Matt and I are gonna write a book. Do you have 15 spare minutes, Matt? We'll just write it out.
Dr. Matt Markel: I've already got a draft. I'll just email that to you.
Kellan Fluckiger: All right. Anyway, thank you. Is it Lucia or Licia?

Lucia Rester: Lucia.

Kellan Fluckiger: Lucia, yes. I want to say it right. Anyway, thank you, thank you. Thanks for being here with me today and sharing your heart. Matt, thank you for being here too and sharing your heart.
Lucia Rester: It was great. Super, super fun.
Dr. Matt Markel: Have a great day, everybody.
Dr. Matt Markel: Thank you so much.
Kellan Fluckiger: And to you listeners: I want you to take this to heart, because, like the other Thursday episodes, especially if you're a coach, there are lots of ways to think about this and lots of opportunities. Like anything, AI is going to create a mess and a lot of opportunities, and each one of those can be used, if you choose, to create your ultimate life. [00:51:00]
Kellan Fluckiger: Open your heart. Right now, your opportunity for massive growth is right in front of you. Every episode gives you practical tips and practices that will change everything. If you want to know more, go to kellanfluckigermedia.com. If you want more free tools, go to yourultimatelife.ca. Subscribe. Share.