Testing and Investing in Customer Education

Episode 001

In This Episode

  1. Testing Customer Education and Strategies for Reducing Financial Risk
  2. Consistency Over Quality - a Discussion With the CEO of Unscripted
  3. Formats for Customer Education that Organizations Might Not Think Of
  4. Misconceptions of Customer Education Solutions and Technologies

Video Transcript

Let's go ahead and kick it off. So on the agenda today, everybody, I was thinking for the first topic, we could at least talk about investing and testing customer education. 

I think a lot of our clients are, when they first reach out to our team, they're on the fence, they're exploring, they're dipping their toe into this idea of training people and maybe that generates revenue or maybe that's marketing's job. Not really sure where it lives, but they know they need it for some reason, or they think they need it for some reason. 

So let's dive in on how to test, and then maybe some key indicators on when to double down and invest in customer education.

All these questions are formulating in my head. So when a customer first comes to us and says they want to talk about customer education, how do they scale that? 

  • What do we do when we talk about testing? 
  • Is it the size of the company? 
  • Is it the content that they have? 
  • Is it the content they don't have? 
  • Is it the subject matter?
  • What would be the first question that you would ask a client if they have content, if they don't have content?

So when thinking about customer education, I would say, we have our illustrations and our whiteboards and things that we've worked through with clients. But I would say the first thing that we have to do is identify a really core strategic goal. 

And you'll hear that in other forms. Like, why are we doing these things? Why does this matter? But we have to understand the why before any of the what or how really matters. 

And so I think, really, as we start to work with folks on testing, kind of creating a hypothesis, a little mini experiment of customer education, we have to understand why we're doing that activity in the first place, or why we think having an on-demand video series is great, or why we need a learning management system, or why we need product-based certifications.

Like, we need to get to the core of what that is. And normally, that sounds like, 

"Well, our competitors are doing these things."

"We're seeing other people in our space do these things."

"We want to be the leading provider of these things."

And so there's a lot of brand. There's a lot of authority. There's a lot of competitive differentiation that's motivating it.

But underneath that, when you really talk to the people who are like putting feet to it, there's [reasoning] like:

“We need to reduce our support overhead.”

“We need to help customers become more familiar with a particular area of our product,” or

“We want to help users understand why our solution is drastically different than our competitors, even though maybe things look the same or the language is the same. Like how the architecture is really different, why it matters, and all those things…” 

And so if we get past the kind of VP, C-suite, “we need brand adoption and authority” stuff and we get underneath the hood, there are really tactical things that we could move the needle on.

So it's like, which of those is important? Why is that important? Okay, now let's talk about all the fun stuff and the content and things like that.

You said something so interesting about the VP and the C-suite identifying the why. And I'm curious about the other members of the company having sort of other levels, other touch points. Really customer-centered folks are probably going to have different opinions than more administrative people, or manufacturing. I bet everybody just has a different viewpoint there. I don't know if that's a question. I think it's a comment that I would love for you to continue talking about.

You're absolutely right though, Emma. 

There is, there's like top-level kind of motivations, right? And this happens like in every department, right? 

So if this is marketing, this could be like MQLs, like qualified leads generated by marketing. And like, we need “more leads” in our pipeline or whatever that is. Or if it's sales, it could be, we need more closes in a certain shorter amount of time or “velocity,” what all those things are. 

But in this training space, it's, “we need these certifications and that's going to make us industry recognized.”

And largely that could be true. But folks who just kind of launched the certification for the certification’s sake miss the opportunity to go really deep with their customers and become irreplaceable. I think customer education can make you irreplaceable as a brand because you solve problems that customers didn't know they had, you educate them on things before they knew they needed them, and you kind of bridge gaps to help them advance their careers.

And so in some of our clients' domains, going really deep in that customer relationship has gotten their clients promoted, has gotten them advancements in all sorts of areas of life, featured as experts and showcased as thought leaders and all sorts of things. And it all started with, how do we help our customers solve their daily problems with our product? You know? And it just blossomed from there over time. So I don't know if I expanded correctly or not, but I hope that was helpful.

Question: Do you ever talk companies out of customer education? I mean, should every company invest in customer education? And do you ever talk to somebody where you say maybe they're just not ready for it, but they might be a year or two or five down the road, but they're just not at that point where they can start doing it?

Wow. That's an interesting question. Okay. So and that maybe like leads into a little bit of what we're talking about with the format of customer education. 

Because I think everyone should be doing it, but the complexity or the sophistication, or the production factor of it, looks different across the board, whether it's a software company or a product manufacturer or a services-based business. I don't know where I got this from. There's a book somewhere, actually, hold on.

There's a book somewhere. Wait, it's right here. Hold on. Let me brush the dust off of it. "How Clients Buy," by Tom McMakin. He's a fantastic author on just the art of consulting and things like that.

And kind of one of the underlying premises of his whole methodology was you're not out selling, you're out educating. 

And if you can out-teach your competition, if you can out-educate your competition, you build trust far faster, far deeper than the next sales letter or the next video promo. And so out-educating, out-teaching, is a really strategic way to win in really highly competitive markets, especially in service-based businesses.

So, but to your question, Dave, do we ever talk people out of it? I think the answer is no. But I think we do strategically say, "You don't need to go all in on something that you haven't proven has worked or is working yet." So there is that experimental phase. 

I think this, what we're doing right now, is like the meta experimental phase of this concept, right? And so what we're doing today in practice is what our clients should do if they're interested, but are concerned about budget, concerned about resources, concerned about production space, what to teach, how to teach, when to teach. 

Start in a very low-fi format like this: a $20 Zoom meeting subscription. Start recording, start inviting people, and really start to get feedback as quickly and as cheaply as possible, then go all in. Because the last thing we want is for our clients to invest $100,000 in a solution and have it be beautiful but ineffective.

Those baby steps. 

Yeah. Yeah.

Question: What are some of the things you look for in that feedback? Like what are the things that you're measuring right at the beginning when you're just messing around and trying to figure out what works?

Yeah. I think qualitative feedback is probably more important than stats and hard data.

In those early-stage experiments, if you can bring in really sophisticated, knowledgeable people who really understand the subject matter well and can improvise on the fly in some of those early days, they can react as much as they can structure. So you can begin with a plan in mind, but have the confidence that your team can adapt through those sessions. And through that adaptation, you start to refine the model and then figure out, okay, this is the programmatic approach that we want to take.

And I think you're getting that from qualitative feedback, peer review. Like, "This session was great. When's the next one? How do I enroll? Where do I find out more?" That probably matters more early on from folks who attend than just hard data on, you know, we had 74 enrollees and 60% attended, and whatever that is. I mean, those numbers matter at some stage, but if you have 10 people that really say, "Oh my God, this changed everything. Where do we go to learn more?" is that more important than the thousand registrants that didn't show up?

It's literally quality over quantity.

Yeah. Yeah. Yep. Completely. And you know, I don't know if that's like a marketing thing. Like marketing expertise is not the thing here, but I think it does bleed over into like what you're starting to see a lot of agencies and teams do around demand gen. 

Like instead of driving high volume pipeline, we want to just freely make things available and build kind of that audience for free, like open the gate a little bit, let people in. And then at the moment of like really high intent, then convert people. So it's like building demand, building interest, building demand, more than just building random people who have no idea what you're doing and then blasting them. 

So I think training and education is likely part of that funnel, part of that process, but you can do the exact same thing where we're looking at quality, not high volume. And the more people spend time with us, the longer they go with us as customer education people, eventually that turns into intent, and then we can easily make that conversion. So it's like streamlining your sales process. You need fewer sales resources and just more helpful, trust-building, informational, supportive resources.

Well, I guess that falls into testing too, when they want to test a customer education program. Because you find out their needs, you find out where they are in the spectrum, and then, or the scale, and then you start testing slowly, like you said, with Zoom meetings or small piloted programs.

Yeah. And real talk. So we have a client that is delivering hundreds of trainings around the world each year, generating like 20-plus million in training and another 10-plus million in virtual training. The workflow that we've been a part of and helped architect in ways is literally this:

  1. The subject matter expert authors a very rough beta of the course, which then gets launched internally. You teach your peers, so in your own company, about your own product. That could look like sales professionals training their immediate sales team about a particular approach, or software teams training the immediate people around them, maybe your product owner puts on a training internally, something like that. You get peer input and refinement. The subject matter expert goes back, rewrites, redrafts, restructures a few things.
  2. Then we do a customer launch. This particular client is in a really highly regulated industry, so they have a couple of key clients that they really trust, and they have a lot of previous training experience with these clients. So we'll do a pilot, usually at a very, very discounted rate. We'll get 10 or 15 or 25 clients in a room and we'll run that new, refined training. We'll learn again.
  3. Once we go through that process, then we launch to the world. So there's an internal pilot, an external pilot, and then a public launch.

And so that's a really good kind of three-step process, but it gets the subject matter expert to finish something faster, kind of a crappy first draft, if you will. Like get it done. 

No one cares to wait for years for the perfect course that no one's ever going to see. So get it done, get it in front of people fast, get helpful criticism, then go external to customers. See the spots that you missed because you're internal, fix those, then go live.

Question: When you say fast, what do you mean? What's a typical life cycle there?

Yeah, well, it can depend on the industry and it can depend on the risk. So I would say in more tech-oriented industries, six weeks is a release cycle for most of these customers.

So every six weeks they're rolling out something, or at least every three months, they're rolling out something. So you want to do it faster than the product can evolve. So that's kind of the race in a lot of tech software product training. In highly regulated or really risky sort of training scenarios, maybe you say, "Look, we'll take safer and slower." But again, it's really dependent on that skillset that we're training... 

So I would say we look to pilot in six weeks. That's a pretty realistic thing for most of our clients: can we go from blank page to program in six weeks? In other industries, maybe medical or aviation or something where people might die as a result of it, we'll take longer.

A little more regulated.

Yeah. A little more regulated, a little more risky, but if we can do it in six weeks or less, we will. It's really just speed to reaction. We want to know how close we are. And it's okay if we're way off. At least we learned, and we didn't hibernate in a conference room for six months and then come out and learn that.

Global Distribution and Adoption:

I was just thinking about designing programs that are dispersed worldwide. That must be an interesting thing to monitor, to see if you have certain success in certain parts of the world where it might not do so well in other parts of the world.

Yeah, this is, that's a really good point, Dave. Because we actually just had this in January of last year. So January of 2020, one of our customers ran a promotion. They had training for sale. It was revenue generating. And a lot of folks immediately went remote, quarantine life 2020, that whole story. 

And in order to go deep, again, the role of customer education is to build trust and to support and help. That doesn't mean not make money. But that means that there's a real intent of making customers' lives better first. And so in that kind of mission or method, everyone's world was kind of upended. And we said, "Wait a minute, you're going to be at home more than ever before. You do have an opportunity to learn more than ever before. So let's make all training free." 

And so when they did that, [the client] got 8,000 enrollees within the first month of making the programs free. And since then, people have paid for another 14,000 certificates. So just by going free for 30 days, we then got 14,000 paying customers afterwards. So it's just kind of that timing, that kind of motivation there. But what's really fascinating is, Lenny, you remember seeing that stat chart from the report. I will mess this up, but it was like 20-plus countries outside North America that were the largest participants of that online promotion.

And of that, it was like 40 of the Fortune 100 companies participating in all these things. And then that got really interesting, because those companies were not [then current] clients of the team that had launched the training. And so we immediately just handed them a spreadsheet: "Here's 40 Fortune 100 companies that are not your customers but that are actually enrolled in your learning. You should call them and see what they need your software for." So it was interesting.

But that kind of global adoption was really, really neat to see because we hadn't had exposure like that for that client before that kind of free promotion. And then they got a Rolodex of people to go follow up with.

Multi-Language Formats: 

What about translating programs, different languages and stuff? How difficult is that to achieve?

Well, yeah, like anything, it probably depends on the mix of internal versus external expertise that's needed. That's the quick take on it.

Most of our clients that do global programs have global staff, so there's a really convenient way of leaning on internal resources to bridge gaps. But there are plenty of translation firms, and it comes down to the format and the kind of intent and instruction in the program.

But I would say this, if we have learned anything, it's that if you are going to go global or multi-language, begin multi-language first. 

Make that a core tenet of your program, a guiding principle of your design, and say, "Whatever we make should adapt to these languages." Because you can get into a design language that is very English, North American-driven. You can get into scripting that's very English, North American-driven. And all of that's fine, but as your catalog scales, and then three years in someone has a great idea and says, "Oh, and we need all of this in Arabic. Or we need all of this in German," you now have years of catalog that are so polarized to a North American point of view that it's really hard to adapt and keep updated.

And you can write all those languages, correct?

Alyssa does, actually.

Oh, Alyssa. Wow. Very impressive. Very impressed.

Yeah, very impressive.

Real-Life Example - Testing Virtual Live Experiences, with Unscripted CEO, Emma Supica

I'm curious, Emma, thinking about testing and implementing. A little-known fact: for those on the audio and/or subsequent video of this, Emma also leads a nonprofit group that does improv and all sorts of things with Third Coast Comedy Club. So I don't want to “butcher” what all you do, Emma. I have a tendency to do that.

How did testing and all of that for the format, how did you approach that testing? Because it's a very similar dynamic. You need people from around the world to join an experience to get something out of it in a digital virtual arena. How did testing and all that go for you?

So when we began the sort of trial period, let's see, I'll explain the thing that we tested, which was a daily 25-minute improv workshop where people could drop in totally free online in order to connect with people during quarantine. This sort of came about in March of 2020, as many of these things did. And so our approach was very improvisatory, which was let's just try it and see what happens. And so the first thing was, let's just see if people want to come.

So very simple metrics, like how many people show up? And then there was, where are you coming from? 

And then it was, how did you find us? And then we went into, what did you get from this? Is this a good experience for you, just positive, negative? 

And anyone who filled out the feedback form, it was a little biased in the way that we asked, because anybody who's going to go click on that form, since it was an optional thing for them to do, said it was good. So understanding that bias in the testing is good to begin with, but regardless, we wanted to capture that data.

We started doing it three times a day. So we went, oh, let's just throw it at the wall and see how it goes. So we were logging on three times a day just allowing people to come. And so those numbers helped us to narrow it down then to two times a day. And two times a day, went for a long, long time. 

Then we cut one weekend day and then we settled into our routine and our rhythm. But I think the thing that mattered more than anything was that we were there every single time. It was super consistent. And we stuck with it over a long period of time. 

And I'm trying to think of the other ones. We started to get into: of the people who came to the free ones, did they sign up for the other sort of paid things that we did? Did they take our improv for anxiety class? Did they become donors?

Since we're a nonprofit, that's a separate sort of motivator metric that we determined. Did they get engaged in improv broadly? Did they start to do other online things that came up? And most of the time the answers to most of those questions were yes. Sometimes we just had individuals who came and went and thank you very much, but many of them, it was a cultivated relationship, not just with our organization but with the art form itself, which was a really cool thing to see both on the micro and sort of macro level.

Question: You said something really interesting just then, talking about consistency being one of the driving goals or visions for the program. How do you think that consistency impacted what it has become overall?

Consistency, I think, mattered in a few ways:

  1. It built trust with our sort of core followers, our community who's there every single time.

  2. That then became sort of peer-to-peer marketing for us to get people to come. It also helped us because it just takes a while for things to latch on. So I think that alone, the ramp: by the time it got to here, when we had people logging on all the time, we had sometimes been doing sessions with no one for a month already. And so it took a while to see our ROI come back into play there. And what was the other one? There were others.

  3. It was our sort of competitive edge as well. Now we're only once a week, and that's because the world is opening and people don't want to be online as much, maybe for many, many reasons.

But at the beginning, when it was that daily thing, we were the only ones doing it daily every day at the same time with the same people and the same level of energy and support. And that was it. There were a lot of online jams that came up, online workshops, online whatever, some paid, some free. Ours was free daily, all the time, every day. No matter what, we were there. And that just helped us really build a name for ourselves. 

And we've definitely seen the ROI on that in our other courses that we offer and our other activities and programs. People show up globally; we have a really good relationship with a whole improv troupe in Munich now. That would never have happened had we not shown up every day online to just try, try something.

I love that. That is, yeah, I love that. 

And it's interesting that you mentioned ROI and the concept as well, because I think we get the request, or like we actually had someone ask us in a sales call the other day and they said, "Well, what do you look for to measure an ROI of training?" 

And it was kind of a double-edged sword, right? Because we're actually pitching them services and they're asking us how we measure ROI. And so the only way that we knew really how to facilitate conversation in a productive way was, "Well, what are you hoping to see increase or decrease first?" 

And not so directly, but kind of, "And what's that worth? What is lifetime brand-champion status worth to your team, or what is being known for that worth?" And it sounds like in your case, Emma, maybe indirectly, but over the course of the year, having been known, probably announced in some ways, as the leading kind of virtual improv experience, that became a differentiator when you're looking at option A versus option B. It's like, "Oh, well, they went above and beyond in that area." And so it's just one more area where you can stand apart, especially for products.

Like (I know this isn't directly a product, but as more and more products start to look and feel the same, feature parity is real), as that happens, you have to get away from the widgets on the screen to make a difference with your customers. And so indirectly, I know improv is very different than the “software product” mind that I was kind of going in there for a second. But it seems like there's something to that that says our ROI was showing up. And we didn't see that for three, four, five months, but months seven, eight, nine, 10, 11 got real interesting because now the whole network of people who do this is talking about how we do it, which is very different. So, it's cool.

Yeah. And also, you know, there was a book written at a writers' collective. The man who runs it came to our sessions to try to learn whether it was possible to do improv online. So he sort of piloted his own experience with us, and then he wrote an article in this book about that experience. So that's out in the world now. It became a whole thing. It really has become this stew, this richness of possibility. And that's because we invested in it, and in such a consistent way, with what we do best. It was like, what do we do? Let's just do that over and over again the best that we can.

And I think that circles back over to, Dave, where you were saying like what should, like what does testing look like? Like what should we be looking for? Is that qualitative? Is it quantitative? Like what's the feedback mechanism? 

And I think today, especially in tech and in software and in product, there's so much of a push, especially when you have investment dollars behind it, of “now, now, now.” Like things have to work immediately.

And so if customer education is tied to “now, now, now,” I mean, you could see some things happen depending on the level of product adoption and usage. You could generate a new revenue stream; we have plenty of clients who do from the training side. But the larger opportunity, I think, even speaks to what Emma's story is: yeah, “now, now, now” can happen, but that's kind of a short-term, short-sighted goal. If you have other departments, like a sales and a marketing function, pick that up, and let customer education do what customer education does best and go deeper on relationships longer, then you start to see that exponential curve at the end of it, where it's like, oh, you have all these certified experts in your solution. You have all these consultants and authors recommending you. You are the kind of shining example in your domain area. And it's really because of how much you help and how much you serve, not because of how much you sell. So it could be interesting. Lenny B, what you got over there, bud?

Hey, just tuning in here. Listening up. I'm like, there's a lot of new info that, I mean, you and I have talked about a lot of things and it's just interesting to hear completely different perspectives.

Agreed. I agree. I love this format. I'm hoping more and more people hop in here because I would love to see, I don't care if they're a customer or not. I would love to just hear and help as many as we can. I think that's the mission. 

Formats for Customer Education

So something else that I wanted to just touch on is formats for customer education, at least in this initial customer education live session. In a lot of ways, this is like a meta discussion about how to pilot something. 

But formats for customer education, Dave, I think this was something that you brought up early on. Like how much opportunity there is, how many opportunities there are, for non-traditional delivery.

I think a lot of people, especially in the online course world, Skillshare, CreativeLive, MasterClass, LinkedIn Learning, well that was formerly Lynda.com. I don't know, you name them. There's a format that people associate with online learning. I'm going to click into this portal. I'm going to go to a list of videos. I'm going to hit “play.” And when I get done with that, I'm going to get some badge that says, good job. And maybe I'm better. Maybe I'm not. Hopefully it was entertaining. And I'm now at least familiar with the concept, right? 

But customer education doesn't have to be that. It could be something very, very different. And so I'm curious, Dave, especially as you're leading a lot of graphic design, a lot of creative design and UX stuff for us, you started testing things across social media, different things like that. What misconceptions maybe did you have when you first joined the team versus where you are today on how we could deliver education?

Well, it's funny coming from the graphic design print world, web design world, packaging world. I never really thought about customer education. And it's kind of funny because we all are involved in it one way or another. We've all been online. We've all taken a course. We've all taken a quiz. We've all taken a test online.

So it's real funny for me to enter this world of online education and designing the UX and design for that, because I have to shift my brain and think differently about how to design. It's more about the experience of the user and how we get there.

And what I find interesting is every client that I've worked with so far since I've been with ThinkThru, it's been a different experience. Some are heavy on videos. Some are heavy on testing. It's just not the same format all the way through. And that's what I find really exciting. And especially working with you and Emma and Lenny and building these programs out, it seems like a new avenue, an exciting one, whether it's a widget or a platform or some cool thing that we can introduce to a client. It's an evolving machine that I think is going to keep evolving over time and it's going to be more and more exciting, if that answered your question. I'm not really sure.

It did. No, I think, I think there was a lot of good in there, Dave, for sure. 

Misconceptions of Customer Education Technologies

And I guess one of the misconceptions that clients have when it comes to customer education, not only on the testing side, is that they have been told, rightfully so, by every other learning vendor on planet earth that you need this product to deliver this training in order for it to be effective. And while all the tools help, I don't think just having the tool means it's effective or that it's what you have to do.

And so I love exploring the idea. At least, if this is the first time we talk about it, I hope we can talk about it more. But I love exploring the idea of customer education should live where customers already are. And it doesn't have to be in a learning platform to be learning. 

There are a lot of learning, L&D, training-and-development people who are brilliant. And there are a lot of platforms that are amazing and wonderful and worth everything that you pay for them and more. But in this kind of dynamic where there's customers, there's prospects, there's thought leadership, there's all these other external facets of it, I really want to challenge clients, or folks who are thinking about customer education, to explore.

What if we don't invest in the platform at all? What if we invest in finding and reaching the right people, and worry about all the backend tech and backend measurement later? Focus on helping first and data second.

And I feel like a lot of people think the platform solves it. Like, “if we had this platform, we could help more people.” Yeah, you probably could. But how can you help before you have all that? How do you start low-fi and scale up? And so that could look like LinkedIn groups. That could look like Facebook groups. That could look like Instagram carousels through particular workflows. That could look like open-format Zoom calls for a $20 or $30 subscription a month. That could look like micro events, meetups in cities, especially as the world opens back up.

Maybe we rent an Airbnb in every city around North America and throw a small customer education live meetup. And we actually really help people answer questions, lower the iron gate, and say, "Okay, if we were going to do this, let's sketch this out. Here's how we would do it. Here's how we think." And I think life gets really interesting when you stop worrying about the data as much sometimes and just start worrying about making a difference. 

And so with the formats of education, don't get hung up on the idea that it has to be a video library that lives in an LMS platform that you launch to the world with a Stripe integration and an e-commerce plugin. Stop worrying about that and just start worrying about how you help, and then the numbers and the data and the metrics and the cash all start to get interesting.

Well, there's a lot of platforms out there. I'm sorry, Emma.

No, I'm very fired up about the way that you're saying all that, Matt, because that's very nonprofit speak: when you hit a wall and you go, "What's happening? Why isn't this working?" you just go, "What's our mission? What's going on?" And that's the answer. That's why nonprofits have mission statements. And most companies have a mission statement or a set of values, and that's why you establish those core values. Even if your bottom line is money and selling a product, you still have these values that you can stick to. Everything that you described is exactly that: being driven by your values, and everything else will come after that. That's just how it goes. Anyway, that sounds great. It feels good.

Fired up.

Well, I love it because we have helped people who are one- or two-person shops. Our clients have been small businesses that are highly successful, very lean, super-high-margin teams. And that's fantastic. And at the same time, even in the same month, we're working with teams that are global, and they're trying to solve the exact same problem. Some folks are saying, "Well, I can take my cell phone and a $90 subscription and just focus on helping people. I can really go deep on that, and I'll invest time because I don't have as much cash." And then you have larger companies who are just saying, "Get me a platform, give me some automation, go." And if you asked which one was better? I mean, better is different. But which one made a bigger difference in the customers' lives? That's where that qualitative subjectiveness kind of happens when clients say, "How do you measure the ROI of this?" Well, I don't know. What if you had one customer that stayed your customer for 30 years and never left because you made all the difference in their early-stage decision-making? So what's that worth?

Well, I think what we're saying is content matters. It doesn't matter the platform. We're going to get t-shirts, content matters.

Content matters.

Love it.

I think so. I think content and listening. Content, and then listen to what happens as a result of that. Listen a lot, ask more questions, and then make a lot more content as you listen. So it's this kind of weird hybrid, which is why testing could be the biggest key to unlock: what can your company, for people listening, take on as an acceptable modality or method for testing? If the process is that everything has to be completely e-learning, completely SCORM packages on an LMS, if that is the acceptable format that's been mandated, nothing wrong with that. But I would challenge anybody listening or participating in this to ask: what could testing look like in a future state? How could we get to the leanest low-fi curriculum faster? And then how can we up the production value as we go? How can we iterate on that? And be comfortable with the low-fi. It's hard because, initially, the ThinkThru team was producing 600 to 1,000 videos a year. And so we, more than anybody, were champions of studios and production value and motion and camera rigs and all that stuff. But when you look and say, what can some of the most passionate people who really want to help do with a cell phone? They can do a lot, you know? They can reach a lot of people. They can make a big difference. So what can your company take on as an agreeable standard of brand, as the lowest means of testing for the maximum result, and then worry about production value later? So that's all I got.

Well said.

Maybe. Wordily said. Wordily said but said.

I love that you love this so much that you're just willing to chat about it. You know, just, I love that. I think that's so great. And it makes me love it too, even more.

Good. Well, thank you. Yeah. Alyssa, I hope all is well in Florida. We miss seeing you, but we know that you are... oh, you are driving. Don't drive and Zoom. All good. So, well, this has been fantastic. We have another one of these coming up on the 18th, in two weeks. What would be really interesting, at least on deck from topics I've seen come up on LinkedIn so far, is the concept of revenue models. People who are really thoughtful and really great educators may not necessarily be great about putting a financial framework around that expertise. And so I want to dive into that; there are really six models, six ways to do this at scale. So maybe we can jump into that. If we hear anything else over LinkedIn or in chats where this is posted, we can learn from that and adapt. But I think that's a good place for us to go next session, week after next.

All right.

Awesome. Yeah. Thanks everybody. I'm so glad the first one is done. On to episode two.