Episode 38:
Enhancing Communication Skills Through VR Simulation
May 31st, 2024
Hosted by Bill Ballo, Rick Casteel, and Jon Brouchoud.
Jon Brouchoud:
All right. Welcome everyone to Simulation Pulse Live. I do have the episode number this time. This is episode 38 total. We started this a long time ago. We’ve called it a couple of different things. It was the Acadicus Roundtable, the Pulse, and the Pulse Live. It’s gone through iterations, but this is episode 38. Super excited to be here today.
Rick Casteel:
Well, we should kind of take the audio or even the video and put it out there on some podcast channels.
Jon Brouchoud:
We really should. It’d be pretty easy to do. Some of the things we do are so visual it doesn’t work, but a lot of times we’re just talking about stuff.
Rick Casteel:
Yeah. And they’re up on the YouTube channel too, right?
Jon Brouchoud:
They are on the YouTube channel. I’ll actually paste a link to that. If anybody’s looking, go to acadicus.com and then at the very top in the middle, there’s going to be a Simulation Pulse series button. You can click on that. I’ll paste a link here so it’ll go with this video. Oh no, that’s only to the panelists. Send it to everyone.
Rick Casteel:
Mitch is here. Hi, Mitch. Are you able to make him a participant?
Jon Brouchoud:
I’m going to make him a panelist. Hello, Mitch. Mitch is a panelist. See, we do this to the guests who come. That’s why people aren’t going to come here anymore. They’re thinking, “Every time I try to come, I just want to lurk in the background and be muted and not have my camera on, and Jon makes me a panelist.” I was just telling Rick that I made an AI theme song for us for the Pulse Live. Here, I’ll share my screen. It’s pretty cool. I’m using a platform called Udio. We’ll get to virtual reality, health care, and medical stuff soon enough, but first, Jon’s going to geek out over some AI, so bear with me. Hang on a second. I had to spell it A-C-A-D-I-C-U-S because it was having a real hard time pronouncing Acadicus. Here are a couple of tracks. Some of them are just instrumental.
AI Music:
Jon, Bill, Rick, they’re bringing it live in VR worlds. Watch medicine thrive, touch the pulse. It’s a new sensation, health and tech.
Jon Brouchoud:
I was trying to find some that had more vocals in them because the instrumentals are fun, but they’re not as interesting. I think this one here, hang on. John Will and Rick Stowe. Playtime on Friday afternoon anyway. Yeah, it’s pretty interesting. There are two platforms I’ve been working with, Udio and Suno, and I still haven’t worked out how I feel about it from an ethical perspective. I know that this is being sourced from hard-working creative artists, and it’s been absorbed into these large language models, for better or for worse. Sometimes I think the ship has sailed, this is just the new reality we live in, and I’m creating and prompting something new. But on the other hand, if it’s being built on the backs of the people who created the originals, and now it’s being taken without compensating them, there’s obviously a consideration there. So I don’t know.
Bill Ballo:
It’s going to be like streaming. When streaming came out, actors weren’t getting the same residuals they were getting from, you know, when something was shown on cable TV or whatever. They had to augment the models of how they were doing compensation for those types of things. So, that’s going to become part of this discussion as we go forward. I think being cognizant of how it happens is important. We can’t just abandon the work that’s been done on these things, but we also have to figure out a way to make sure that if there’s something you created that gets pulled into this large language model, there’s a way to track that and give some type of credit or payment for it. The rules for that stuff will catch up with the technology as it becomes more commonplace.
Jon Brouchoud:
Mm-hmm. Yeah, I agree.
Rick Casteel:
I agree. I think eventually we’re going to have things that come with a tag that says “created by a human.”
Jon Brouchoud:
Yes, that’s right. Authenticity tags. I thought about that too. Well, you know, the other thing I think about too is that you can’t get these platforms to give you a replica of existing music. I can’t say I want something that sounds just like Willie Nelson with the same lyrics as Willie Nelson’s music. It won’t do it. But it is inspired by it. And I wonder if that’s really different than how humans create music. I’m certainly inspired when I create music. I listen to other artists and incorporate what I hear into what I’m doing. I’m not necessarily giving them credit because I’m not creating something that’s a replica. I’m inspired. It’s a new thing that synthesizes what I’m taking in. And I think that’s how they’re doing it. So is it okay for a digital entity to be inspired by other artists and then create something unique and new? We’re off the rails here, but it’s fun to think about. Obviously, we’re in the tech space and the VR space, and there’s a lot of overlap with where we’re going with the future of AI. Something we’re tracking for sure in Acadicus.
Rick Casteel:
Well, right. I think we can say it’s just a matter of time, Jon, before that’s in Acadicus in some form.
Jon Brouchoud:
Exactly. Exactly. We’re seeing it already. I know over in the Engage platform, I went in the other day and they have these AI characters. I think they had Benjamin Franklin, Rosa Parks, and Nikola Tesla. You could walk up and talk to them. They have some prompt questions that guide you, but there’s a pause and then the character speaks out the history related to the question you’re asking. So, it’s already being dabbled with.
Jon Brouchoud:
It’s amazing. And the speed of response is getting so much faster with ChatGPT Omni now. It’s just fast. It’s conversational. You can have a dialogue with it. Unfortunately, I find that it asks too many clarifying questions. It keeps saying things like, “Now, what I heard you say was…” It’s reflecting, using really typical cognitive-behavioral kinds of responses like, “This is what I heard you say.” It clarifies, and then it asks another clarifying question. It becomes sort of interesting in a way, but sometimes I’m just like, “I just want you to answer my questions.” I don’t want to have a big, long dialogue and wait for you to ask me questions like that. Then I actually corrected it. I said, “I would like our dialogue to be more concise.” And then it sounded a little insulted, like, “Oh, well, I’ll be more concise from now on. Excuse me.” It had this attitude about it. So you have to be a little careful. I might just remove that conversation from its memory.
Rick Casteel:
Talk about geeking out. We have Sean with us now.
Jon Brouchoud:
Sean’s here. I always try to make Sean a panelist, and Sean’s like, “Yeah, you know.” I’ll offer it. I’m always going to offer it. You can decline it if you want, Sean. That’s fine. We’re going to spook people from coming to the Pulse because they’re going to be like, “I don’t want to be a panelist. I just want to listen.”
Rick Casteel:
We promise that if you come visit us in the future, we won’t make you be a panelist. You can always decline.
Jon Brouchoud:
That was an option. Hello. Hey, Sean. Hey, Sean. Sean is the rockstar developer here at Acadicus, so we’re always glad to have Sean on board.
Rick Casteel:
Well, today…
Bill Ballo:
I’m pretty sure Sean is not an AI, but you know, we’re just not 100% sure.
Jon Brouchoud:
I’ve seen him in person. He was actually on video with us a couple of weeks ago on a Pulse Live.
Rick Casteel:
You showed me a video of you that wasn’t you, so I’m still not convinced. I’m still not convinced. I think Sean could possibly be an AI behind the scenes, but you know.
Jon Brouchoud:
Could be, could be.
Rick Casteel:
I’ll show that someday.
Jon Brouchoud:
Yes, yes, 100%. Well, Rick, you had a great idea. I’m glad Mitch is here for this as well, to talk about the OpenRN project, some of the scenarios, and some of the documentation that supports them. I think that would be great to review. We’ve talked about the OpenRN project quite a bit, but there’s so much there that I don’t think it’s possible to dedicate too much time to it. Everybody I talk to, even people who have been using the platform for years, still comes away saying, “I had no idea about all these scenarios.” So I think that could be a great use of our time today.
Rick Casteel:
This came to mind because as I continue to work with and train our new groups and organizations that are joining us, I always try to guide them to review those documents for two reasons. Number one, I think just the template itself helps guide them when they think about how they teach in VR because I think that’s a hurdle that instructors have to get by. And Mitch, you can speak about this. It’s like, “How do I do this?” That’s a template you can use. Plus, there’s already all this material there that, especially when you’re first starting out, it really does hold your hand. It gets into the details of the room that’s chosen and the character that’s chosen and the assets that are available and what the expected outcomes for the students are. So I thought it would be great to look at some of that material, let folks know it’s there, and how to get to it.
Jon Brouchoud:
Yeah, absolutely. That’s a great point. Here’s a link to the Google Doc. That has links embedded in it to each of the scenarios. If you click on any one of those, it’ll open into a scenario plan that guides you through that particular scenario. Keep in mind that we didn’t build 25 nursing scenarios. We technically did, but what we did was build content that was used to create 25 nursing scenarios. Because we built those building blocks, those building blocks can now be used to create an unlimited number of scenarios. So it’s 25 scenarios, but it could be 25,000 scenarios. Literally, you can tweak things and change things and use this content to create an unlimited range of different scenario types. That’s the goal.
Bill Ballo:
That’s how I got started with developing. I took the ones from OpenRN and I would just tweak them. So as I was trying to build new scenarios, instead of starting from scratch, I could take, for example, the administering cardiac meds to a patient with dementia scenario and just tweak it. I added different meds and different assets. I didn’t have to start from scratch and be like, “Oh, let me choose the environment, and now I have to put every single thing in the room that’s needed.” I swapped out the patient for a Gen 2 patient because I like the functionality of Gen 2 patients so much better. So now, for our second semester, I have a Gen 2 patient in there so there’s more to assess and figure out. I just wrote some new objectives, tweaked it around, and boom, I have a new scenario. Being able to scale the scenarios, from the basic entry-level first-semester clinical all the way up to our most advanced clinical course, is something I can do really easily in Acadicus, which is really nice.
Jon Brouchoud:
Yeah, early on when we built that cardiac medication administration scenario, scenario number one in the document I pasted, that was Miles Johnson. At that time, we wanted to build around that scenario, and that patient was really built for that scenario. If you open up the simulation manager, Miles can really only do a few things that are aligned with that scenario. He doesn’t have any more capabilities. As Acadicus evolved and as the OpenRN project became more clear, we realized, with brainstorming from Dr. McGonigal from Chamberlain, why don’t we build one patient that has lots of capabilities? It can still be used for that scenario, but it can also be used for all kinds of other things, exactly like what you just said, Bill. So all of the later scenarios are built around those Gen 2 patients. Those are very flexible at this point because they’re not hard-coded to be used just one way for one scenario. They have a lot more flexibility.
Bill Ballo:
Yeah, I put four of the Gen 2 patients in our dental scenario because I built the dental scenario for us. There’s one that’s kind of the primary patient, but I put in Gen 2 patients in the other three dental chairs because, oh, now we want to do it for a female who’s Asian, or we want to have this or that. The nice thing is that at this point now, my dental faculty could write almost any scenario they wanted. Because everything’s prebuilt and set up, I don’t have to really do any more creation. I might have to add like, “Oh, you want some different meds in there,” or “You want this or that,” but I don’t have to start from scratch every time because those Gen 2 patients are so flexible that I can just tweak it.
Rick Casteel:
Right.
Bill Ballo:
And have a whole new scenario. So at this point, because we’re now building a dental VR lab at our school, they’re talking about having a total of six different scenarios, and it’ll take me about 30 minutes to build each new one. So we’ll be able to have six scenarios, and it’ll take me probably three hours to put it all together with just the tweaks I need to make.
Jon Brouchoud:
That’s amazing. That’s amazing. And that’s really the ultimate use case for this. Rick, you could probably attest to how many demos you give where we get these questions like, “What does the application do for me? What scenarios do you have?” They want to know what they get in terms of automation. The challenge there, and we’ve covered this many times on Pulse Live, so anybody watching these videos in hindsight, we’ve been over this before, but it’s an important point, is that the more we pre-program into these patients, the less flexible they are. We can make it do whatever you want. We could hard-code it. That’s easy, to be honest. That’s the easier path. If you want to keep everybody on the rails, there’s only one way to do it. It’s one plus one equals two, and it’s always the same every single time, like a cookbook. That’s easy to do. We could program that in a long weekend, no problem. But that’s it then. That’s all you can do with that patient. That’s unfortunately Miles Johnson in scenario number one: he’s stuck being Miles Johnson in that scenario. He can’t be used for much else. But the Gen 2 patients have that flexibility. They’re not hard-coded, but you do have to facilitate. You go in there and actually teach your students instead of the AI or the program teaching your students. It’s actually the instructors, and that live facilitation is what we’re after.
Rick Casteel:
And it’s interesting, Bill, you’re going to be familiar with this because I remember it clearly from my clinical days where it was a negative thing to say that you were practicing cookbook medicine, which is this kind of checkbox, you know, just go down your list and ignore all these other factors and critical thinking that go into treating something or someone. Like you’re saying, Jon, we can do these procedural-based things. To a degree, that’s important. You want to follow certain steps when you’re putting in a Foley catheter, but that’s such a minute part of it. There’s so much more that goes into that process in terms of understanding what the patient’s going through, your environment, and so many other factors that when you get locked into just that procedure, you’re really limiting what you can do with the platforms.
Bill Ballo:
And those ones that lock you in, the problem I have with them as a nursing educator is that they don’t promote critical thinking. They promote learning a step and doing it exactly the same way every single time. Take putting in a Foley catheter: what if my patient has contractures in their legs and can’t bend their legs the right way, or they’re in a cast? There are a million different things that can happen that change how I have to approach any procedure. I want students to learn to think, “Okay, here are the rules I have to follow. Here’s where I have some wiggle room to make things work.” I’ve checked out other platforms, and the thing that finally clicked once Jon got my head where it needed to be, because I kept asking, “Why don’t you have it where you… happens in this process?” and he’d say, “Because that’s not the point,” was when I started using it and teaching with it. The thing I like is that, based on the student’s questions as they’re talking to the patient, I’m responding to them as a patient would. Whereas if it’s something that’s canned, you’re not testing your students on how well they communicate therapeutically with patients. You’re not testing them on how well they can adapt to the situation that’s presenting itself. Because if they say something wrong, the patient needs to get mad at them and say, “How dare you say that to me?” My favorite is when we started VR and they’d just come into the room and start caring for the patient. I’m always like, “Who are you? Why are you in my room?” “Oh, I need to introduce myself and explain what I’m doing.” And in my surveys, students say, “I love the fact that I can make a mistake and learn from it.” That, to me, is the goldmine of Acadicus and this platform: it allows me to let students make a mistake and not be perfect. I think we sometimes hold students to a standard we shouldn’t, because they’re students. They shouldn’t have this all down perfectly. They shouldn’t have everything ready to go and set, because what do you learn from that except how to memorize and then forget?
Jon Brouchoud:
That’s exactly right. Maybe it’s a reflection of the fact that I was a bad student when I was in school, but for me, if I get into a scenario like that, all I’m trying to figure out is what does this program want me to do? How do I game this program so I can get the best grade? I’m trying to figure out how it works on the backend so that I can gamify it, figure out what it wants me to say, get through it, and get the best grade. I’m not role-playing. I’m not immersed in it like I’m actually a nurse. Also, simulation as a whole, just for practical reasons, has always been about a sliver of a healthcare experience. You take a little tiny sliver of it and say, “Okay, we just walked in. What do we do right now?” And that’s it. You walk out, debrief, and it’s over. But like you said, four patients in that dental suite, that’s a closer reflection of what you’re actually dealing with in reality. You don’t just check in, deal with one moment with one patient, then go home. It’s complicated. There are lots of people. There are lots of factors. You’re stressed. You’re trying to prioritize, right? The more we can tease that out into the broader spectrum of healthcare, the better.
Rick Casteel:
Yeah, exactly. Well, Jon, let me share my screen a minute. I have way too many windows open. I just wanted to go over this real quick. So this is the document that you referenced earlier. That is the list of content that we have in Acadicus based on the OpenRN material. You can see we have 25 or 26 listed here now, I forget. But behind each one of these, and I want to start up here at the top, you get a quick overview and what the learning objectives are. And again, this is where every instructor needs to start. Mitch, please jump in, because you’re doing this every day. But this is what I try to teach: look, you have to start with, what’s your story? What are you trying to tell the students? Those are your objectives. You can’t start with a scenario and then figure out what you’re going to teach. You have to have the objectives first. So if we look at this one, the safety hazards scenario, this is the one I usually start out with for most organizations because it’s pretty direct, fairly simple, and easy to understand and get through. It’s two things: identify the common safety hazards in the environment, and communicate therapeutically with the patient regarding them. Pretty direct. But behind this, if you go into the plan, you can see there’s a whole lot more information. You get this whole background and overview. Again, our objectives. How is this aligning with curriculum? Another really important factor. Why are we doing this? What are we teaching? Where is it going to enhance what’s going on in the classroom? We get into a whole map of how and why the room is set up the way it is and what’s in there. And this is where anybody can look at this and say, “Hey, maybe I want different meds in there for a different reason,” like Bill did with the administering cardiac medications scenario, right? We can see that. We even have a link that, if we click it, goes to another document, which is the MAR you’d find in the room. And again, I’ve had organizations actually take this document, expand on it, and then bring it into the room later so that they can have a different set of circumstances and conditions. We talked about one organization that added an order for O2 and an IV. They put an IV pole and O2 in there for their nurses to go in and go, “Wait a minute, this isn’t the right flow rate for the O2, and it’s not the right rate for the IV. Let’s correct that.” Just two more things they added. But this is also how you can structure your thinking around creating something from scratch, understanding what the pieces are that you need to pull together in order to do that. And then here it is, walking the students through this. Students are going to perform an environmental assessment. I’m not going to read this, obviously, but you can see the level of detail it gets down to: how you can prompt Miles, how you can prompt the student, what the student is expected to do or be able to carry out, and then all the way down to what you can review in your debrief. So these are really, really detailed. I love to go over this because it’s so much great information, and I think a lot of people gloss over it or don’t take the time to really dig into it and understand what they can get out of even this simple scenario.
Jon Brouchoud:
And it’s important to remember, too, that… go ahead, Bill. Go ahead. Actually, I have a real short point, and then you go. These were not something we created because we think it’s cool. This was vetted by a committee of experts. It was created by nursing educators for nursing educators. It’s been vetted, it’s aligned with best practices, and it’s been refined over time with input from nursing educators. It’s not just, “Ta-da, here it is, we hope it works.” It’s continuously being refined. So I think that’s an important point: it’s not Acadicus that created this content, it was OpenRN and a committee of experts.
Bill Ballo:
I’d say my favorite part, since I was part of the schools involved in helping write and create all this, is that when we wrote the debrief questions at the end, they were all based on nursing theory, and that’s all linked in the documents when you go to OpenRN’s website. You can actually see, here’s the evidence that all of this follows best practices. As educators, we always have to be able to prove how we know what we’re talking about. You can’t just say, “I think this is best, and that’s what I’m going to do.” You have to have evidence that shows why you’re doing something and how you came to that conclusion. So it’s all documented. It also takes the onus off of you as the educator, because the first time I started teaching with Acadicus, it was overwhelming. It’s a new format of teaching; it’s like the very first time I ever ran a real sim. I almost panicked because I was like, “This is a lot to try to figure out and do,” especially when you’re on your own. Not having to figure out how to write the scenario, the debrief questions, and everything else means I can just come in and focus on learning how to teach in this new method. That’s, I think, my favorite part: I don’t have to figure out all this other stuff. I can just take these and use them as is to get started. Then as I run them, I can tweak things where I need to in a scenario to meet the needs of my students. So you don’t have to start from, “Okay, here’s a headset and a computer, go make stuff.” You can literally just start with something that’s there and work with it until you get comfortable.
Jon Brouchoud:
Yeah. And to that point, I think if I come back three years from now and look at this document, or at how these scenarios are being used, and it’s still just like this, I will see that as a failure. This should change. This is designed to be flexible and ever-changing, just like healthcare is constantly changing. We shouldn’t just take this and say, “Okay, this is how it’s done, and I’m just going to replicate what’s been done.” It’s more like, every time Bill runs this scenario, he sends us feedback that it would work better if we did this or that. The same goes for writing different versions and iterations of these scenarios and sharing them with each other in the community of practice. Every quarter we have a community of practice event where we get together and share what people are building. I’m hoping we can get to a point where these scenarios are evolving and constantly changing, more like a liquid than a static artifact that never changes. They should be constantly refined. One more point on that: these are aligned with open-source textbooks. If your school wants to adopt these OER resources, it’s all available to you. You can take it and adapt it to the way you want to teach with your curriculum. These VR scenarios align with those texts, which I think is a really important point. What a tremendous resource this is, and I just feel like more people should be aware of it. These are open educational resources that are available.
Bill Ballo:
And there are those references I was referring to.
Rick Casteel:
Yeah, here are the references that support everything that happens in this scene and scenario. And again, just like Bill was mentioning, here’s the administering cardiac medications to a patient with dementia scenario. He got in here and went, “I’m going to make a change. I’m going to do something different in here because that meets my needs as an instructor and for my students.” So he can get down here and go, “Well, I see these medications that are included,” but he has the toolset in Acadicus to go and change that. He could take them out. He can create new ones that aren’t even in our library, because the medications can be created.
Jon Brouchoud:
And chances are pretty good that other schools are also going to want similar changes, right? If there’s a reason why Bill is making that change, other schools will probably think it’s important too. Hopefully, they’ll then make their own changes and say, “Well, I want to change it like this.” And yeah, that’s what I’m talking about. That constant evolution, that would be the dream.
Rick Casteel:
Yep. And there are so many little things, and Jon, you say this all the time about the number of possibilities. Just starting with a scenario like this, you can change the dynamic by adding a character or two, family members or significant others. Put them in the room and suddenly it’s a whole different scenario, right? Different possibilities for things you can do. You can start the vital signs differently, right? Do they come in and find Miles already partially decompensated for some reason? Or does he decompensate during their assessment? Again, whole different thing, but you’ve started with this as the basis and then you can move forward from there.
Bill Ballo:
And I do that a lot. I have what I call my generic med-surg room. I built just a med-surg room, and that’s what I use for starting off. I keep a stethoscope, a penlight, a blood pressure cuff, all the normal stuff in there. Then, literally just by changing vitals, heart sounds, lung sounds, whether they have edema in their legs, I can change the scenario. I actually use that for one scenario I made where students come in and get no prep information, because their job is this: the patient has arrived on your unit and you received no information about them. You need to figure out what’s going on, and you can call the physician, the hospitalist who’s admitting them. Then they literally have to work through it, and I do it in six different ways. So I have a patient with hypertension, a patient with diabetes, a patient with CHF, a patient who’s having a stroke. I literally just go in and change the vitals and the sounds and the things that are going on. I touch nothing else, because it’s not about getting meds. It’s about, can you interview someone? Can you talk to them? Can you dig in and figure things out based on your assessment? Sometimes students get very hung up on, “Well, what do the orders say? What does the doctor say?” And I’m like, that doesn’t matter. That’s inconsequential. What matters is, can you assess this moment and figure out what’s going on, so that you know what to do? I make no other changes, but that one room gave me six scenarios right off the bat.
Jon Brouchoud:
And all of that’s probably a lot closer to reality than anything else. Changing up those variables, or having family members there like Rick said, suddenly changes the whole thing. And that’s probably closer to reality, right? I mean, you do go into a room and deal with a single patient by themselves, but there’s often going to be a family member there. And the family members, at least in my experience, are the ones that are really going to have the questions, right? They’re going to be grilling you. Sometimes the patient is advocating for themselves as well, but family members are going to be there with questions, and you have to be able to communicate with them. Those are live people, and they’re going to have live-people questions. They’re not going to be robots. You have to learn how to communicate with a real person, and the more opportunities you have to put yourself in that role-playing environment, the better off you’ll be when you have to do it in the real world.
Rick Casteel:
You know, here’s one with just that situation, Jon, with the family involved in a fetal demise. I mean, this is a tough, tough situation to have to go through. Any kind of practice in handling a situation like this is going to be miles beyond what Bill and I ever got in school. We maybe read about it in the book, and then suddenly you’re in a clinical situation where you’re faced with it.
Bill Ballo:
Yeah, we didn’t even have simulation when I went to school.
Rick Casteel:
Right, right.
Bill Ballo:
We did everything to each other.
Jon Brouchoud:
Well, in hospitals and medical schools, I’m told over and over that the training you get in actually communicating with real people is woefully inadequate. It just doesn’t happen. It’s very difficult. A lot of times you’re a resident before you get an opportunity to really do that, and then you’re watching one person do it one way, and it’s like, okay, now it’s your turn, go do that. I know Dr. McAdams is always talking about how difficult that is to teach. You just don’t have those opportunities, yet that ends up being the lion’s share of your day, and it’s so impactful. There’s a right way and a wrong way to talk to people, and if you don’t do it the right way, especially in a stressful situation like that, you can cause a tremendous amount of unnecessary stress and anxiety for people, stress that could be avoided if you just had the skills to communicate adequately.
Rick Casteel:
Right. And that’s the thing I was going to say: this is where VR gives you that safe space, because you will fail. Nobody handles a situation like that great the first, second, or third time they have to do it. So that capability to have repeated exposure to it helps you, each and every time, get a little bit better, a little bit more empathic, a little bit more aware of how your words affect the other folks.
Bill Ballo:
Well, that’s what I mean when I talk about the power of using simulation in VR: I can focus on specific things. I have several scenarios where literally the only objective is to communicate therapeutically with the patient about dot, dot, dot, whatever it is in that scenario. Because that’s the thing we hear from our employers. We have advisory committee meetings where the employers in our community, hospitals, nursing homes, doctors’ offices, come and tell us what we’re not doing well enough and where they need us to focus our work as educators. And in every single meeting we have with them, it’s communication, soft skills, interacting with their peers. One of the big things I want to build at some point is a scenario with two healthcare people in a situation where maybe they find their peer stealing out of the supply room, or about to make a mistake, or something like that. How do you deal with that in the moment? That, to me, is one of the places where simulation and VR really shine, because even in in-person sim, trying to do those feels really bad. Let’s say I’m playing the person they’re catching doing something wrong. Students don’t want to correct me because I’m their teacher, right? They don’t want to have that confrontation with me. In VR, though, they can have it with me, but it doesn’t look like me, so it doesn’t feel as threatening. That’s one thing I hear from students all the time when they say they would rather do VR than in-person simulation: “I don’t feel your eyes drilling a hole through my soul.” Even if I’m behind the glass and they can’t see me, they say they still feel me looking at them. But in VR, they’re so immersed in the world. And I have to tell you, I had a really fun experience just the other day. My student was doing some community teaching with her patient in VR, and all of a sudden I see her start to fall over. I’m like, “Oh my gosh, are you okay?” And she goes, “I forgot the table wasn’t real.” She went to lean on the table in VR and almost fell over. She said, “Well, I wanted to seem like I was more in the moment with them, like I was leaning in, and I forgot the table wasn’t really there.” They’re so immersed in the world, and it’s so realistic, that they don’t feel the judgment and the stress of in-person simulation. As I’ve been collecting my survey data from all these scenarios, I keep hearing over and over that one thing they really like is that they have to do less of that suspension of disbelief.
Jon Brouchoud:
Yeah. When I was in Green Bay with you, I had a chance to talk to your students in that conference room while we were waiting, and that was one thing they shared with me, that they like that ability. One of the students said, “You know, I have a job in the hospital right now. I do it well. But when I’m in a sim lab with the mannequin and the teacher staring at me, I’ll make mistakes that I know I would not make on the floor. When I’m in VR, I don’t feel that way because I can’t see them staring at me and I don’t feel that stress. So I feel more focused on the objectives and more focused on the patient.” I think that’s a good thing all around.
Rick Casteel:
When we think about the impact VR brings and why we see study after study indicating better grasp of concepts and retention of information, I think a large part of it, and there are probably some very deep psychological and neurological reasons we could geek out on, is really the focus. You’ve cut out all this extraneous data. Bill, I’m sure you would agree with this: the distractions we have to put up with, every student having their cell phone, messages and texts and searching stuff out, and you’re competing with all of that all the time. So when you think about what they really absorb during a lecture or a study session, boy, VR just blocks all that out. There is nothing but the content in front of them to focus on. I think sometimes it’s as simple as that. You are just 100% hyper-focused on the content in front of you.
Bill Ballo:
But there’s also an increase in realism, because I love the hospital soundboards. You hear the ding, ding, ding in the background and all the noise. So while you’re blocking out all the other stuff, you’re also creating that environment, because in our in-person sim lab we don’t have that background noise. There’s also the part where the mannequin talks to them, but it’s talking like a ventriloquist because nothing moves. In VR, I hit the talking control, and as I’m responding for the patient, the patient moves along with me, which helps. And I can move their head to look at them: “Oh, you moved to that side of the room. OK, now my head’s turning this way.” It’s that piece that adds realism. When you talk about suspension of disbelief, you know you have to have that discussion with people before going into a simulation, but I don’t have to do that as much with VR because I can make it so much more realistic than the plastic mannequin, which has no head tracking, no eye tracking, no lip movement, and a weird metal screw going through the elbow that looks odd. The mannequins we have also have metal bolts coming out of them, because that’s where you attach the telemetry and stuff. And in VR the sounds are so clear. On a regular mannequin, my students will be like, “Am I hearing the motor or is there a murmur?” because they can’t tell what’s what on a mannequin. In VR, they get that. They don’t have to ask, “Wait, am I hearing a murmur? Or am I hearing an S3? Or is that just the motor of the mannequin?”
Rick Casteel:
Right.
Bill Ballo:
That realism is huge.
Jon Brouchoud:
It’s amazing how much sound impacts visual fidelity. That’s one thing we learned early on: these environments can be photorealistic, but if you don’t have the audio to match, it just goes clunk. It feels fake. But as soon as you crank up that ambient audio, and we have all those different options you can change, you’re right. You have to learn how to communicate with an agitated family member, but you probably have to do it while you’re hearing an occluded IV pump alarming in the next room, knowing it’s driving that patient crazy, while still staying sensitive to the current task. All of that is happening at once. Those alarms get baked into your head when you’re working in the field, but you have to simulate that too, because it’s a part of it.
Rick Casteel:
Oh yeah. You hear that all day long when you work in a facility for sure.
Bill Ballo:
I remember this about those.
Rick Casteel:
Oh, the effort. I remember we had programs to minimize noise, but there’s this minimal background amount of sound you just can’t get rid of. Call bells are going off and IV pumps are going off. And boy, when they introduced bed alarms, I mean, people would just roll over and an alarm would go off. So yeah.
Bill Ballo:
Well, and then there are even the people who are stocking rooms, pushing the cart down the hallway, the clunk of the wheels going down the hall, and the physician at the desk talking to other people while you’re trying to chart. There are just all those things going on. So in my first-semester groups, I never have the ambient sounds on because I want them to focus more here in this moment. In second semester, I crank it up a little bit. By fourth semester, I’ve got a lot of noise going on. I’ve had students say, “That was really hard to focus and concentrate.” I’m like, “Good, that’s the point. You have to be able to drown out what’s going on and get yourself hyper-focused in that moment.” That’s not a skill you can teach in regular simulation. You need something like a VR world to do that, or you have to have a high-tech simulation center that’s going to cost you a lot of money to build.
Rick Casteel:
Yep. Part of the purpose of having a VR lab is to help you save some of those funds for other purposes.
Jon Brouchoud:
Save money, but also expand capacity and expand capabilities, saving money year after year. That’s the thing. This is a little bit of a segue, but there’s always this question about, “Well, we decided we’re going to do VR at our school and now we’re choosing a platform.” You don’t ever do that when you’re stocking a simulation center, where you’re like, “Okay, we just…” I’m sure there are turnkey packages where one company will provide everything, but it’s pretty rare. Usually, it’s, “I want this skills trainer from this manufacturer. I want this mannequin over here. I want this piece of equipment. I want this kind of warmer.” And you’re working with multiple vendors. I think we’re keeping the cost down to the point where, and we’re absolutely not trying to replace mannequins and simulation centers, but maybe you could trade out one of them and buy a couple of these VR platforms. I often talk to other vendors, the founders of some of these other platforms, and we talk all the time about how we should be bundling our software, because they’re apples and oranges, different things doing completely different things that each have their own value, just different possibilities.
Bill Ballo:
Well, just to think about costs, not only do you have the cost of building the space and having it set up the way you need it, and the mannequins and skills trainers and whatever else, but you also have the added cost of IVs, IV bags, tubing, thermometers, blood pressure cuffs, all those things. So if you can halve that need by doing the first part of your simulation work in virtual reality, you’re not only saving money on the number of mannequins and the space you need, you’re also saving money on all those other things that come along with it, gloves and so on. Just that alone, you’re talking about hundreds of thousands of dollars a year that you’re saving. So that’s our goal in our program. Our state allows 50% of our time to be in simulation. The way we’re hoping to divvy it up is 25% of the students’ time in virtual reality, 25% in in-person sim, and 50% at the bedside. So not only are we decreasing the amount of time in the sim lab itself, we’re saving all that money on that 25% of time by not having to buy the IVs and the gloves and the penlights and everything else that goes along with it. Doing 50% simulation and 50% in-person clinical sites means we can increase our capacity, because the greatest capacity challenge we have in education is clinical sites, since those are so hard to get. So if I can say, “Your clinicals are on Tuesday and Wednesday. Tuesday is your simulation day and Wednesday is your in-person day,” and for the other group, “Your in-person day is Tuesday and you do simulation on Wednesday,” I can double my capacity for accepting students into my program, which means more booties in seats, more money, and increased revenue. So there’s so much to be said for how this can not only save you money but even make you money.
Jon Brouchoud:
Yeah, absolutely. Even at an organization or a school that has the resources to do all that, and they buy all that equipment and have all that staff, because of that expense, they tend to be pretty guarded. At all the simulation centers I’ve been to, you have to wait for somebody to let you in. Students can’t just wander around or go in anytime they want. These are very expensive facilities with very expensive equipment. So even if you’re at one of those really high-end organizations that has a big multimillion-dollar sim lab, you can’t just walk in there anytime you want. The access is extremely limited. Once we get to the open door of virtual reality, it’s a lot easier to have a VR lab that you can let students come into. Instructors don’t even have to be on-site. They can be at home or camping, and as long as they have a decent Wi-Fi connection, they can be teaching. So there are a lot of opportunities there to expand access that otherwise isn’t there.
Rick Casteel:
Absolutely.
Jon Brouchoud:
Well, we’re approaching the top of the hour. Is there anything left unsaid that we want to cover? We kind of covered a lot, a lot of stuff here, a lot of good stuff.
Rick Casteel:
Yeah, that’s great.
Jon Brouchoud:
Cool. Well, I guess with that, we can wrap it up unless there’s anything else.
Rick Casteel:
Hey, thanks, everyone.
Jon Brouchoud:
Yeah, thanks for joining. Have a great weekend, and we’ll see you next Friday. Catch up with you later, Mitch. See you, Sean. Yeah. Thanks, Mitch. Thanks, everyone. Thanks, Rick. See you.