Brad Scoggin:
Welcome back to another episode of the XR Industry Leaders podcast. I'm your host, Brad Scoggin, CEO and co-founder of ArborXR, along with my co-host, Will Stackable, CMO and co-founder as well. Today, we're sitting down with Sophie Thompson, CEO and co-founder of VirtualSpeech, and Gema Molero, Project Manager of XR at IE University.
Brad Scoggin:
IE University is a global institution located in Spain. Sophie and Gema are here to talk with us about how they've partnered to integrate VR-based soft skills training into the student curriculum. Thank you both so much for joining us today.
Sophie Thompson:
Thank you very much. Pleasure to be here.
Gema Molero:
Thank you. My pleasure to be here from Spain.
Brad Scoggin:
Sophie, I’d love to start with you. I know you have a very interesting story of how VirtualSpeech came to be, and I’d love to hear that.
Sophie Thompson:
We originally created VirtualSpeech as a solution to my own problem: a fear of public speaking. Actually, it was more than that—I had severe social anxiety. I wouldn’t even order my own food in a restaurant. I would ask my dad to speak for me or ask friends to help. I had a presentation coming up at university that was going to be assessed, and as a very studious person, it was really important to me that it went well. But I was terrified. I was waking up three months beforehand with real dread about it.
I was speaking to my business partner, Dom—at the time, just a friend—who worked at Jaguar Land Rover, where he was using virtual reality. It was actually his idea that VR could provide a unique, psychologically safe space to practice skills like public speaking. A place where I could not only build skills but also gain confidence, without the fear of real-world judgment.
So we created VirtualSpeech, which became the first VR app specifically aimed at overcoming the fear of public speaking. I used it myself to bridge the gap between total avoidance and actually completing that presentation. Within 18 months of creating VirtualSpeech, I went from not ordering my own coffee at Starbucks to being interviewed live on BBC World News.
Will Stackable:
Wow.
Sophie Thompson:
Yeah, so that’s how it started. Honestly, it was an accident. Just a bit of nerdy fun. And here we are, nearly ten years later. We’ve expanded beyond just public speaking into soft skills more broadly—but at the core, it's always about building not just competence but also confidence.
Brad Scoggin:
That’s awesome. I mean that is really powerful to make something out of your own pain, essentially.
Sophie Thompson:
To be honest, no one else was really supposed to see it. We didn’t expect the response we got. Within six months, we had over 100,000 downloads. Back then, it was VR through Google Daydream or even basic cardboard headsets where you’d slot your phone in. It was on the app store, and people just found us. I think a pivotal moment for us was when someone left a review saying the app had helped their son, who has autism. That’s when we realized maybe this could be something bigger than just a solution for my own problem.
Our first release was in February 2016.
Brad Scoggin:
I like to call that out because that’s a big mile marker in VR – the ten-year celebration. That’s when we started as well.
Sophie Thompson:
And I think we agreed we’d have a joint party next year to celebrate our anniversary, didn’t we? An “ally” event or something like that. (laughs)
Brad Scoggin:
Oh we did! And now I’m held accountable by our millions of listeners. (laughs) Gema, I’d love to hear from you.
Gema Molero:
Nice. Thanks for the introduction to VirtualSpeech. To give a little more background about IE University: We’re an international university where all content is taught in English. We’re based in Madrid and also in Segovia. Because of our global focus, our students need to be ready to operate internationally, especially in regions like the U.S. and the U.K.
When we were looking at educational trends, Sophie and her team actually met with my Dean of Learning Innovation before COVID. At that time, VirtualSpeech was working on elevator pitch training, and there was already some connection there. That was before the pandemic, before the explosion of interest in the metaverse and VR.
When that wave hit—around 2022, if I’m not mistaken; correct me, Sophie, if I’m wrong—it was the time when Meta launched Horizon Worlds, and everyone became focused on the metaverse. That’s when our senior leadership team said: “We want VR headsets in the classroom.” The idea was to take a step beyond traditional blended programs, beyond 2D content like podcasts, films, and video conferences, and move into fully immersive education.
At first, it wasn’t just about our blended programs. We wanted to explore how we could use VR hardware and software for students on campus too. That’s where our collaboration with VirtualSpeech really started to take shape. Back in 2022, we launched an initial pilot, but even then, we knew that we had to “evangelize” a bit. I use that word intentionally—it felt like we needed to spread the message and encourage faculty to adopt this innovation.
IE University already has a reputation as an innovative institution. But we wanted to push further. We realized that regardless of which school a student belongs to, public speaking and communication skills are essential. Training in leadership and presentation skills is core to our mission. So, we created a transversal program that cuts across departments.
In 2023, we made VR-based public speaking training compulsory for over 1,000 master’s students. That was the first big step—not just bringing hardware and software into the classroom but integrating them fully into our curriculum. And behind that, of course, was a major effort: training faculty, preparing infrastructure, and making sure everyone shared the same vision.
After two years of hard work across different programs, we’re now expanding. Starting this September, we’ll have between five and seven new courses using VirtualSpeech as a mandatory part of public speaking training. I can explain more later about how we handle logistics, delivering both software and hardware to students.
Brad Scoggin:
It’s great to hear at least one good thing came from Zuckerberg’s “go crazy” moment – IE University has fully embraced VR. I’d love to hear how students and faculty have responded to the adoption, both anecdotally and measurably.
Gema Molero:
It’s important to mention that adoption wasn’t easy. In 2022 and 2023, finding early adopters was a challenge. Faculty were hesitant—just like what we’re seeing now with AI in many organizations. But the ones who adopted VR early saw its value, not just for students but for themselves as educators.
Working with VirtualSpeech specifically has given us something really valuable. Students can now train at home, using self-guided modules. We use metrics to track their progress, and that benefits both students and faculty. It gives faculty insights they wouldn’t have otherwise—helping them understand where students need improvement without the need to listen to every individual practice session.
That’s really been a game changer. It’s helped us on both sides: supporting students in developing their skills and supporting faculty in tracking and guiding that development.
And now, with artificial intelligence, things are evolving even more. Sophie can explain more about this, but after students complete their virtual simulations, AI-generated feedback gives them immediate guidance. They receive metrics that tell them where they need to improve and what’s working well.
The technology really helps transform the way we’re teaching and learning. So that’s been a valuable asset for us.
Will Stackable:
So interesting. Sophie, I want to go back to you. What was your initial approach to working with a university? And how did you choose which VirtualSpeech modules would fit into their academic structure?
Sophie Thompson:
As Gema mentioned, we’ve been working together for quite a while now—even pre-pandemic. I think when we were first approached by IE, I was immediately impressed by the university’s hunger for innovation and their commitment to improving how students learn. From the beginning, it felt like a very natural partnership. We were both aligned not only in focusing on learning but also in using technology to enhance that learning.
We’ve had a collaborative approach from day one. Even with that very first pilot class—I think it had maybe just 20 students—we were listening carefully to feedback. From there, over the past five years, that feedback loop has continued. Gema and the IE team listen to students and faculty about how VR can be incorporated into the classroom and how it’s received, and we take that feedback seriously. We’ve built new features based on what we’ve learned from their students. If 100 IE students are asking for something to improve their experience, there’s a good chance thousands of other students globally would benefit from the same feature.
In terms of the modules, the first one—or at least one of the early ones—focused on storytelling and presentations. From there, we evolved into other areas. For example, we now have a Shark Tank-style experience, where students can pitch ideas in a simulated environment.
Will Stackable:
I probably should have asked this earlier, but can you explain how VirtualSpeech is actually used day-to-day in the university? If a student puts on a headset, what exactly are they experiencing?
Sophie Thompson:
Sure. If we take public speaking as the first example, students put on a headset and select from different environments of varying sizes, with audiences of different sizes. They can upload their own presentation slides, notes, or questions, and they deliver their presentation to a virtual audience.
The virtual audience listens and reacts to what the student is saying. They’ll ask questions that are specific to each presentation, so it’s a different experience for every student. After their session, the student receives detailed feedback. That feedback covers two key areas: delivery and content.
On the delivery side, we analyze things like pace, volume, tone, filler words like “um” and “uh,” and even eye contact. On the content side, we assess how the audience is likely to perceive the speaker—for example, if something in their wording sounded aggressive, or if they could have sounded more empathetic by phrasing things differently.
That feedback system also applies in other areas of the platform, like role-play conversations. For example, in a job interview scenario, students can upload their CV, the job description, and receive a tailored interview experience. They’ll answer questions specific to their application and receive feedback—both on content and delivery—plus suggested responses.
The next step we’re adding is an AI coach, which actually launches on June 3rd. That will help complete the learning cycle. After practicing, receiving feedback, and reflecting, the AI coach will help students apply what they’ve learned to real-world situations.
Brad Scoggin:
Two thoughts come to mind. First, when you initially explained this, I assumed it was mainly about overcoming the fear of public speaking. But it sounds like it’s much more comprehensive than that—there’s actual coaching to improve the quality of their speaking. Can you explain what that feedback loop looks like?
Sophie Thompson:
Yes, absolutely. Students can practice by standing in front of virtual audiences or participating in role-play conversations. But beyond practice, there are in-app training modules as well. For instance, there’s a 10-minute exercise focused on building confidence and mindset before presentations. There’s also one focused on eye contact. Students can work through these exercises before practicing.
Then, after their session, they receive feedback that’s grounded in established learning design frameworks and proven theories related to communication and soft skills.
Brad Scoggin:
Interesting. And Sophie, I’m curious—since you personally struggled with a fear of public speaking, how well does the VR experience replicate that fear? When someone puts on the headset and stands in front of a virtual audience, does it trigger the same kind of nerves as real-life public speaking?
Sophie Thompson:
It definitely evokes that emotional response. Logically, of course, you know they’re not real people, even in the role-playing simulations where the avatars are obviously computer-generated. But despite knowing that, the experience still triggers real physiological reactions.
We noticed after the pandemic, people started using the public speaking content even for small meetings, simply because they weren’t used to standing up in front of others anymore. After being “safe” behind their laptops on Zoom for so long, standing up and having attention focused on them—even in a virtual environment—brought back nerves.
Will Stackable:
In their boxers! (laughs)
Sophie Thompson:
Exactly! (laughs) But yes, to answer your question directly: I spoke to a client in the insurance industry recently. One of their directors—someone who’d been working for over 20 years and had a lot of experience giving presentations and handling difficult conversations—used VirtualSpeech and started sweating during the session. Even though she logically knew it wasn’t real, the VR scenario made her feel put on the spot. The avatars were looking at her, waiting for her response, and she felt that pressure.
Brad Scoggin:
That’s amazing. Will and I actually come from the location-based entertainment space, and it reminds me of Richie's Plank Experience. People know they’re standing on the floor in a safe environment, but they get into VR, walk out onto that plank 20 stories up, and completely freak out. People are falling over, sweating, panicking—even though they know it’s fake. It’s wild how powerful immersion is.
Will Stackable:
Absolutely—the power of immersion is incredible. If you haven’t tried VR, it’s hard to explain. From the outside, it looks silly—someone waving their arms around in a headset. But once you put it on, you suddenly feel something. It’s not like watching a movie. There’s that famous Stanford experiment where kids watched Oscar the Grouch dancing on a screen, and afterward said, “We watched a video.” But when the same kids saw him in VR, they said, “Oscar was here. We talked to him.” The way the brain processes VR is fundamentally different. It forms memories.
I want to go back to something you said earlier, Sophie. You mentioned you’re now building an AI-powered engine to coach students. You talked about uploading resumes and job descriptions, which is amazing—because that means the training isn’t just generic, but completely tailored. Could you explain how that works? How does AI change the experience for students?
Sophie Thompson:
Yeah, I mean, the AI makes the learning much more personalized. Traditionally, especially in enterprise environments, learning methods are very standardized. We've all done those boring e-learning courses that aren't interactive, and whether you're terrible at public speaking or excellent at it, you’re often given the exact same course. There's no meeting people where they are—no tailoring to individual needs.
What AI allows is completely personalized learning. You don’t need pre-scripted scenarios or a one-size-fits-all approach. It also means learning becomes continuous rather than a one-time module you complete and move on from. AI makes the experience adaptive—so as you improve, the system can challenge you to improve further.
Brad Scoggin:
Gema, I want to ask you a question. Sophie got to talk about a lot of the fun and positive outcomes. But you have the challenge of managing over 600 devices across multiple campuses. Could you speak about how you do that?
Gema Molero:
The first challenge for us was figuring out how to control all of this. Everyone knows how a traditional library works—checking out a book, returning it—but we had to trust our students to use the headsets responsibly. So we set up a loan period of four to seven days, which worked. But of course, we also needed a platform to help manage and track the devices, especially as we scaled. At first, we didn’t have enough devices for everyone, so we needed a strategic plan.
The big benefits for us were automated data collection and tracking. Especially in our blended programs, where we use our own metaverse, we sometimes get messages from students saying, “I can’t access the app,” and then I can check their device remotely and see that it hasn’t connected to Wi-Fi since they left Madrid. So, from an IT service standpoint, it’s been incredibly helpful to have that visibility.
Initially, we had about 300 headsets in the first year, which grew to 700. Right now, we’re only lending out Meta Quest 2 headsets because they’re the most durable and easiest to move around Madrid and even to other countries. Quest 3 and Quest Pro devices stay on campus.
Another big point is the software itself. VirtualSpeech doesn’t just give real-time feedback to students; it also tracks engagement. Each faculty member can see how their students are performing across two, three, or four sessions. This data helps with grading, too. We’ve realized that if an exercise is compulsory and linked to a syllabus, students will complete it. If it’s optional, many won’t. That’s just human nature, especially for busy MBA students who need to see the value in what they're doing.
At first, we introduced VirtualSpeech primarily as a tool for training students in job interview skills before graduation. But adoption wasn’t immediate. It was important to connect the VR exercises directly to academic outcomes. I want to specifically mention one faculty member, Javier Burnett, who Sophie will recognize—he’s been a key advocate and early adopter of VirtualSpeech, especially in public speaking training. Having champions like him makes a huge difference.
Once faculty understand how it works and see the benefits, adoption spreads. But that was one of our biggest challenges: making sure faculty knew how to integrate VR into their teaching in a meaningful way.
Another factor is the university’s broader mission. Leadership wanted headsets in classrooms—that was part of their strategic vision. And from there, we worked to integrate VR into the academic areas in ways that aligned with that mission. So yes, it’s been a challenge, but today we have over 700 headsets being used daily across different campuses. And that’s the result of having a clear strategy around using technology to enhance teaching and learning.
Brad Scoggin:
That’s great. You mentioned the people side of change management, which can be easy to forget. I joke that step one of VR adoption is ‘donuts’. Bring IT donuts (laughs), and that’s how you start a successful program. As we move towards a wrap, I’d love to give you both a chance to give advice to someone who’s considering bringing XR to their university. What advice would you give?
Gema Molero:
For me, it has to start with a clear mission and vision from the academic side. I always say, I’m not here to bring technology into classrooms just for the sake of technology. If that’s what someone wants, they can come to our XR Lab anytime to experiment. But if we’re putting VR in classrooms, it has to directly connect to what’s being taught. Otherwise, there’s no point.
It’s also critical that faculty understand what they’re asking students to do. If they don’t fully grasp the purpose and value, the students won’t either. For us, it’s a win-win strategy: when faculty and students both see the value, adoption takes care of itself. Of course, it’s great for marketing too—you can post pictures of students using headsets on LinkedIn—but for me, the real success is how faculty are being challenged and trained to rethink their teaching.
This is what makes IE unique: we’re not just challenging students, we’re challenging faculty as well.
And I always remind faculty that this work isn’t for today’s students—it’s for the students who’ll arrive next year. This isn’t the future. This is now. If you’re not training yourself in these technologies, you’re already behind.
Sophie Thompson:
Wow. How do I top that? I’d say my main piece of advice is to measure results from the start. Too often, people plan everything around rolling out a program or a pilot but forget to plan how they'll measure success. It’s important to capture both qualitative feedback from students and quantifiable learning outcomes.
For example, at IE, our latest data shows that students increased their confidence by 24% and their skills by 18% after just two sessions. That’s not a one-time survey—we’re consistently tracking those results. If those numbers ever change, we can intervene immediately. But hopefully, we’ll only see those improvements continue.
My second point would be: have a real champion. IE is unique in how quickly they've scaled up to 700 headsets, but that’s because they had internal champions, listened to student feedback, and worked closely with us as software providers to keep optimizing the learning experience.
And my final tip: make it as easy as possible for yourself. Choose intuitive headsets, intuitive platforms, and intuitive software. Even though many students are familiar with VR, a significant portion aren’t. The goal is to eliminate as much friction as possible. Even if something is slightly more expensive, if it’s more user-friendly, it’s worth it.
Gema Molero:
I’ve shared some of our business case data in the chat. In that report, you can see the real improvements that Sophie mentioned.
Sophie Thompson:
And that’s what’s so powerful for students. Increasing confidence by 24% in just two sessions—basically 20 minutes—can open up so many opportunities for them, especially at a young age.
Brad Scoggin:
In less than 20 minutes, students using VirtualSpeech increased their public speaking confidence by 24%. That’s incredible. Gema, we’ll make sure to link that report in the show notes. Thank you both so much for your time. I know everyone’s schedules are crazy, with people joining from all over the world. We appreciate you making time for this, and we look forward to connecting again soon.
Sophie Thompson:
Thank you very much.
Gema Molero:
Thank you so much for your support—and for giving us this space to share.
Brad Scoggin:
I said this at the beginning, but it’s just so cool to see someone take their own real personal challenge and turn it into a business and solution for themselves and thousands of other people.
Will Stackable:
I didn’t mention this during the interview, but before we started, I was telling Sophie that when I was in high school, I was terrified of public speaking. Somehow, I signed up for speech and debate—actually, I think my counselor put me in it. On the first day of class, we had to get up in front of everyone and do a mini improv. I thought I was going to die. My face turned red; it was horrible. But it ended up being a great experience and helped me overcome that fear.
It reminded me, hearing Sophie talk, that public speaking really is a bigger fear than death for a lot of people. What a cool application of VR—to help people work through that in a safe, controlled environment.
It also highlights the importance of finding the right content provider—someone like Sophie who understands your use case and can help tailor the solution. VR isn’t just something you bolt onto a program—it needs to be intentional and thoughtfully implemented.
And, as Sophie said, you need to measure the outcomes. Otherwise, you won’t be able to scale, improve, or even prove that it works.
That’s something we’ve been thinking about a lot lately with ArborXR Insights, which is now in beta. I think we have about 40 or 50 companies testing it. Being able to track what's happening in the headset and integrate that data with your LMS or BI tool is so important—not just to understand ROI, but even just to answer basic questions like, “Did someone pass or fail?”
And often, that tracking piece is overlooked at the start of an XR program.
So yeah—great interview. I’m excited to catch up with Sophie again soon.
Brad Scoggin:
Absolutely. And I’m going to end this a little differently. We’ve been having an internal debate as to whether or not people actually listen to the outro. So if you do listen to the outro, email me at brad@arborxr.com and write “Gotcha” and we’ll send you a little gift of some kind. Thanks for listening. If you’ve made it this far, be sure to subscribe wherever you get your podcasts. We’ll catch you next time.