Episode Summary

In this fascinating episode, we go behind the scenes at MDA to explore how they are leveraging XR to prepare for outer space operations. MDA chief software architect Bart Verzijlenberg and VR lead developer Jessa Zabala give us an inside look at their pioneering work with augmented and virtual reality.

Discover how MDA utilizes VR headsets to enable training for outer space operations. Bart and Jessa share valuable insight into how VR is enhancing pre-mission operations planning, with astronauts and flight controllers able to immerse themselves in realistic experiences prior to critical missions.

Additionally, we discuss how VR assists onboarding and continuing education for MDA’s growing team. New hires can explore full-scale models of the International Space Station, gaining an intuitive understanding of MDA’s complex operations and technology. The episode also covers MDA’s use of XR for engaging demonstrations and mission planning.

Verzijlenberg and Zabala share challenges faced during initial XR implementation, including security constraints and hardware limitations. However, increasing comfort with the technology has enabled streamlined workflows and multi-site training. Listeners will come away understanding how XR has become an integral part of MDA’s drive to push boundaries and helpful advice for implementing your own VR program.

Key Moments

  • Personal background, what led to VR and MDA (04:20)
  • Process of bringing VR into MDA (05:36)
  • A-ha moments (07:32)
  • How MDA is using XR today (08:45)
  • ArborXR helping MDA (18:53)
  • Challenges with deploying XR (26:29)
  • Where do you see XR going from here at MDA? (33:58)
"We’ve been able to use ArborXR with the Canadian Space Agency because it's so secure. And then, we can now distribute apps that we own, and we have control of the headsets. And even with future customers, we can say, ‘You guys can buy the headsets. We still have control over the deployment. You won't have access to all of the code and that sort of thing anymore,’ which is a huge win for us."
Jessa Zabala
Lead VR Developer for MDA

About the Guests

About Bart
Bart is currently the lead software architect for the Gateway External Robotics System (GERS) heading for lunar orbit. He is a founding member of the DREAMR Lab (Dynamic Robotic Emulation and Mixed Reality), where he explored opportunities for mixed reality to add value. Starting with simple lessons, the work quickly grew to include remote training (during COVID), mission review and planning, and test support.

Bart Verzijlenberg has been with MDA’s software department since 2010, working on a wide variety of robotic systems and next-generation technologies. Projects have ranged from four medical robotic devices (an MRI-based biopsy system and multiple generations of a robotic exoscope widely used in neurosurgery) and testbeds for various flight systems to next-generation robotic development for Lunar Gateway.

About Jessa
Jessa Marie Zabala is the Virtual Reality Lead Developer at MDA and is the DREAMR Lab’s Project Lead. She develops and maintains Meta VR apps used for training astronauts, preliminary mission operations planning, onboarding new employees, and public outreach.

As the DREAMR Lab Project Lead, Jessa is responsible for leading major robotic upgrades in the DREAMR Lab, such as replacing the joints of their zero-gravity robotic arm used for ground testing.

In her five years at MDA, she has worked on flight hardware projects, such as repairing one of the latching end effectors of Canadarm2 and designing flight support equipment for sending spare joints up to the International Space Station. Jessa has also worked as a mission planner for MDA on the OSIRIS-REx mission that recently brought back a sample from the asteroid Bennu.

Her previous experiences with hardware and mission planning have been invaluable to her work in VR development for astronaut training.

About MDA

Serving the world from Canada, MDA is an international space mission partner and a robotics, satellite systems and geointelligence pioneer with a 50-year story of firsts on and above the Earth. Today, the company is leading the charge towards viable Moon colonies, enhanced Earth observation, communication in a hyper-connected world, and more. Together with its many intrepid partners, MDA is working to change our world for the better, on the ground and in the stars.

Episode Transcript

Brad Scoggin: Hey there, welcome to “XR Industry Leaders” with ArborXR. My name is Brad Scoggin, and I am the CEO and one of three co-founders of ArborXR. And we’ve had the opportunity of working with thousands of companies since 2016, and we’ve learned a ton about what it takes for XR to be successful in your organization.

Will Stackable: And I’m Will Stackable, co-founder and CMO. This podcast is all about interviewing the leaders who are on the ground making XR happen today. True pioneers in the space from Amazon, Walmart, and UPS to Koch, Pfizer, and beyond to uncover the pitfalls, lessons learned, and secrets that you can use to help grow XR in your organization.

Brad Scoggin: All right. Well, today, we get to sit down with MDA, and this is one of the rare occasions where we get to interview two people.

Today, we have Bart Verzijlenberg with us. Bart is the chief software architect for the Gateway Robotics System. He’s also the co-founder of the DREAMR Lab, which is the VR portion of MDA.

We also have Jessa Zabala, who is the VR lead at MDA, and she is the project lead at the DREAMR Lab, which means she makes sure everything runs well. Great to have you both on the show today.

Jessa Zabala: Good to be here.

Bart Verzijlenberg: Yeah, nice to be here. Thanks for having us.

Brad Scoggin: Definitely. I always like to start with a little bit of personal background, so maybe if you guys would each take a turn and just tell us a little bit about your work history prior to MDA and what led you to MDA, and to VR specifically.

Jessa Zabala: I am going to let Bart start because I actually haven’t heard his story.

Bart Verzijlenberg: Oh, boy. I came straight out of my master’s. I was working in underwater robotics, and MDA doing space robotics is a pretty cool place to be.

At MDA, I started out with medical robotics, MRI-compatible breast biopsies, a little bit of neurosurgery, and then spent a good number of years working on more detailed medical robots for looking inside people’s brains and helping with surgery there, so I stood over top of an open brain. That in turn led to other safety-critical things like the Gateway program that I’m on now, and in between, I played with VR quite a bit in our robotics lab.

Jessa Zabala: Mine is a pretty random story. For me, I was just really, really into space and everything like that. It was, at age 15, seeing a picture of the International Space Station and the Canadarm in a history textbook, and I was like, “That. I’m going to do that.”

I went through university and just decided this is where I wanted to end up: at MDA, working on those robotics, as the end game in my career. I was lucky enough to actually get hired right out of school, and they gave me a HoloLens on my first day and said, “Hey, we have an app on this headset that interns have been working on for a really long time, but it’s broken. Can you fix that for your first six weeks?”

And I went, “I wasn’t hired for software. You guys know that, right?” And they’re like, “Give it a try.” And the rest of my story just followed from that, which is pretty cool because, when the DREAMR Lab got interested in mixed reality and the HoloLens, Bart jumped on it. He started working on that throughout the first year of COVID and then looped me in around the end of 2020, when we started really working with the Oculus for the VR portion for training our astronauts. And yeah, a few years later, I am the VR lead now, after training with Sensei Bart.

Brad Scoggin: I love that. Okay. We should take one step back because you just said “training astronauts.” Arbor is a fully remote team. We actually had our all-team call this morning. We were talking about this interview coming up, and we’ve got a big chunk of our team in Canada. And when we mentioned interviewing MDA, working with MDA, and then, also, the Canadarm, the Canadians all got very excited. Maybe, for those who are not Canadian, one of you could just take a minute and tell the world a little bit about what MDA is and the significance of the Canadarm.

Bart Verzijlenberg: Go for it, Jessa. It’s your passion there.

Jessa Zabala: Okay. Yes. MDA designed and developed the Canadarm, which put together the International Space Station when it was first launched. This is just a huge feat for Canada; it’s our big claim to fame. The International Space Station is up there, and yes, NASA gets to rubber stamp everything that goes up there, but we built it so … Exactly. It’s on our money. The Canadarm2 is on our money, which is really cool.

The Canadarm built the International Space Station, and the Canadarm2 and our dexterous robot called Dextre, which has two arms, help maintain the station. That’s essentially what MDA does. We help maintain the International Space Station, help bring food up to astronauts, and stuff like that. Yeah.

Brad Scoggin: So it’s slightly a big deal.

Jessa Zabala: Slightly.

Brad Scoggin: Very, very cool. I love that. Yeah, that’s awesome.

Bart, maybe you could tell us, as a co-founder of the DREAMR Lab, that being the VR portion, what was the process of bringing VR into MDA?

Bart Verzijlenberg: I wish it was something crazy. Initially, we looked at the HoloLens and we were like, “Well, that’s really cool. Let’s play with that and see what we can make it do.” And we started toying with it. In the DREAMR Lab, we have a couple of robots, which gives us a bit more of a physical manifestation of things. And the HoloLens specifically let us play with overlaying virtual data on top of real-world data. When you touch a robot, it has a sense of touch to it. Being able to visualize, right in front of you, how hard it pushed, instead of having to look at a computer, was pretty useful. We started playing with both mixed reality and virtual reality just to see what we could do.

And initially, it’s very easy to go down the path of “this is really cool.” We tried to quickly narrow it down to what adds value to our operators or to people. Because once you get over the flash of the technology, what are you left with?

That’s where we started looking at it: we have all these detailed models, and we’re doing a lot of training of astronauts, but it’s not easy to understand the training when you just look at pictures or video; it’s much easier to experience the reality.

At that point you get a full-scale model of the International Space Station with the Canadarm2 and SPDM. We started playing with that. We got in touch with Kevin Nasimok, who is one of the astronaut trainers for MDA. We started bandying ideas back and forth: how could you use this? What is useful? We would try a few things. We would join up together in a virtual space in an online room, try the application, and see what was useful, and it just snowballed from there.

Will Stackable: Have you had any big aha moments? And even, I’m curious, during COVID, did this change the urgency or the need? Have you had any of those moments where you thought, okay, this is something we just couldn’t do before XR?

Bart Verzijlenberg: Yeah, for sure. Obviously, we started playing with this months before COVID hit. Literally, I grabbed the HoloLens from the office, went home and didn’t come back for a couple years.

Jessa Zabala: Yeah.

Bart Verzijlenberg: One of the aha moments was, the borders were shut down. Kevin had to train somebody in Houston and you could do that through a Zoom link and see each other physically, but we tried using the HoloLens instead. And the real aha moment there was that Kevin could virtually see the astronaut nodding along and looking and getting that sense of presence and connection, and you just couldn’t get that through the Zoom link in the same sense.

And same thing when we were collaborating to figure out the applications. We stopped meeting on Zoom and we would just join in the application. Again, you’re talking with three or four people and you look at them, and you get a sense of the physical interaction. And especially during COVID, that physical presence was missing, and the VR, ironically, actually gave you some of that back.

 

Will Stackable: Brilliant. Fast forward, it’s now October and we’re past that initial COVID shock. A lot of people are back in the office. How are you using XR today and what’s changed? What’s evolved?

Bart Verzijlenberg: I think, in part, even though we can travel again, it doesn’t mean that you want to. It’s expensive. Doing testing on hardware, for example, can be pricey; or rather, it can be expensive to bring people to the testing. Using VR to allow telepresence, to participate in the test and see what’s going on, is useful. Anomaly detection is helpful.

We continue to do a lot of training, and at this point, it’s not so much that you couldn’t do it using the standard classroom approach, it’s that it is nicer to have that all-encompassing environment where you are on station and you are actually seeing the real deal.

And actually, as an anecdote there, Tim Kopra, who was our … He ran our division for quite a while; he is one of the astronauts that was on station. And I walked him through it with a HoloLens, and we exploded the station full scale, and we were standing on top of one of the habitat modules outside the station. And he looked at me and he said, “This brings me back to doing a spacewalk.” He got the chills from being back on orbit. That was very awesome, just to get that presence. But also, for anybody who can’t be an astronaut, it does give you that sense of, oh, wow, I’m seeing this.

As part of that, for the next generation, the Gateway program that NASA and Canada and other countries are participating in, we took all those models and, again, as an extension, now we can walk around the Gateway station virtually and experience the scale and the differences from the ISS. Yeah, again, it just gives you that presence.

Jessa Zabala: Yeah. We’ve also-

Will Stackable: I love that. Great use case. Go ahead.

Jessa Zabala: We’ve also found that our students are just way more engaged in the virtual space because there’s this sense of freedom where you can stick your head in this model, you can touch stuff, you can pick stuff up without feeling like, oh, this is hardware, this is space hardware.

A huge part of it is that we can take off all of the shells on our hardware and they can see it moving, which is not the same experience they get when they have to fly to any of our sites. We have engineering models, but they’re all encased in metal, obviously, so you only get the outside. Whereas here, they get to see all of the gears shifting and things moving. It’s like, “Oh, that’s how that moves? That’s what you guys are talking about when you’re saying that?”

It’s been really cool. And just the fact that our training sessions are now multi-site. People don’t have to fly to Houston. We’re actually facilitating sessions between St-Hubert, Houston and Brampton all at once where Brampton is providing tech support on the ArborXR side, and then, St-Hubert has the trainers and Houston has the students. And it’s just crazy how well this works. And especially when the students finally meet their teachers in person, it’s like, “It’s you, but you’re in person.”

A crazy thing is that we worked with Kevin every day for two years during COVID. Did not meet him in person for two years. We finally meet him and you’re already best friends because you feel like you’re with this person physically every day.

And I remember this one time Kevin and I were having a meeting in VR and he sneezed, and I actually flinched, even though he wasn’t there.

Bart Verzijlenberg: Because I know Kevin will ultimately end up listening to this: after two years of working together, the height adjustment on the headsets was not necessarily representative. When I met him the first time, he came up to about my mid-chest level. He’s a very short gentleman. Very nice guy. But it has been no end of fun for us to realize that we’ve known each other for two years, and some of the scale difference just never quite hit home.

Jessa Zabala: But Bart is also a giant, so that was a shock factor for me too.

Will Stackable: That’s amazing.

Bart Verzijlenberg: One of the other things we’ve done: we’ve run probably 250, maybe 300, of our fresh hires through it. MDA has been going through a growth spurt, so rather than trying to go through slides, we do a lot of our onboarding in VR now. People get to walk around the station to understand what they’re working on, including our mission ops people, who look at how we want to use the robotics in the future. And they’ve indicated that after a year of doing this work on their computer on a flat screen, the insights that they gained in just an hour of being in virtual reality were enormously powerful for them, which was a super exciting validation of the work we were doing.

Will Stackable: Wow.

Jessa Zabala: And we actually didn’t use to have this kind of training program, because we never had hiring sprees like this. Maybe one or two engineers got hired every year, and that person got paired up with a senior engineer and would work with them as a junior for a year or two before they were independent. But due to the amount of work and growth that we’ve seen, that’s just not feasible anymore. We were like, “Hey, we think we need to onboard these people properly. Can we use this tool?” And they were like, “Oh, awesome. We’re glad that you guys already developed a tool like this.”

It’s definitely seen site-wide use. We have so many tours booked. It’s crazy. And it definitely raises morale, from the people in finance to our engineers to our interns. It’s huge.

Will Stackable: I want to zoom out. I’m going to ask the question that I think a lot of people listening probably are asking. Your use case is pretty sophisticated. There are a lot of companies right now that are just early on, testing out: is VR or AR even something we want to use? How can we use it?

Your team has been doing this for a while and you’re using it in multiple different ways. You have a whole fleet of headsets. Could you just give me sort of the 10,000 foot view on, you mentioned students, then, you mentioned onboarding with new employees, then, you mentioned training astronauts, and I know you’re also using it for pitch presentations. Just give me the short 10,000 foot view, where is … I’m almost thinking this is … This new technology, you’re using it so many different ways. Give us the high level.

Bart Verzijlenberg: Of how we’re using it or? Sorry.

Will Stackable: Yeah. Yeah, exactly, the bullet list of, okay, engineers here, students here, astronauts, new presentations. Just give us the laundry list of all the ways you’re using it.

Bart Verzijlenberg: Oh. It’s a couple of different major programs, effectively, that are involved. For Gateway, there is mission analysis that we’re doing with it, just making sure, seeing how does the arm move, what is the perspective, is there a reach limitation, will we hit something physically?

For all the various configurations on the station, it’s a training thing where you can see what the different modules are called, and looking at hardware, so grabbing a piece of hardware that we would normally have to build and taking it apart into pieces. Especially for the ISS, we were able to take all of our tools, explode them, and then, again, go through understanding how these pieces are put together.

And a lot of it is related to that. I’m not sure … A lot of it is for an overview of the general context. I guess we do it both for [inaudible 00:16:52] and for the ISS. We do it, to some extent, with nuclear as well. And then, the other part is in the DREAMR Lab, where we have a set of emulation systems in place so the robots pretend to be certain physical things, and then we overlay a satellite on top of that, for example, so that when you physically hit the robot and it moves over to the side, you can visually see the satellite move with it. It’s more of a context cue in that case. It’s easier to understand what you’re looking at when you have that visual overlay of the physical hardware.

Jessa Zabala: But we’re also able to control our robots in the lab through the headset as well. Bart made that available. It goes from having three people in the lab to just one person who can do everything. They can control the robots through the headset, they can see what they need to see with the data, and they can hold the e-stop for the robots if things become unsafe. Yeah.

Bart Verzijlenberg: I think a big part of it, ultimately, it’s the overview, especially at the scale with which we work and the expense of creating the real physical elements, if you could even house them in a space. Having very early life-sized access and context for those scenarios is incredibly helpful.

Just being able to walk around a space station, sitting inside it: I have never seen anything other than diagrams of the Gateway station, but I already know that it is far smaller than the International Space Station. It is a cramped space compared to the ISS. I know that the XDA and XLA, which are the two robotic systems, are far smaller than the equivalents on the ISS. Just getting that context of what you’re building is much easier when you can say, oh, this is what it looks like, because it is just in front of you.

Will Stackable: When you mentioned students, did you mean astronauts? Are those …

Jessa Zabala: Yes.

Bart Verzijlenberg: Yeah.

Will Stackable: Okay. So you call them students. Interesting.

Jessa Zabala: Yeah. Sorry.

Bart Verzijlenberg: It’s astronauts in training.

Brad Scoggin: You wouldn’t think of an astronaut as a student.

Jessa Zabala: Yeah.

Will Stackable: Cool.

Jessa Zabala: We say student because we train astronauts and flight controllers.

Will Stackable: Got it.

Jessa Zabala: Every aspect of running ops, essentially, they’re a student, because it could be anybody wearing that hat, right? But yes, it’s weird to think of them as students, but they’re trainees until they launch.

Will Stackable: No, that makes more sense. I was thinking students in a classroom like K through 12 thinking, oh, is that an extra thing your team does? Okay, I think I’m getting it now.

The last piece that I wanted to hear about, and then I know Brad’s got questions on challenges: do you use VR or AR in presentations, either for commercial applications or for government organizations?

Could you share? I know there’s a story there as well. We heard a great story about how you used Arbor in one of those, but I think it’s also just helpful for companies listening that are interested in using VR for presenting complicated information or presenting visuals that don’t show up as well in a diagram or on a PowerPoint. Put somebody in a headset and stick them in the space station, and that’s a whole other level of immersion. Maybe you could share just a little bit about that use case.

Jessa Zabala: Yeah. We definitely use it for demos and presentations and pitches. I know a lot of the people that come to us, the business-side folks, want the wowza factor. And then, for us as engineers, we’re doing it because this is a use case, so our audience is more like, “We’re trying to sell the engineers on this, that this is a practical thing.”

One of our recent presentation pitches was for one of our, hopefully, commercial robotics projects. I can’t say the name of the customer since this is still in the works, but they were doing the presentation and they wanted a VR aspect to sell this customer: “This is what it will look like when you’re there as an astronaut if you take us on. This is what it will look like. This is your scale. These are the ops that they can do.”

We were going to give this whole presentation of the operations that they want our robotics to do, and we were like, “Well, we’ve already done this in VR.” And we were so excited. It looked so realistic. This was the whole selling point. We weren’t just going to show them CAD models. This already looks like it has flown, and we’re showing it in the context of, the Earth is there and the Moon is there, and it is 10 minutes before showtime.

And keep in mind, we went to do a test run the day before, so we were so sure, we were like, “This works.” So 10 minutes before, they’re like, “Oh, my gosh, everything is glitchy. What is happening with this build? We tested it. I don’t understand.” And we’re like, “Okay, roll back one version.” “It’s still glitchy.”

And amazing people on site. I feel like we were all so calm for what was happening here because everyone was like, “Okay, that doesn’t work. Roll it back. That doesn’t work.” I’m in Brampton rolling things back on ArborXR with the version control, and every 60 seconds or whatever, the three people that are there getting things set up are testing that version.

And finally, we get to our last version, and we hear footsteps coming. And we find one that works. We’re like, “It’s not glitchy, it works, okay.” And then, I’m sitting there in VR because I’ve been testing with them remotely, waiting for the people to come in. And then, they come in and we’re like, “Yeah, hi,” virtual handshake, everything’s fine here.

And it goes so perfectly. The suits, I want to call them suits because they’re more executive type people, they were impressed, but the reaction from their VR team was what was perfect to me because they were super excited from a technical standpoint. They were just nerds. They were zooming around asking about things, “How do we do this? How do we do that? Let’s collaborate.” That was just very exciting for us. And it could have gone south pretty fast, so we were so happy that we were able to do that remotely. This was happening in Houston and I was in Brampton running all of this. That was amazing.

Brad Scoggin:  Yeah, that’s awesome. It’s fun [inaudible 00:23:53]-

Will Stackable:  Our engineers are going to love to hear that.

Brad Scoggin: Yeah. Sometimes it feels like we’re in a closet building stuff, so it is fun to hear we helped somebody for real. That’s very, very cool. Okay. Here’s my question.

Will Stackable: I just have to say, too, that scenario, we had something like that in mind. We’ve heard from a number of customers, they’re having to ship thumb drives or Google Drive links. In that scenario, it’s not an option. You’re trying to upload something to Google Drive, send a link over. Good luck shipping something. There’s no way to … So really, you needed something that was instantaneous.

Jessa Zabala: Yeah. And we had multiple issues with this, not just from the ArborXR standpoint where we had to change versions. We had shipped our own headsets, they had gotten lost in transit, so they had to buy new headsets and set them up with ArborXR.

Will Stackable: Oh, no.

Jessa Zabala: We’re lucky that we have a team down there that we speak to often. We’re very connected with them. And we remotely helped them set them up. Then, we could see them on ArborXR and we’re like, “Good.” And then I’m like, “Push the build.”

Will Stackable: We’ve heard of people on Arbor tracking down stolen headsets that disappeared.

Jessa Zabala: Yeah.

Will Stackable: And they showed up at somebody’s house and said, “Hey, you’ve got our headset.” That’s definitely, the remote location …

Jessa Zabala: I do that as well. It’s so creepy. I let everyone know. I’m like, “If you’re taking a headset home, I know where you live.”

Will Stackable: I know where you live.

Bart Verzijlenberg: This is counterproductive for me, Jessa. I have a habit of taking stuff, so this is not good.

Jessa Zabala: I know. That’s how I know you were the one that took that one headset that suddenly turned up again.

Brad Scoggin: And now, we all know.

Will Stackable: Now, we all know.

Brad Scoggin: Well, that’s funny. It’s funny too because it wasn’t … When we made the transition as a company from entertainment to enterprise, and we started interviewing companies, like Will said, they weren’t just mailing thumb drives back and forth. They would ship the whole headset back. We need to get this headset updated. We talked to a guy from a Fortune 500 company, he’s got boxes all around him. This is probably two years ago. He was the shipping guy. He was the guy who would ship headsets back and forth to update them. It’s funny that we’re not that far removed from a weirdly archaic approach to such a forward thinking technology.

Bart Verzijlenberg: Yeah.

Jessa Zabala: Mm-hmm.

Brad Scoggin: Okay. A lot of the companies we talk to have maybe one or two very specific use cases. Or maybe it’s in training, but there’s several use cases within training. I think it’s very interesting that you do have this multitude of use cases, from onboarding to training to putting someone inside the space station.

Maybe talk about some, and you shared a little bit, but maybe some of the unique challenges with your process of deploying XR. And then, it sounds like you’ve had to iterate on the fly because there’s been an appetite for more. Yeah, I would love to hear about some of those challenges, even on the soft side.

You mentioned it being nice that you have people in the field that you have a relationship with that has helped grease the wheels on this thing, so yeah, maybe just talk through some of the challenges of deploying XR the way that you’re doing it.

Bart Verzijlenberg: Certainly, before ArborXR came along, a key challenge, obviously, was just having to manually update all these units. Especially given that we have controlled materials in our applications, station models and robotic arm models, we cannot just freely share those publicly. Some of the more standard tools are just not compatible with that.

A big problem, for us at least, is that a lot of our models are engineering-based. I got a Gateway model and it was three gigabytes. That’s not exactly Oculus compatible or Quest compatible. So we had to figure out along the way, how do you get these massive models into the applications, and then distribute them?
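(For readers curious what shrinking a multi-gigabyte engineering model down to something a standalone Quest can render involves, here is a minimal sketch of mesh decimation using the open-source Open3D library. The file names and the target triangle count are illustrative assumptions, not MDA’s actual pipeline or numbers.)

    import open3d as o3d

    # Hypothetical input: a heavy CAD-derived mesh exported from an engineering tool.
    mesh = o3d.io.read_triangle_mesh("gateway_module_full.obj")
    print(f"original triangles: {len(mesh.triangles)}")

    # Reduce the triangle count to something a standalone headset can comfortably render;
    # the 100,000-triangle target is an illustrative number, not a recommendation from MDA.
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=100_000)
    simplified.compute_vertex_normals()  # recompute normals so shading still looks right

    o3d.io.write_triangle_mesh("gateway_module_quest.obj", simplified)
    print(f"simplified triangles: {len(simplified.triangles)}")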

The other key challenge I think we had was not so much in delivering the applications; it was connecting them. As much as Microsoft has Mesh coming out slowly, as an example, and Oculus has a certain amount of public interaction, having multi-user experiences was not something that was really easy to do out of the box.

It took a while to figure out and set up an environment where you could easily share all these models and all these interactions virtually amongst multiple headsets. And I think that’s been one of the critical things: VR has its uses on your own, but it really unfolds when you have multiple people. You need to be able to collaborate and be together and see the same thing. That was probably a big one for us. Jessa?

Jessa Zabala: Yeah, I would agree. I think, for me, more of the issue now is because … We have so much control in-house because everything’s secure and everybody has security clearance and that kind of stuff. But some of our customers, like the Canadian Space Agency, have even more security on them because they’re a government agency. This has been a hard-fought battle: they have their own headsets, we have to securely transmit our app to them, and they have to sideload each and every one. They won’t give us access to do that. With the old version of doing this, when we had, what were we using? Oculus for Business or something like that?

Bart Verzijlenberg: Yeah.

Jessa Zabala: They wouldn’t get on board with that. It wasn’t secure enough for them. We were just doing that in-house, and because of that, we had to be able to deploy to it.

We could only deploy to the headsets that were in our lab. We couldn’t deploy to other locations and such. We still had to send the app, and it had to be sent securely. That could take all day, so any kind of rollback wasn’t possible. And each one had to be sideloaded. That’s our headsets in Houston, that’s our headsets onsite at St-Hubert, and then, that’s here. And then, it’s also everybody that has one at home.

With ArborXR, it’s actually been a battle now won with the Canadian Space Agency to get their headsets on ArborXR, because it’s so secure. And we can now distribute the app, because we still own the app, and that has been one of the issues: “Okay, well, we can’t give you access to this. We have to control the headsets.” But now that we can deploy directly to the headsets, there’s none of this issue anymore. And even doing that with future customers, we can say, “You guys can buy the headsets. We still have control over the deployment. You won’t have access to all of the code and that sort of thing anymore,” which I think has been a huge battle won for us on security. Yeah.

That’s awesome. Another story that the engineers are going to love, love, love. They may spend a lot of time working and thinking about security, and obviously, it’s a big deal. It’s a really big deal generally speaking, but especially in a use case like yours.

Another question again: with all these different use cases, and lots of custom specialty content, are you making content in-house, or where is your content coming from?

Jessa Zabala: We make content in-house. It is pretty amazing because it used to just be Bart, and then, it was just me and Bart, and now, we have a team of, I want to say 10 people, with various roles to make this happen, which is honestly still not enough with the amount of work that’s coming down the pipeline.

But yeah, we have content developers in Unity, and then we also have one software debugger, though we would love to have another software backend person. We have people who specifically focus on the models and making sure that they are XR friendly, especially for the Oculus. Then we have trainers, who are our users, making things work and giving us feedback on what needs to be changed. We have ops planners. It’s pretty great.

Bart Verzijlenberg: And I think that is probably one of the keys that we uncovered fairly early on is that there is development for VR, which, as Jessa said, was oftentimes me, especially at the beginning.

And then, there was the content generation where you’re in Unity, you drag objects around, you attach existing capabilities to them and you make an environment that you enjoy or that gets your lesson across. We set up just enough infrastructure so that the content generators did not have to be programmers. You could just make the scene the way you want.

And as a result, we’ve leveraged it for reviewing some building upgrades. One of the non-programmers went ahead and used our framework to put together a room the way he likes. He didn’t have to do any programming for it, because he just has to generate the content, and we have simple build scripts in place to get it out onto his headset. Yeah, that separation of developer versus content generation was really, really helpful.

Jessa Zabala: Mm-hmm. Yeah.
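(As a rough illustration of the kind of “simple build script” Bart describes, the sketch below, our own assumption rather than MDA’s actual tooling, drives a headless Unity batch-mode build and then sideloads the resulting APK with adb. The paths, the BuildScript.BuildQuest method name, and the output location are hypothetical placeholders.)

    import subprocess
    from pathlib import Path

    # Hypothetical paths; substitute your own Unity install and project locations.
    UNITY = "/opt/unity/Editor/Unity"                      # Unity editor binary
    PROJECT = Path("~/projects/vr-training").expanduser()  # Unity project folder
    APK = PROJECT / "Builds" / "training.apk"              # APK produced by the build method

    def build_apk() -> None:
        """Run Unity headless and invoke a static build method defined in the project."""
        subprocess.run(
            [
                UNITY,
                "-batchmode", "-quit", "-nographics",
                "-projectPath", str(PROJECT),
                "-executeMethod", "BuildScript.BuildQuest",  # hypothetical static C# method
                "-logFile", str(PROJECT / "build.log"),
            ],
            check=True,
        )

    def install_apk() -> None:
        """Sideload the freshly built APK onto whatever headset adb currently sees."""
        subprocess.run(["adb", "install", "-r", str(APK)], check=True)

    if __name__ == "__main__":
        build_apk()
        install_apk()

(The idea is that a content author only touches scenes in the Unity editor; a script like this handles the build-and-deploy step so no programming is needed on their side.)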

Brad Scoggin: That’s very interesting. Well, this has been awesome, honestly, so many cool use cases. You guys have great energy and obviously are super passionate about VR and the use cases.

As we move to a wrap, just curious, maybe, it can even just be a simple answer. You guys are pretty deep in this, again, with multiple use cases, significant use cases. What’s your dream, or where do you see XR going from here at MDA?

Jessa Zabala: For me personally, I love it when it’s the engineers who are asking for the latest build to use this for some sort of design planning or ops planning. They’re looking at the International Space Station and where we’re going to put the payload.

For me, if this is an engineering tool that most of our engineers want to use, that’s just a huge win. And then, especially with our customer, the Canadian Space Agency, I love the fact that more and more of the flight controllers are like, “Oh, yeah, we love this. For training, if this has the instructions on how to do maintenance right there, that is awesome.” For me, it’s the practical everyday use cases. That’s my goal.

Bart Verzijlenberg: I think mine might be a little bit more off the wall. My hope is that, as Gateway launches, we’re going to be in orbit around the Moon with the Gateway station and robotic operations. And one of the things I would love to see, and I think is possible, is outreach. I would love to be able to give a headset to a classroom full of students and say, “Watch. Right now, on the Moon, they’re doing this operation with the arm and you’re seeing live data on your headset.” There are some controls that need to be considered for that, but I think that would be just such an amazing use case: to be able to participate in an operation going on in that way, and stand there as the arm does something.

Jessa Zabala: Yeah. Because right now we can watch things that happened on orbit, but that’s stuff that we are doing after the fact. We’re taking the telemetry from on orbit, and then, we’re playing it back real fast in our visualizer, which is still cool to watch after the fact, and it’s great for figuring things out and seeing where things went wrong. But …

Bart Verzijlenberg: Yeah, the fun example there is that the Dragon capsule, we have mission telemetry from a Dragon capsule orbiting above the space station or hovering above the space station, coming in, and then, being grappled by SSRMS and being docked onto the station, all of it, real telemetry. That is only after the fact.

It’s certainly very humbling to stand there with a life-size shuttle coming down at you as the arm grabs it, but being able to do that live, for general public outreach, I think would be really, really cool for the technology to proceed to. And beyond that, what Jessa said: the engineers wanting to use the tool and it being valuable to their day-to-day. I think those are probably the two main objectives I would have in mind.

Brad Scoggin: Well, I think that’s just further validation of XR: when the engineers see it as a necessity to do their job, or they think they can do their job better, that’s really cool to hear. And the student thing, yeah, that sounds incredible. Talk about a great way to get the next generation of astronauts excited, for sure. Or students, as you call them. Student astronauts.

Well, we’re at time, but this has been awesome, guys. Really, really appreciate you making the time to do this, and look forward to chatting again in the future.

Bart Verzijlenberg: Awesome. Thanks for having us.

Brad Scoggin: I had to make a quick wardrobe change for those of you that are watching. We ran out of time yesterday, so we’re doing this after the fact, but that was a very fun interview. Training astronauts, I don’t know how it gets more serious than that.

Will Stackable: I know. Thanksgiving’s coming up, so I’m going to have some stories to share with my family. Usually, I’m trying to explain something about device management and they’re always scratching their heads, but astronauts, not a bad …

Brad Scoggin: Building the space station.

Will Stackable: We should say students, right? Astronauts in training.

Brad Scoggin: Yeah.

Will Stackable: I think one thing that was really impactful to me was just, I’ve said this before on other calls, but we’re seven years in to doing this. I know VR is a bit older than that, for sure. And just now, we’re starting to see these broad use cases where you have companies using VR not just for one, but multiple different applications.

They were talking about everything from pitching commercial clients and putting them in a headset to preparing people to do spacewalks, and I had never even heard of the use case of combining VR with physical robotics so that you can get better feedback in the training and testing process. To me, it’s so exciting to see that this is just part of their daily life right now. VR is integrated into their overall tech stack. It’s a core technology for them and it’s not going anywhere.

Brad Scoggin: Yeah, that’s exactly what I was going to say. I think, for many companies now, VR training is really being integrated as a part of their overall training strategy. It’s super encouraging for all of us who’ve been doing this for seven years.

As always, thank you so much for spending time with us. You can check us out wherever you get your podcasts, and we will see you next time.
