The use of extended reality (XR) technology has allowed faculty to give students such in-depth learning experiences, even while most instruction is currently online.
“[Extended reality brings] students inside the Barrier Reef, back in time to historical events, to Mars and to other educational scenarios that are either low-frequency, high-risk or impossible,” says Sean Hauze, Ph.D., director of instructional technology services at San Diego State University. “And now the definition of impossible has been expanded, given that the vast majority of our courses are virtual … and immersive learning is the key to make this all possible.”
To incorporate extended reality into the learning environment, several campuses have established labs dedicated to introducing and creating immersive experiences, such as San Diego State’s Virtual Immersive Teaching and Learning (VITaL) initiative, California State University, San Bernardino’s Extended Reality for Learning (xREAL) Lab and Sonoma State University’s Immersive Learning @SSU Program, which includes the campus’s VITaL lab and Immersive Learning Development Track.
What is XR?
Extended reality refers to the immersive technologies that blend real and virtual environments, including virtual reality (VR), mixed reality (MR) and augmented reality (AR). Virtual reality is a fully immersive experience that visually transports a person and often requires a headset and other wearable equipment. Augmented reality overlays digital images on a person’s actual surroundings, typically viewed through a mobile device. Mixed reality—also known as spatial computing—blends the two by placing an interactive 3D environment in the person’s real space.
“Immersing the student in an environment allows them to be more engaged in that environment and helps them learn more about what the material is that they’re looking at,” says Sara Kassis, Ph.D., Sonoma State physics professor and faculty fellow for immersive learning. “I find it’s more engaging and allows students to be focused on a particular topic.”
But to ensure the use of extended reality accomplishes the intended learning goals, the lab teams need to take a few things into consideration.
Design: To create an effective experience, Dr. Hauze explains, teams need to employ cutting-edge XR technology, create unique 3D spatial designs that take full advantage of the available technology and incorporate videos, readings or activities tied to learning objectives.
Research: Teams also need to study the efficacy of the experiences they create to gauge how well they accomplish those learning objectives. For instance, Hauze’s Ph.D. dissertation studied the effectiveness of a mixed reality nursing simulation in partnership with Texas Tech University, Microsoft and Pearson.
After comparing students who learned the content using a 3D simulation, a 2D representation and a traditional case study, the team found, “overall, it’s more engaging and motivating both in 2D and in 3D, and the 3D component was much more attention grabbing and satisfying for the students,” Hauze says.
Accessibility and Equity: Because immersive technologies are visual and at times require expensive equipment, the teams must also consider how they can create experiences that accommodate students with disabilities and are accessible on more widely owned technology.
“We want to make this potential available to everybody,” says Mihaela Popescu, Ph.D., faculty director of Cal State San Bernardino’s xREAL Lab. “… We want to particularly emphasize the social justice aspect and ask the tough questions, such as who gets excluded from these experiences, and whose voice is unheard when we introduce these kind of experiences in the classroom—who gets to benefit, and who doesn’t.”
Especially now that students are using many of the experiences at home, where they may not have sophisticated immersive technology equipment or reliable internet, faculty cannot build XR experiences that require either. Some alternatives the labs are exploring include device-agnostic experiences that run on a smartphone or computer, augmented reality applications and immersive experiences designed for Google Cardboard—an inexpensive virtual reality viewer used with a smartphone.
To increase accessibility, Dr. Kassis and her multidisciplinary student teams developing XR experiences ensure each virtual experience has both audio and closed captioning, uses high-contrast colors and clear fonts, and limits the range of motion required so that a person in a wheelchair can participate. In addition, they’ve translated one of their in-house experiences into Spanish.
XR for the Classroom
Before the onset of COVID-19, the campuses’ XR labs produced highly immersive virtual experiences for students and provided headsets on campus with which to experience them. Some of the fully immersive experiences created by CSU campuses include SDSU VITaL’s Galaxy Gazer virtual reality simulation, used to teach the concept of parallax in Astronomy 101 courses, and the xREAL Lab’s Ambrosia Project, a multi-user immersive experience in which students act as part of an archeological team surveying the mythical island of Ambrosia.
Sonoma State’s VITaL Lab also made use of third-party VR apps in its permanent space within the library, which was equipped with several headsets. Students could sign up for time at a station and choose from a list of apps to use either for class assignments or for fun.
“It allows any student from any discipline to use the XR tools that we have to be able to learn more about the educational material they’re learning in class,” Kassis says.
Employing a mixed reality headset, Kassis also created her own electric circuits app for her physics students who were having trouble visualizing the electric current flow and building circuits on a breadboard. Using the headset, students saw 3D images of current flows demonstrating their mechanics and could practice building a circuit before trying it in real life. “The hope was that they could enhance their experience and remember the essential information,” Kassis explains.
Similarly, SDSU’s VITaL team developed the mixed reality nursing simulation on which Hauze conducted his research. Using 125 cameras, the team captured a holographic image of a patient actor demonstrating the effects of anaphylaxis, a severe allergic reaction to medication. While wearing a headset, students would be in a physical hospital room but interacting with a 3D avatar of the patient actor.
“[It’s] being able to experience performance scenarios [like this one] that should never happen in the real world, because you should recognize the signs of anaphylaxis early on and introduce the antidote,” Hauze explains. “But in the case of simulation, you can experience that all the way until the patient dies, just to know what that looks like, and experience that over and over again so it’s really easy to react to in the real world.”
XR for Online Learning
During the current pandemic, though, the labs pivoted their focus to XR experiences that could augment online learning at home, such as virtual tours, simulations and 3D models.
For example, SDSU biology professor Sandra Garver began working with VITaL before COVID-19 drove the CSU online to create 3D models of bones and a sheep heart for an online anatomy lab, because a formaldehyde sensitivity prevents her from teaching labs in person. Using a method called photogrammetry, she took hundreds of photos of each item in her anatomy collection and rendered 3D models on her computer.
With the help of VITaL—and its student employees, professional cameras and advanced computer programs—she’s building an extensive online library of these virtual models, both with and without her custom labels, that students can access for free.
“For students to be able to look at the various parts and really get a feel for what they’re looking at, they need to see it in a three-dimensional format,” she says. “I’ve always been exceptionally strong about the idea that anatomy needs to be taught where you can hold it and turn it and spin it. I always thought of this more as a supplement until we hit this COVID-19 situation, and then it became just so much more relevant.”
Similarly, Popescu’s team developed an open source augmented reality application to which faculty members can upload their own images and content. It was first used as a geology study tool that displayed a 3D image of a rock when students pointed their device at a surface. “You can look at rocks online, but isn’t it better to look at them on the table?” Popescu questions. “You can manipulate them … You can look at it however way you want. And you learn differently, because it’s being brought to you in a space you’re familiar with.”
In addition, her team collaborated with Professor of Art and Master of Fine Arts (MFA) Graduate Coordinator Alison Petty Ragguette to virtually recreate an on-campus art gallery in a mobile app. It will showcase scans of MFA students’ thesis projects since their exhibition was canceled this year. “It is a wonderful opportunity for students to be on the cutting edge of VR exhibitions and be able to have firsthand experience designing these VR spaces,” Petty Ragguette says.
Lastly, Kassis is using a free third-party electric field app that students can use in simulation mode or with Google Cardboard. Using a device’s camera, students see their space overlaid with a 3D representation of an electric field they can walk around and observe from different angles.
“At this time a year ago, we said, ‘In five years we’ll look back and wonder how we ever taught without these immersive learning tools,’” says James Frazee, Ph.D., SDSU chief academic technology officer. “Due to COVID acting as an accelerator for emerging technologies that promote active learning, we’re wondering if it’s possible to teach without them today.”
See other ways CSU faculty are using extended reality at California State University Channel Islands, California State University, Monterey Bay, San Diego State University, California Polytechnic State University, San Luis Obispo and California State University San Marcos.