The alphabet soup of AR, MR and VR

Dr Rick Skarbez, a computer scientist from La Trobe University, recently co-wrote an opinion piece entitled It Is Time to Let Go of ‘Virtual Reality’, published in Communications of the ACM. It’s a timely provocation that argues that virtual reality (VR) and augmented reality (AR) are subsets of mixed reality (MR). I realise this is a lot of acronyms, but it is an argument that deserves unpacking, especially in light of advances in VR hardware that allow virtual objects (holograms, 3D models) to be placed and interacted with in the real-life environment of the person using the equipment.

Of course, there have already been MR headsets, including HoloLens, Magic Leap and the forthcoming Apple Vision Pro, which fit within the mixed reality category even though they are sometimes referred to as AR. All this is very confusing because ideas about MR have morphed over time. For example, I’ve read papers describing MR experiments that add real-life objects (tangibles) to virtual reality environments. In one such set-up, the VR system associates an ordinary box that a user may pick up with a 3D model of a house that the user can then interact with in a fully realised (synthetic) VR environment.

There have also been other ways to explain the difference between AR, VR and MR, like this infographic:
Infographic: the differences between AR, MR and VR.

The above graphic is adapted from https://www.mobileappdaily.com/2018/09/13/difference-between-ar-mr-and-vr

Or this more detailed explanation:

Infographic: a more detailed comparison of AR, VR and MR.

Some headsets now have passthrough cameras which record and track, in real time, the user’s environment and, like a live stream, render it in front of the user’s eyes in the headset. This means that virtual 3D objects can be seemingly overlaid on the user’s real environment, allowing them to interact with those objects, increasingly through hand gestures and voice commands, although hand-held controllers are still commonly used. This might allow you to put a virtual alpaca in what looks like your lounge room and, depending on the sophistication of the virtual beastie’s programming, interact with it in a playful manner. This is mixed reality because it blends the real (your lounge room) with the virtual (the synthetic alpaca) that is programmed to respond to you and your space. Another example is the use of MR headsets such as HoloLens to teleoperate robots with varying levels of autonomy through gestural control. The mixed reality headset manufacturer Magic Leap offers this useful overview of the difference between AR, VR and MR in terms of user experience:

 

Traditionally, many researchers have used the reality-virtuality continuum from Milgram and Kishino’s (1994) taxonomy of mixed reality visual displays (diagram below).

Diagram: the reality-virtuality continuum (Milgram & Kishino, 1994).

Some university and industry commentators have introduced eXtended Reality (XR) as an umbrella term for AR, VR and MR. The term “immersive” is also used to encompass these technologies, as in immersive education. Others, such as Apple, have gone back to the technical language of spatial computing.

For the non-technical person, it continues to be a confusing terminology mess. The melding of AR/MR capability (via passthrough cameras) into what are conventionally thought of as VR headsets has prompted a rethink.

To return to Skarbez et al. (2023): they argue that the term mixed reality should be used as an “organizing and unifying concept… (to) harmonize discordant voices” in the ongoing “terminology wars” of the field (p. 41). They have previously written an article that reworks Milgram and Kishino’s (1994) reality-virtuality continuum. In these writings, they suggest that all technology-mediated realities are mixed reality because it is now common to have environments “in which real-world and virtual-world objects and stimuli are presented together… as a user simultaneously perceive[s] both real and virtual content” (Skarbez et al., 2023, p. 42). Their articles present both potential positive and negative outcomes of moving to a mixed reality umbrella, and there is certainly a need to consider what may be lost without the precision of distinguishing between types of technologies.

From an educational perspective, AR, VR and MR have some similar but also quite different learning affordances (properties that can enable educational experiences). Likewise, there are common and unique ethical considerations now that AI is integrated into immersive technologies.

I will be having a videoed conversation with Rick Skarbez on his ideas about the future of immersive technologies and their terminology in January 2024 and posting it to the VR School website, so stay tuned.

 

This post brought to you by A/Prof Erica Southgate, who is an alphabet soup kind of person and simultaneously uses the terms VR, MR, XR, metaverse and immersive technologies.

 

References

Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12), 1321-1329.

Skarbez, R., Smith, M., & Whitton, M. C. (2021). Revisiting Milgram and Kishino’s reality-virtuality continuum. Frontiers in Virtual Reality, 2, 647997.

Skarbez, R., Smith, M., & Whitton, M. (2023). It Is Time to Let Go of ‘Virtual Reality’. Communications of the ACM, 66(10), 41-43.

 

Cover image from Art with Mrs Filmore, 1st Grade– “Mixed media alphabet soup!” https://www.artwithmrsfilmore.com/tag/alphabet-soup-art-lesson/

Designing a 360° VR boating safety resource for children

Associate Professor Erica Southgate has partnered with 360° company VRTY and NSW Maritime, the government agency responsible for marine safety and regulation, to develop a unique learning resource on boating safety for primary school aged students.

NSW Maritime currently undertake education programs on boating safety in schools and are now investing in the development of this digital learning resource to complement face-to-face delivery and promote mobile learning opportunities for students.

The 360° learning resource will cover areas such as the importance of life jackets, essential safety equipment on boats, and on-water awareness. It will include a child-centred narrative and interactive and informative multi-media pop-ups. Some of these pop-ups will be ‘easter eggs’, or fun findable content embedded in the resource for discovery learning. The learning affordances or special properties of 360° media are well suited to a learning resource on boating safety, as students can be virtually transported out to a waterway to get a real feeling of what it is like to travel in a vessel while maintaining awareness of safety. They can learn and practise this awareness in the security of the classroom or home. The project includes a participatory component where children will provide feedback on the design of the resource.

So far, we have brainstormed and storyboarded the resource with NSW Maritime staff, and we have just returned from capturing 360° footage, as the video illustrates. Throughout the rest of 2022, we will keep you updated on our progress on the 360° boating safety resource. Stay tuned.

Students co-creating safety guidelines for VR

Throughout 2022, we are focused on students as educational VR content creators. This includes students taking an active role in designing their own guidelines for safely using VR equipment. A visit to Trinity College at the start of their VR project saw Steve Grant, Director of Innovation and Creativity, facilitate a brainstorming session with Year 7 students where they worked together to come up with safety guidance for their project. In addition, students also worked as a whole class to develop ideas about good design in VR. At Southern Montessori School, teacher Toni Maddock led her middle school class through a similar co-design process. This video provides a great insight into the start of the project at Southern Montessori with students working together to develop their own safety instructions. As these teachers demonstrate, facilitating powerful VR learning experiences involves empowering students from the very first lesson.

The metaverse

Ever since Facebook announced its vision for the metaverse on 28 October 2021, including the company’s name change to Meta, there has been a buzz about what it might mean for the future of the internet and our digital (and real) lives.

Of course, this announcement was set against recent warnings from a reputable whistle-blower about the harm the social media company is doing, including to children and young people, through algorithms that shape user beliefs and behaviour and through inadequate moderation of harmful content.

This blog post unpacks the idea of the metaverse, taking into account Facebook’s vision but also extending beyond it, to understand its history and highlight some implications for teachers.

Where does the term metaverse come from?

English teachers – You Are Up!

The term metaverse was coined by Neal Stephenson in his 1992 cyberpunk novel Snow Crash, where it referred to a computer-generated universe.

Snow Crash book cover.

Snow Crash is a rollicking sci-fi read that has fired up the imagination of those interested in possible technology futures with its fascinating portrayal of the persistent immersive 3D digital world of the metaverse, which can be jacked into through a personal headset or through public booths that produce a lower-grade, glitchy avatar. In fact, the novel popularised the word avatar. It also highlighted the dangers of corporate and government control of knowledge and its infrastructures, dreamt up a devastating hybrid DNA and digital virus, and featured deadly semi-autonomous weapons called ‘rat things’.

An aside: for an earlier version of the metaverse, this one called the ‘matrix’, see William Gibson’s (1984) Neuromancer, a dazzling tale about a VR universe inhabited by mastermind AIs that influenced the Matrix film trilogy (soon to be a quadrilogy).

What will the metaverse be?

The idea of the metaverse extends beyond Facebook’s (proprietary?) influence and has been described as a spatialised, interoperable version of the internet. At the moment, no one really knows what the metaverse might be like, although current smart glasses, persistent VR spaces and gaming sites provide a window into its social, commercial, communication and creative aspects. Users will probably connect with the persistent interfaces, spaces and layers of the metaverse using a VR headset or smart glasses or on a screen (or with some type of yet-to-be-invented hardware that can integrate aspects of these). There is also a future vision of, and investment in research into, direct human brain-computer interfaces. The metaverse will be populated with people in avatar form and by AI-powered virtual characters in human and other forms.

Here is a description of what the metaverse might be:

“The metaverse is the idea of a shared digital universe in the cloud created by merging virtual spaces that are physically persistent together with augmented reality (AR) layered over the real world. The metaverse is singular because the concept includes the sum of all virtual and online worlds along with all AR layers enhancing the physical world… Besides games and hangouts, it will include social media platforms, workplace tools, investing resources, online shops and much more. You’ll be able to immerse yourself completely in this spatial internet using virtual reality (VR) technology or just interact with bits of it that are layered over your physical space via AR. Instead of a profile picture, you’ll be represented by a complete digital avatar or persona. You’ll be able to meet up with your friends’ digital personas and wander around visiting virtual places and attending virtual events.” https://history-computer.com/metaverse-the-complete-guide/

For those interested in how Facebook’s metaverse might be designed in stages see this excellent article from Avi Bar-Zeev, veteran developer of and commentator on all things eXtended Reality (XR).

What does the metaverse mean for teachers and students?

1. Be curious but don’t believe the hype: There is a fair bit of publicity around the metaverse, and this will infiltrate the EdTech space – just remember that the metaverse isn’t here yet (at least in a scaled-up interoperable way), and some suggest it may never arrive. So, it’s good to be intrigued without buying into the hype.

2. Keep up with current research on immersive learning: We are still in the early days of building the evidence base for the effectiveness of immersive technologies for learning using headset-mediated VR and augmented reality experienced through glasses or via screen, especially in schools.  Results are promising but ongoing rigorous research is needed so that we can confidently embed immersive learning into school classrooms in ways that make pedagogical sense and align with curriculum across subject areas. Asking questions about the evidence base and keeping up with the research on immersive learning is vital as knowledge about this will allow us to ask the right educational questions as the metaverse evolves.  

3. Get interested in the (dry) but important areas of privacy law, digital legislation and regulation, and AI ethics: The idea of the metaverse only amplifies existing concerns regarding the automated harvesting, sharing and use of data without user consent, including biometric data which is about and of the user’s body (facial recognition, pupil dilation, gaze and movement tracking etc.) and which can be highly identifying. There are many different forms of biometric data and plenty of biometric harvesting tools available, and so we need to watch this space carefully. Automated nudging of behaviour and the affective moods of users will be diffused through the metaverse, as current visions see it as a place to advertise and sell products to us as well as collect our personal data in ways which will be highly embodied and emotional. The inclusion of cameras in smart glasses and VR headsets adds another layer of complexity to the maintenance of privacy. The Internet of Things will seamlessly fuse with the Internet of Bodies, creating legal, ethical and social dilemmas for all of us, personally and professionally. Children and young people will be differently impacted at each stage of their physical, cognitive, moral, and social development. The teaching profession needs to ask who will regulate the metaverse, define its standards, and build and control its infrastructure and content, as this should inform decision making on procurement of technology for schools. No teacher wants to bring unethical technology into the classroom, and so we need to start understanding and applying ethical frameworks now and into the future as the metaverse merges with aspects of our everyday lives in work, leisure and learning.

4. Empower children and young people to have a say in what the metaverse should be: Look for places in the curriculum where students can investigate and use the technologies related to the metaverse as well as explore public and industry discourse about its ethical and social implications. Such opportunities should expand the boundaries of digital literacy education to take in civics and citizenship, the environmental impacts of technology, ideas about human-machine relationships, and re-formed conceptions of learning, creativity and identity in the new machine age. Some industry doyens, such as the CEO of the child-targeted Roblox gaming platform which has 42 million daily user logins, suggest that children are already in a proto-metaverse and that one day such platforms will be pivotal to a metaverse, providing everything from learning and shopping to business communication tools. Schooling systems rarely recognise the digital leisure life of children and youth, and yet industry is watching and factoring this into its plans for the metaverse. It is important that we as educators facilitate children’s critical engagement and agency in this space so that they are not viewed just as consumers or as data points. The voices and visions of children and young people should be integral to shaping a metaverse which upholds human rights, including the rights of the child.

This post brought to you by A/Prof Erica Southgate, who is looking forward to having a snazzy Star Trek Borg avatar in the metaverse.

P.S. For those interested, here is the full Facebook Meta announcement.

Snow Crash novel cover featured in this post is from https://www.amazon.com/Snow-Crash-Neal-Stephenson/dp/0553380958

Teachers reflect on 360° VR for language learning

This post reports on Athelstone School teachers’ views on using VRTY, a 360° content creation platform, for learning Italian with primary (elementary) school students. To catch up on the research, go here and here.

Language teachers Jo Romeo and Angelica Cardone provided extensive reflections in video and written form throughout the study. They noted that most students were engaged in the learning task of creating their virtual tour of Italy and incorporating the mandated Italian directional language and greetings. Teachers were particularly pleased to see less technologically confident students gain skills by collaborating with their peers, either in pairs creating one virtual world or through peer-to-peer interaction more generally.

Teachers’ written reflections suggested that throughout the unit of work students were developing the Deeper Learning capacities of effective communication and problem solving through self-directed learning, and an academic mindset featuring persistence when confronted with a range of difficulties:

“(The project) has enabled aspects of learning as they (students) have designed and created their own (virtual) worlds without too much teacher input. They have explored the platform on their own and used it to showcase their language and IT skills. Students did their own research on well-known landmarks as well as using their prior knowledge to include in their VR worlds. This has enabled them to learn factual historical information about different Italian landmarks and has also improved their vocabulary on directional language.… Students enjoyed recording their voices for the sound markers (that were embedded in the 360° scenes) and some students also researched how to pronounce particular words. They became independent workers as most of the time they problem solved on their own trying different strategies to see if they worked or didn’t. This displayed determination and commitment to successfully complete their (virtual worlds).”

Throughout the research, teachers learnt about the potential of immersive storytelling for language learning and students learnt about this too, guided by a mix of instructional strategies and creative processes. Instructional strategies included explicit teaching, scaffolding of student independent research and student production of different types of interactive media in Italian and English to be embedded in the scenes of their virtual Italian tour. After students had created several interconnected 360° scenes, teachers encouraged them to make audio files of themselves (sometimes with peers) orally using the directional language central to the curriculum. These voice recordings were then embedded in appropriate places in 360° scenes along with other media students had sourced or created such as photo and text information pop-ups providing historical or cultural facts related to the scene.

Students exhibited joy when experiencing their 360° creations through a VR headset, as the teachers explain:

“Most students reacted (to the immersive experience) with expressions such as ‘This is amazing’, ‘This is so cool!’. They were actually able to experience firsthand by being engaged in their virtual world. … (T)hey were able to interact more with the world they created using the headsets because for them it felt like they were in Italy and experiencing the tour around Italy rather than just seeing it on the screen.”

“The students were excited and eager to view their worlds in VR using the headsets. It was fantastic to see their enthusiasm and wonder at being able to view what they had created on a screen using the VRTY platform into what felt like ‘real life’.”

Longitudinal, deep teacher reflection is a key source of data for the VR School Study. Teacher reflections over time provide important insights into growth in teacher professional learning, student learning, and the success of different pedagogical strategies and curriculum planning approaches when using VR in real classrooms.

Cover picture: Our last real-life team selfie before the Covid pandemic hit – Front: A/Prof Erica Southgate; Rear (Left to Right): Athelstone School language teachers Angelica Cardone and Jo Romeo, and Principal (and language teacher) Gyllian Godfrey. The study was funded through the South Australian Department for Education Innovative Language Program Grant.

Training children in 360° content creation

An essential part of scaffolding digital learning when using emerging technology in schools is the provision of developmentally appropriate training on using platforms to meet learning objectives. While there is a lot of talk about generations Y and Z being digital natives, there is great variability in the capability of children and young people in using digital tools for learning, especially when it comes to creating rather than consuming products.

Throughout the Athelstone School project we have thought carefully about training and supporting primary school aged students (11–12 years) in using the 360° VRTY platform for content creation. In 2019 we did a pilot study using VRTY with Year 5 students, which helped us hone the training approach. In this phase of the study, student training was conducted via teleconference and lasted 40 minutes. VRTY personnel delivered the training, while the teachers and researcher were on hand to assist. This initial training involved a general introduction to using the platform to create virtual worlds in screen mode. We used a ‘sticky note’ exercise to evaluate the training, in which students wrote their comments about the training on a post-it note so that we could gauge the class’s experience. This exercise revealed that most students enjoyed the training but that some found it challenging, as the examples below show.

Some student feedback from the first training exercise.

In 2020, we expanded the training and support approach to include an additional teleconference session on how to save and share virtual content with others in screen and immersive modes. VRTY designed a special handbook for students on this step-by-step process. The handbook was printed out and put on each desk for easy referral. It supplemented the in-platform tutorials and information, providing an option for students who might prefer more conventional reference material to support learning. This in-class training was undertaken via teleconference, a format we already had practice with before COVID restrictions made conducting such sessions a necessity.

Training in action from the student perspective.

One of the learning objectives for the unit of work was that students could use the on-desk training handbook effectively for assistance to trouble-shoot issues as they arose. The evaluation indicated that all students met this learning objective.

Our experience shows that primary school students may need different training and resource approaches to build confidence and scaffold them towards competence in using 360° content creation tools. Our training response included in-platform instructions and tutorials with a back-up paper-based manual available on student desks. Once confidence was developed, students played and learnt through this process too. Multi-pronged training approaches, coupled with practice and play, make perfect.

Training in progress 21st century style.

This post brought to you by A/Prof Erica Southgate, the VRTY team (Kingston Lee-Young and Sarah Lee) and the teachers of Athelstone School.

Conceptions of VR + signature pedagogies = learning fit

In my recent book, I provide some explanatory frameworks on the pedagogical uses of VR. While much of the public discourse centres on technical differences between types of VR (i.e. the difference between 3 degrees of freedom [3DOF] and 6DOF) or whether 360° technology is ‘real’ VR, as an educator I think it is more important to focus on the pedagogical utility of the technology. One way of making pedagogical sense of VR is to conceptualise its different possibilities for learning with explicit connection to the signature pedagogies of disciplines (or school subjects derived from disciplines).

The diagram below (developed for the book) illustrates some key conceptions of VR for learning. VR applications can reflect one or more of these concepts.

When teachers are considering VR, they should explore the learning experiences an application offers and how these might fit with the range of instructional strategies commonly used in specific subjects. For example, if you were teaching history you might ask if the software offers a means of transporting students to another place or time, because this would fit well with the instructional repertoire usually deployed in the subject area. A core instructional strategy used in a subject is called a ‘signature pedagogy’ (Shulman, 2005). Signature pedagogies are important because they:

implicitly define what counts as knowledge in a field and how things become known…. They define the functions of expertise in a field. (Shulman, 2005, p. 56)

In the case of sparking the imagination through a historical re-creation experience (re-creation being a signature pedagogy of the discipline of history), a time-travel experience would traditionally be facilitated through the instructional use of text, maps or video. Choosing a time-travel VR experience for history makes good pedagogical sense because it leverages or extends the signature pedagogy of that particular discipline. Relatedly, this is why VR resonates with the types of place-based pedagogy used in subjects such as geography, or in professional training simulations. The technology can take the learner elsewhere, and its spatial affordances (properties) fit with the signature pedagogy of geography, the field trip, and with professions where situated learning in workplaces (placements) is key, such as clinical health or teacher education.

Let’s look at another example using the diagram. In order to teach science, an educator might want to provide students with the opportunity to conduct experiments that are too complex or dangerous for a school laboratory (experimentation in labs being a signature pedagogy of the discipline of science). The teacher would therefore investigate whether a total learning environment in the form of a virtual laboratory was available so that experiments could be safely simulated.

A performing arts teacher might find that a virtual studio would be a great addition to the actual studio of the drama classroom because it offers a range of tools for students to design sets and costumes. VR design studios allow for ease of prototyping (a click of the controller creates, erases or changes elements) at actual scale and let students easily share design ideas for rapid feedback from the teacher and peers (the book has a case study on how a real teacher did this in a rural school). In this case, the virtual environment offers tools that support the signature pedagogy of drama teaching, which involves facilitating creative processes through improvisation and iteration.

Finally, some VR applications enable student content creation – this might be through coding (using game engines such as Unreal and Unity for example) or with more accessible ‘no code create’ drag-and-drop software. In this pedagogical conception of VR, students use the technology as a form of immersive media that can tell a learning story. Students create their own worlds and tell their own stories to demonstrate mastery of learning outcomes and to communicate with, and teach, others.

This pedagogical conception of VR as media informs our latest research on using 360° content creation for second language learning at Athelstone primary school. The 360° platform, VRTY, offers ‘no code create’ opportunities for primary school students to make their own ‘surround’ worlds that act as a foundation into which other media can be embedded (including gaze-activated pop-up text, sound files, photos, videos, gifs and animations). Students are required to demonstrate that they meet learning outcomes, such as oral or written mastery of Italian vocabulary, by creating a 360° world enriched with other digital content they have created. Students can link 360° environments together through gaze-activated portals. The many layers of media content creation entail students planning, experimenting, designing and evaluating the story they want to tell in their virtual worlds. They then share their creations with peers and the teacher for authentic feedback. They are making media-rich narratives to educate others about Italian language and culture while demonstrating content mastery.

One of our key research questions involves understanding how language teachers can leverage their signature pedagogies to take advantage of the learning affordances of 360° media creation in ways that enhance student engagement and learning. Concentrating on the instructional utility of VR in direct relation to the distinctive pedagogies of the subject being taught – its signature pedagogies – will yield theoretically rich and salient insights for teaching and curriculum design. You are invited to follow our adventure. Stay tuned.

Brought to you by A/Prof Erica Southgate on behalf of the Athelstone School VR School Team.

References

Shulman, L. S. (2005). Signature pedagogies in the professions. Daedalus, 134(3), 52-59.

Southgate, E. (2020). Virtual reality in curriculum and pedagogy: Evidence from secondary classrooms. Routledge.

Virtual Reality for Deeper Learning

How can we expand our understanding of learning in/through virtual reality in ways that move beyond training scenarios or simple ‘facts and figures’ knowledge acquisition?

In our latest paper we take a deep dive into how VR can help students develop elusive 21st century thinking skills. We apply the Deeper Learning framework and the Revised Bloom’s Taxonomy (featured image above) to explore student collaborative and higher order thinking.

 
