AI in VR: Uses and concerns

Artificial intelligence is now integrated into most of the applications and platforms we interact with in everyday life. It can be user-facing, such as a virtual assistant, or it can operate behind the scenes, collecting and analysing data to produce predictions and profiles about us that are then used to create personalised experiences. Some uses of AI in VR are:

Design and user experience: AI tools can facilitate the efficient design and generation of virtual environments and related content. Examples include text-to-3D generative AI and 3D data visualisation (stepping inside the data). AI tools can produce digital twins, replicas of real-world objects or spaces that users can interact with in real time, individually or with others. AI is also used for object recognition and tracking that adapts, in real time, to user action, which makes a virtual experience feel immediate. Natural language processing is being integrated so that users can use speech for interaction, navigation and translation, and so that non-player (computer-generated, synthetic) characters can communicate more capably in virtual environments. All of this makes the experience of virtual reality feel more natural and engaging. And AI’s promise of personalisation could translate into better assistive and adaptive technology integrated into spatial computing products.

Profiling and predictive analytics: Machine learning algorithms collect and analyse data about VR users, often in real time, to produce analytics. These can be used to personalise or customise the experience of VR; for example, by offering the user certain content, options for customising avatars, or particular non-player character interactions. Individual and social behavioural data related to action, eye-gaze attention, proxemics and so on can also be captured in VR simulations designed to understand and model how humans might react under certain conditions.
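
To make the idea of behavioural analytics a little more concrete, here is a minimal, hypothetical sketch of how gaze-dwell data might be turned into content recommendations. The event format, threshold and function names are invented for illustration and do not come from any real VR platform or SDK.

```python
# Illustrative sketch only: a hypothetical example of gaze-dwell analytics
# driving content personalisation. All names and thresholds are invented.
from collections import defaultdict

def recommend_from_gaze(gaze_events, dwell_threshold_s=2.0, top_n=3):
    """gaze_events: list of (object_category, dwell_seconds) tuples."""
    dwell_by_category = defaultdict(float)
    for category, dwell in gaze_events:
        dwell_by_category[category] += dwell
    # Rank the categories the user looked at longest and surface related content first.
    ranked = sorted(dwell_by_category.items(), key=lambda kv: kv[1], reverse=True)
    return [cat for cat, total in ranked[:top_n] if total >= dwell_threshold_s]

# Example: a user who lingered on sports objects gets sports content promoted.
print(recommend_from_gaze([("sports", 5.2), ("art", 1.1), ("sports", 3.0), ("travel", 2.4)]))
```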

The combination of contemporary VR hardware and software enables machine learning algorithms to collect a great deal of personal information about a user. The Information Technology and Innovation Foundation has characterised this data as:

  • Observable: information about an individual that AR/VR technologies as well as other third parties can both observe and replicate, such as digital media the individual produces or their digital communications;
  • Observed: information an individual provides or generates, which third parties can observe but not replicate, such as biographical information or location data;
  • Computed: new information AR/VR technologies infer by manipulating observable and observed data, such as biometric identification or advertising profiles; and
  • Associated: information that, on its own, does not provide descriptive details about an individual, such as a username or IP address.

A significant amount of biometric data, or data of the body, is collected in and by VR, including vocalisations, height, movement of the head, body, limbs and hands, facial expression, eye gaze and pupil dilation. ‘Pass-through’ cameras on headsets are also capable of capturing information about a user’s environment. Research indicates that biometric data can be highly identifiable, and this poses privacy and security concerns, especially where children and young people are concerned. Tricomi et al. (2023) explain:

“Currently, there is an ongoing discussion on the potential protocols that will govern the Metaverse, with a particular focus on the controversial interplay between openness and privacy… (V)irtual devices allow tracking a large number of behavioral metrics, such as the headset’s and controllers’ position and rotation (which reflect the users’ physical actions), all the interactions between the user and any virtual object present in the scene, and also eye movements. All these data can be source of personal information, and even the user’s identity.”
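
To illustrate the kind of profiling Tricomi et al. describe, here is a deliberately simplified, hypothetical sketch of matching a VR session to a known user from motion telemetry. The features, data and matching approach are invented for illustration; real studies use far richer behavioural signals and proper machine learning models.

```python
# Illustrative sketch only: a toy example of identifying a user from VR motion
# telemetry. All data, feature names and the matching approach are invented.
import math

def motion_features(samples):
    """samples: list of (headset_height_m, controller_speed_m_s) readings."""
    n = len(samples)
    avg_height = sum(h for h, _ in samples) / n
    avg_speed = sum(s for _, s in samples) / n
    return (avg_height, avg_speed)

def identify(session_samples, enrolled_profiles):
    """Match a session's motion signature to the closest previously observed user."""
    sig = motion_features(session_samples)
    return min(enrolled_profiles.items(), key=lambda item: math.dist(sig, item[1]))[0]

# Example: two previously observed users and a new, nominally "anonymous" session.
profiles = {"user_a": (1.72, 0.8), "user_b": (1.55, 1.4)}
session = [(1.71, 0.7), (1.73, 0.9), (1.70, 0.85)]
print(identify(session, profiles))  # the session's motion signature points to "user_a"
```

The point of the sketch is that even coarse averages of head height and hand movement can act like a fingerprint; the richer the telemetry, the stronger the identification.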

Some jurisdictions have strong laws governing the harvesting and use of personal information, including biometric data, while others do not. Under the Australian Privacy Act 1988, biometrics are considered sensitive information and consent is required for their collection. The Act has been under review for several years, and the hope is that the review will produce stronger protections around the collection of such data. Internationally, there are several interesting policies and guidelines on biometrics in schools, including guidance specific to children.

It is worth closely examining the terms and conditions and disclosure statements of VR companies, especially those related to children, such as Meta’s Parent Privacy Disclosure statement, which covers data collection related to the following:

  • “With your approval, information about the size of walls, surfaces and objects in your child’s room and the distances between them and your child’s headset to offer experiences that blend their virtual and real-world environments
  • Personal information necessary to provide services so that the device and any features that you or your child turn on function optimally
  • Physical information about or related to your child, such as estimated hand size and hand pose data, if you choose to enable the hand tracking feature
  • Information about or related to the position and orientation of the headset, controllers and body movements to determine body pose, make your child’s avatar’s movements more realistic and deliver an immersive virtual experience
  • Information about or related to your child’s fitness activities in virtual reality if you choose to enable fitness-related experiences such as Meta Quest Move.”

All such data collection in VR is powered by AI. There is a significant amount of work going on internationally to ensure that the human rights of children are protected in the digital realm, and this includes VR and other immersive technologies such as augmented and mixed reality. With companies lowering the minimum user age for VR to 10 years old, now is the time for a more robust and critical conversation on the ethical use of this web of technologies in schools, and in society more broadly. As educators we cannot delay action on this; in fact, we must lead the conversation.

References

Tricomi, P. P., Nenna, F., Pajola, L., Conti, M., & Gamberini, L. (2023). You can’t hide behind your headset: User profiling in augmented and virtual reality. IEEE Access, 11, 9859-9875.

This post brought to you by a real human, A/Prof Erica Southgate
