Metaverse and Privacy — How Privacy Is Taken Into Account in AR/VR Experiences
As the world becomes increasingly digital, more and more personal information is collected on users every day. When you sign up as a new member of most 2D online experiences these days (such as social networks and e-commerce websites), you are asked to provide personal information such as your name, email, phone number, address and payment information.
Such collection of information allows for easier interaction within 2D online experiences. 3D, augmented reality (AR) and virtual reality (VR) experiences require even more personal information to allow for an immersive experience and meaningful interaction with users.
For example, the data collected by AR/VR experiences includes biometric data: personal information relating to the physical, physiological or behavioral characteristics of a natural person. This data is used to create a custom experience for each unique user, such as personalized avatars. It allows for a much more individual experience, at the cost of making it possible to identify individuals and infer sensitive information about them.
While many websites today strive to ensure adequate data protection measures, the introduction of 3D experiences, including AR/VR experiences, raises new privacy considerations. In an online 3D world composed of different AR/VR apps, a user's behavior, looks, age, location, environment, health status and other sensitive personal information is accessible to a larger audience of corporations and might even be out in the open for all to see.
Your information could be misused or misappropriated for things as mundane as unsolicited marketing campaigns targeting you, or for really scary stuff like fraud, identity theft and deep fakes. It might also be used to circumvent user anonymity, and inferred or provided information could be put to potentially discriminatory use by third parties.
In order to create virtual environments that mimic reality, AR/VR applications and smart devices must gather a significant amount of potentially sensitive data from different sources. VR headsets, for example, use multiple sensors to fully capture the user's surrounding physical environment as well as their facial features. In fact, most modern headsets, if not all, cannot function without this capability.
Some of these sensors include a magnetometer, a barometer and a GPS chip, as well as recording tools. This combination of sensors and microphones allows AR/VR devices to gather detailed information on a particular person. They may detect not only a user's location, but also their motion, orientation and physical environment, as well as their eye gaze, gestures, facial expressions and other individualized information that can be used to identify a person.
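To make the point concrete, here is a minimal sketch of the kinds of readings such a headset might bundle together in a single frame. The field names and the identifiability heuristic are illustrative assumptions, not any vendor's actual telemetry schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of a per-frame sensor bundle from an AR/VR headset.
# Field names are illustrative, not a real vendor schema.
@dataclass
class HeadsetFrame:
    location: tuple        # (lat, lon) from the GPS chip
    orientation: tuple     # (yaw, pitch, roll) from magnetometer/IMU
    pressure_hpa: float    # barometer reading (rough altitude/floor level)
    eye_gaze: tuple        # (x, y) gaze target on the display
    audio_captured: bool   # whether the mic picked up speech

def is_identifying(frame: HeadsetFrame) -> bool:
    """Individually mundane streams become identifying once combined:
    location plus gaze plus voice can single out a specific person."""
    signals = [frame.location, frame.eye_gaze, frame.audio_captured]
    return sum(1 for s in signals if s) >= 2
```

The takeaway is that no single sensor is the privacy risk; it is the fusion of several low-value streams that produces an identifying profile.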
These sensors might even collect information about other individuals in a user’s surroundings and their conversations, without these individuals’ informed consent. These privacy concerns are not necessarily covered by existing best practices related to user privacy. After all, these best practices were formalized before these new technologies emerged.
Proposals to close this gap include clarifying existing privacy laws for users, reforming privacy laws such as HIPAA and COPPA, introducing laws that preemptively protect against full disclosure of biometric data, enacting comprehensive federal privacy legislation, and providing opt-out guidelines along with other standards for security and data collection practices.
Most VR/AR applications and experiences collect information that can be observed and replicated by the application itself as well as by third parties and other AR/VR platforms. A user's avatar is an example of observable data. It does not fully reveal the user's identity, but it does expose some attributes, such as physical appearance, race, gender and height. Avatars are rendered in three dimensions, truer to life, which makes the data more personal and intimate than two-dimensional figures such as Snapchat's Bitmoji.
Observable data raises privacy concerns because users cannot control how much information they reveal about themselves. It is also hard for users to determine to whom the data is revealed. For example, they might be comfortable sharing information with people they have friended within the application, but not with others. To address these concerns, AR/VR platforms are becoming more transparent about how they collect and distribute information, and are letting users control what type of information is shared and with whom.
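A per-attribute sharing control of the kind described above might look like the following sketch. The audience labels and field names are hypothetical; real platforms expose this through their own settings UIs:

```python
# Minimal sketch of per-attribute sharing controls, assuming a simple
# audience model ("everyone" / "friends" / "nobody"). Names are illustrative.
DEFAULT_POLICY = {"appearance": "everyone", "location": "friends", "age": "nobody"}

def visible_fields(profile: dict, viewer: str, policy: dict = DEFAULT_POLICY) -> dict:
    """Return only the profile fields the viewer may see.
    viewer is "friend" or "stranger"."""
    allowed = {"friend": {"everyone", "friends"}, "stranger": {"everyone"}}[viewer]
    return {k: v for k, v in profile.items() if policy.get(k, "nobody") in allowed}
```

The design point is that the default for any unlisted attribute is "nobody", so new data types are private until the user explicitly opens them up.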
Similarly, the Information Technology & Innovation Foundation (ITIF) suggests that policymakers address these concerns by identifying the different types of information these devices collect and determining appropriate safeguards to protect user privacy. Technical safeguards can protect observable information: end-to-end encryption for digital communications, for example, or screenshot blocking, which prevents third parties from capturing information without user consent.
Another privacy issue in AR/VR is deciding when to be seen and when to be private. Technologies like face recognition operate around us passively, often without our consent or knowledge, especially in AR/VR experiences. Alessandro Acquisti, a behavioral economist, explains that this issue will only get worse over time. Growing databases make it easy to match a face to an online presence and, eventually, to deeply personal data. These categories of data cannot maintain hard boundaries, making identity theft, for example, a very real threat.
AR/VR platforms also collect information provided or generated by the users, which cannot be replicated by other AR/VR platforms because it is direct input from the user to the specific platform. Observed data covers the majority of data in AR/VR. Examples include geographical location and physical surroundings. Location data is obtained through GPS and an IMU (inertial measurement unit), while capturing physical surroundings typically requires cameras on mobile devices and other sensors.
Given the depth of this information (eye tracking, for example, constantly records where the user is looking and even pupil size), users should be able to choose whether that data leaves the AR/VR experience. This can be highly delicate information that users want to keep private from certain groups of people. Like observable data, however, observed data is often outside users' control once collected. Mitigation can be tricky, since such a broad range of information is being collected and much of it is critical to the quality of the AR/VR experience. As with observable data, disclosure and clear guidelines about how the data will be used can help users make informed decisions.
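One way to implement "choose whether that data leaves the experience" is a consent gate: observed data is only exported if the user has opted in to that category. This is a sketch under assumed names (the category keys and `ConsentError` are hypothetical):

```python
# Sketch of a consent gate for observed data: a category only leaves the
# device if the user has explicitly opted in. Category names and the
# ConsentError type are illustrative assumptions.
class ConsentError(Exception):
    pass

def export_observed(payload: dict, consents: set) -> dict:
    """Return only the consented categories; refuse to export if the
    payload is non-empty but nothing was consented to."""
    shared = {k: v for k, v in payload.items() if k in consents}
    if payload and not shared:
        raise ConsentError("no consented categories to export")
    return shared
```

Filtering before export, rather than after upload, keeps unconsented data on the device in the first place.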
AR/VR platforms also compute and infer certain information about users from their observable and observed data. An example would be music or shopping recommendations generated from the users' input. Different pieces of information, such as a user's age and geographical location, can be combined to produce more useful data.
In certain cases, computed information might be less sensitive and personal than the observable or observed information actually provided by users. But at times, computed information can be even more harmful: because it is merely deduced, it can be inaccurate. Inaccurate data, like an incorrect credit score, can cause significant harm to users, affecting their access to housing, insurance, lending, benefits and other services. While a credit score may seem irrelevant to AR/VR, the same concern applies to inaccurate data such as a falsely banned device or IP address and incorrect user identifiers. Accordingly, many AR/VR companies focus on keeping computed data encrypted and secure and on allowing users to view the data currently held about them.
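A toy illustration of computed data, and of why auditability matters: the inference rule below is deliberately naive, which is exactly how deduced data ends up wrong. Storing the inputs next to the inference (a hypothetical design, not any platform's actual one) lets the user view and contest what was computed about them:

```python
# Toy example of computed data: a recommendation inferred from observed
# inputs (age, city). The rule is deliberately naive to show how
# deductions can be inaccurate; keeping the inputs alongside the output
# lets the user audit and contest the inference.
def infer_recommendation(age: int, city: str) -> dict:
    inferred = "retro games" if age >= 30 else "new releases"
    return {"inputs": {"age": age, "city": city}, "recommendation": inferred}
```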
Finally, platforms hold data that does not directly identify a person or provide personal or descriptive information about a user, such as a user ID or IP address. But when combined with other information, it can present privacy risks like phishing and hacking. This type of information becomes increasingly sensitive when users make in-app purchases and microtransactions, providing data that includes their payment information, address and full name.
Many AR/VR companies therefore provide stronger user authentication, such as combining a fingerprint with a login, which can help stop malicious actors. Extortion and blackmail attempts are growing alongside other data threats. In certain cases, users should also be notified when these different data types are being combined, so they can decide whether to consent.
Much of the information AR/VR devices collect is sensitive data that is not currently used in other consumer technology devices. As AR/VR becomes more prevalent in our lives, mitigation approaches such as user consent and disclosure, user authentication, advanced privacy settings, encrypted communications, and even new laws against violations of personal autonomy and discrimination as well as governmental use of sensitive data, will become more commonplace. It is important for users to identify the different types of data collected on them and understand the measures used to protect it.
echo3D (www.echo3D.co; Techstars '19) is a cloud platform for 3D/AR/VR that provides tools and network infrastructure to help developers & companies quickly build and deploy 3D apps, games, and content.