US-20260127821-A1 - PROVIDING AWARENESS OF PRIVACY-RELATED ACTIVITIES IN VIRTUAL AND REAL-WORLD ENVIRONMENTS
Abstract
One embodiment of the present invention sets forth a technique for providing awareness of privacy-related activities. The technique includes determining a privacy level associated with a user of an extended reality environment. The technique also includes presenting, using an internal display of a headset, one or more internal indicators identifying a location of a bystander located in a real-world environment, wherein a level of detail of each internal indicator is based on the privacy level associated with the user. The technique further includes presenting, using an external display, one or more external indicators that include a monitoring indicator representing the real-world environment being captured by the headset and presented to the user via the headset, and further include a user activity indicator representing one or more activities of the user, wherein a level of detail of the user activity indicator is based on the privacy level associated with the user.
Inventors
- Youngwook Do
- Fraser Anderson
- Frederik Brudy
- George William Fitzmaurice
Assignees
- AUTODESK, INC.
Dates
- Publication Date
- 2026-05-07
- Application Date
- 2025-12-29
Claims (20)
- 1. A computer-implemented method for providing awareness of privacy-related activities, the method comprising: determining a privacy level associated with a user of an extended reality environment; and presenting, using an internal display of a headset, one or more internal indicators identifying a location of a bystander located in a real-world environment, wherein a level of detail of each internal indicator included in the one or more internal indicators is based on the privacy level associated with the user.
- 2. The computer-implemented method of claim 1, further comprising displaying, using the internal display of the headset, a privacy level indicator that is based on the privacy level associated with the user.
- 3. The computer-implemented method of claim 2, further comprising displaying the privacy level indicator in the real-world environment using a light emitting diode (LED), wherein the privacy level indicator has a color that corresponds to the privacy level.
- 4. The computer-implemented method of claim 1, wherein the privacy level is determined based on user input.
- 5. The computer-implemented method of claim 1, wherein the privacy level is determined based on a task being performed by the user in the extended reality environment.
- 6. The computer-implemented method of claim 1, wherein the one or more internal indicators are determined based on the privacy level.
- 7. The computer-implemented method of claim 6, wherein the one or more internal indicators are determined based on the privacy level using a user-specified indicator configuration that associates the privacy level with the one or more internal indicators.
- 8. The computer-implemented method of claim 1, wherein the one or more internal indicators comprise a side location indicator, wherein a location of the side location indicator on the internal display indicates a location of the bystander relative to the user.
- 9. The computer-implemented method of claim 8, wherein the side location indicator is displayed in response to determining that the privacy level associated with the user is a lowest privacy level.
- 10. The computer-implemented method of claim 1, wherein the one or more internal indicators comprise an overhead location indicator displaying a location of the bystander relative to the user in a two-dimensional overhead view.
- 11. The computer-implemented method of claim 10, wherein the overhead location indicator is displayed in response to determining that the privacy level associated with the user is a medium privacy level.
- 12. The computer-implemented method of claim 1, wherein the one or more internal indicators comprise a view of the bystander in the real-world environment, wherein the view is displayed at a location on the internal display, and wherein the location on the internal display corresponds to a location of the bystander in a field of view of the user.
- 13. The computer-implemented method of claim 12, wherein the view of the bystander is displayed in response to determining that the privacy level associated with the user is a highest privacy level.
- 14. The computer-implemented method of claim 1, wherein a monitoring indicator included in the headset includes a representation of the one or more internal indicators that are being presented to the user.
- 15. The computer-implemented method of claim 1, wherein the headset includes one or more external indicators that include a user activity indicator representing one or more activities of the user, and wherein a level of detail of the user activity indicator is based on the privacy level associated with the user.
- 16. The computer-implemented method of claim 15, wherein the user activity indicator includes a description of an activity in which the user is engaged.
- 17. The computer-implemented method of claim 16, wherein the description of the activity in which the user is engaged is displayed in response to determining that the privacy level associated with the user is a lowest privacy level.
- 18. The computer-implemented method of claim 15, wherein the user activity indicator includes a type of an activity in which the user is engaged, and the type of the activity is displayed in response to determining that the privacy level associated with the user is a medium privacy level.
- 19. One or more non-transitory computer-readable media storing instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of: determining a privacy level associated with a user of an extended reality environment; and presenting, using an internal display of a headset, one or more internal indicators identifying a location of a bystander located in a real-world environment, wherein a level of detail of each internal indicator included in the one or more internal indicators is based on the privacy level associated with the user.
- 20. A system, comprising: one or more memories that store instructions, and one or more processors that are coupled to the one or more memories and, when executing the instructions, are configured to perform the steps of: determining a privacy level associated with a user of an extended reality environment; and presenting, using an internal display of a headset, one or more internal indicators identifying a location of a bystander located in a real-world environment, wherein a level of detail of each internal indicator included in the one or more internal indicators is based on the privacy level associated with the user.
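The level-of-detail scheme in the dependent claims can be read as a simple mapping: as the user's privacy level rises, the internal bystander indicator becomes more detailed (claims 8–13), while the external user-activity indicator becomes less detailed (claims 16–18). The following Python sketch illustrates that mapping only; every name, value, and color in it is hypothetical and not taken from the application.

```python
# Illustrative sketch of the privacy-level-to-indicator mapping described in
# claims 8-13 and 16-18. All identifiers and values here are hypothetical.
from enum import Enum


class PrivacyLevel(Enum):
    LOWEST = 1
    MEDIUM = 2
    HIGHEST = 3


# Internal indicator detail grows with the user's privacy level (claims 8-13).
INTERNAL_INDICATOR = {
    PrivacyLevel.LOWEST: "side_location",      # edge-of-display cue (claims 8-9)
    PrivacyLevel.MEDIUM: "overhead_location",  # 2-D overhead map (claims 10-11)
    PrivacyLevel.HIGHEST: "bystander_view",    # view of the bystander (claims 12-13)
}

# External user-activity indicator detail shrinks as privacy rises (claims 16-18).
ACTIVITY_INDICATOR = {
    PrivacyLevel.LOWEST: "full_description",   # claim 17
    PrivacyLevel.MEDIUM: "activity_type",      # claim 18
    PrivacyLevel.HIGHEST: "none",              # hypothetical: no activity shown
}

# Claim 3: an LED whose color corresponds to the privacy level; colors are
# purely illustrative, as the application does not specify them.
LED_COLOR = {
    PrivacyLevel.LOWEST: "green",
    PrivacyLevel.MEDIUM: "yellow",
    PrivacyLevel.HIGHEST: "red",
}


def select_indicators(level: PrivacyLevel) -> tuple[str, str]:
    """Return the (internal indicator, external activity indicator) pair
    that the headset would present for a given privacy level."""
    return INTERNAL_INDICATOR[level], ACTIVITY_INDICATOR[level]
```

For example, `select_indicators(PrivacyLevel.MEDIUM)` would yield the overhead location indicator internally and the activity-type indicator externally, consistent with claims 11 and 18.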
Description
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of the co-pending U.S. Patent Application titled "PROVIDING AWARENESS OF PRIVACY-RELATED ACTIVITIES IN VIRTUAL AND REAL-WORLD ENVIRONMENTS," filed Aug. 30, 2023, and having Ser. No. 18/458,924, which claims benefit of the United States Provisional Patent Application titled "BALANCING BYSTANDER AND VR USER PRIVACY THROUGH AWARENESS CUES INSIDE AND OUTSIDE VR," filed Mar. 30, 2023, and having Ser. No. 63/493,285. The subject matter of this related application is hereby incorporated herein by reference.

BACKGROUND

Field of the Various Embodiments

Embodiments of the present disclosure relate generally to immersive computing environments and, more specifically, to providing awareness of privacy-related activities in virtual and real-world environments.

Description of the Related Art

Extended reality environments, such as Virtual Reality (VR) and Augmented Reality (AR) environments, are types of immersive environments that use a headset to present an audiovisual representation of a virtual environment to a user wearing the headset. A person wearing the headset is referred to herein as a user of the immersive computing environment. The headset presents images on a display screen visible to the user and can also produce sound using speakers. The images are rendered by a computing device, which can be located in the headset. The images include representations of virtual objects and/or virtual characters, such as avatars. The virtual characters can represent other people in the real-world environment who are physically located near the user and are detected by sensors such as a camera mounted on the headset. Other people in the real-world environment are referred to herein as "bystanders." Bystanders can be people who can be seen or heard by the user, or who can see or hear the user, for example.
The headset camera can capture an image of the real-world environment, and the headset can display the camera image to the VR user as a view of the real-world environment. However, the headset camera and microphone can observe and record video and/or audio of bystanders, and therefore can cause privacy concerns among the bystanders. Further, the headset camera can record video of a bystander, or a headset microphone can eavesdrop on a bystander, without the bystander knowing that they are being observed, overheard, and/or recorded.

The privacy of bystanders can be compromised by the camera or optical passthrough window on a headset that provides the user with a view of the real-world environment. For example, a passthrough video feature can be activated on a headset. Passthrough video uses the headset camera to provide a view of the real-world environment, including depictions of objects and of other people who are bystanders in the environment. As another example, a transparent portion of the headset can provide a view of a portion of the real-world environment in the field of vision of the user.

Bystanders can be unaware that they are being observed or recorded and potentially incorporated into a simulated immersive environment. For example, bystanders who do not see the headset are not informed that they are being observed or recorded. Even if bystanders do see the headset, they can be unaware that the headset is capable of capturing or recording images or video of the real-world environment.

Moreover, the headset camera and microphone are not always active. For example, when a VR user is immersed in a VR environment and there is no need to display information about the real-world environment, the headset camera and microphone can be deactivated. However, bystanders in the real-world environment who see a user wearing a headset can believe that they are being observed or recorded by the headset even when the camera or microphone is inactive.
One approach that has been implemented to inform bystanders that they are being observed or captured by a camera on a headset is to provide a Light-Emitting Diode (LED) on the headset that illuminates when the headset camera is being used to capture video of the real-world environment. However, the LED indicator is relatively small and is unlikely to be seen clearly by bystanders who are not in close physical proximity to the headset user. Further, bystanders can be unaware of the presence or meaning of the LED indicator, so the LED indicator is not an effective way to inform bystanders that they are being observed or captured by a camera on the headset.

Further, the privacy of the user wearing a headset can be affected by real-world bystanders that are in the physical environment. A bystander can overhear or otherwise observe a user without the user being aware that the bystander is present in the real-world environment. For example, a bystander standing to the side of or behind the user can be difficult to see or hear because the headset blocks the peripheral vision and/or hearing of the user wearing the