US-12623756-B1 - Systems and methods for visualizing a virtual boundary of a marine vessel in an extended reality scene that includes a non-virtual marine vessel
Abstract
A system for visualizing a virtual boundary in a scene that includes a non-virtual marine vessel is provided, the system comprising: a display device; and one or more hardware processors configured to: receive information indicative of a boundary position at which to present a virtual boundary within an extended reality (XR) scene with respect to the marine vessel; present, using the display device, the XR scene including the virtual boundary based on the boundary position, wherein a size and view of the virtual boundary in the XR scene is based on a viewing perspective at which the XR scene is viewed; and update the XR scene as the marine vessel moves.
Inventors
- Efrain Rosario-Gonzalez
- Trevor George
- Aadish Dilip Naik
Assignees
- BRUNSWICK CORPORATION
Dates
- Publication Date: 2026-05-12
- Application Date: 2024-02-05
Claims (20)
- 1 . A system for visualizing a virtual boundary in a scene that includes a non-virtual marine vessel, the system comprising: a display device; and one or more hardware processors configured to: receive information indicative of a boundary position at which to present a virtual boundary within an extended reality (XR) scene with respect to the marine vessel; present, using the display device, the XR scene including the virtual boundary based on the boundary position, wherein a size and view of the virtual boundary in the XR scene is based on a viewing perspective at which the XR scene is viewed and an orientation of the viewing perspective with respect to a water surface in a marine environment of the marine vessel such that the virtual boundary appears to be on the water surface in the XR scene at the boundary position; and update the XR scene as the marine vessel moves such that the virtual boundary appears to remain on the water surface in the XR scene.
- 2 . The system of claim 1 , the system further comprising: a plurality of cameras, including at least a first camera and a second camera, wherein each of the plurality of cameras is configured to be mounted to the marine vessel with an associated field of view of an environment of the marine vessel, and wherein each of the plurality of cameras is associated with a three-dimensional camera coordinate system; wherein the one or more hardware processors are further configured to: receive virtual boundary data; determine that the virtual boundary is at least partially within the field of view of the first camera; determine a position of the virtual boundary within the camera coordinate system associated with the first camera; render a two-dimensional image of the virtual boundary based on the virtual boundary data, the position of the virtual boundary within the camera coordinate system associated with the first camera, and a viewing perspective of the first camera; receive image data from the first camera; generate a view of the XR scene based on the image data and the two-dimensional image of the virtual boundary, such that the virtual boundary appears to be present in the view of the XR scene at the boundary position; and present, using the display device, the view of the XR scene.
- 3 . The system of claim 2 , wherein the one or more hardware processors are further configured to: determine that the marine vessel has moved with respect to the position of the virtual boundary within the camera coordinate system associated with the first camera; determine an updated position of the virtual boundary within the camera coordinate system associated with the first camera based on movement of the marine vessel; render a second two-dimensional image of the virtual boundary based on the updated position of the virtual boundary and an updated viewing perspective of the first camera; receive additional image data from the first camera; generate an updated view of the XR scene based on the additional image data and the second two-dimensional image of the virtual boundary, such that the virtual boundary appears to be present in the updated view of the XR scene at the boundary position; and present, using the display device, the updated view of the XR scene.
- 4 . The system of claim 2 , wherein the display device comprises a multi-function display of the marine vessel.
- 5 . The system of claim 1 , the system further comprising: a plurality of cameras, including at least a first camera and a second camera, wherein each of the plurality of cameras is configured to be mounted to the marine vessel with an associated field of view of an environment of the marine vessel, and wherein each of the plurality of cameras is associated with a three-dimensional camera coordinate system; wherein the one or more hardware processors are further configured to: generate, using a real-time graphics engine, a digital twin of the marine vessel, wherein the digital twin comprises a plurality of virtual cameras, including at least a first virtual camera and a second virtual camera, wherein each of the plurality of virtual cameras is positioned with respect to the digital twin to have an associated field of view of a virtual environment of the digital twin of the marine vessel that corresponds to a field of view associated with one of the plurality of cameras; receive virtual boundary data; place the virtual boundary in the virtual environment with a virtual boundary position corresponding to the boundary position; determine that the virtual boundary is at least partially within the field of view of the first virtual camera; render, using the real-time graphics engine, a two-dimensional image of the virtual environment from a point of view of the first virtual camera based on the virtual boundary data; receive image data from the first camera; generate a view of the XR scene based on the image data and the two-dimensional image of the virtual environment, such that the virtual boundary appears to be present in the view of the XR scene at the boundary position; and present, using the display device, the view of the XR scene.
- 6 . The system of claim 1 , wherein the display device comprises a display of a wearable extended reality device, and wherein the system further comprises a head tracking device in communication with at least one hardware processor of the one or more hardware processors.
- 7 . The system of claim 6 , wherein the one or more hardware processors are further configured to: receive virtual boundary data; determine a position and orientation of the wearable extended reality device with respect to the marine vessel based at least in part on data from the head tracking device; determine a position and orientation of a virtual object in a reference coordinate system associated with the wearable extended reality device; determine that the virtual boundary position is at least partially within a field of view of the wearable extended reality device based on the boundary position; render an image of the virtual boundary based on the virtual boundary data, the position and orientation of the virtual object within the reference coordinate system, and the position and orientation of the wearable extended reality device; and present, using the display device, the rendered image of the virtual boundary in a position on the display device such that the virtual boundary appears to be present in the view of the XR scene at the boundary position.
- 8 . The system of claim 1 , wherein the display device comprises a display of a heads-up display (HUD) device integrated into a windshield of the marine vessel.
- 9 . The system of claim 8 , wherein the one or more hardware processors are further configured to: receive virtual boundary data; determine the viewing perspective from which a user is viewing the HUD; determine a position and orientation of the virtual boundary in a reference coordinate system associated with the HUD and the viewing perspective of the user; determine that the virtual boundary is at least partially within a field of view of the user via the HUD; render an image of the virtual boundary based on the position of the virtual boundary within the reference coordinate system; and present, using the display device, the rendered image of the virtual boundary in a position on the HUD such that the virtual boundary appears to the user to be present in the view of the XR scene at the boundary position.
- 10 . The system of claim 1 , wherein the one or more hardware processors are further configured to: receive, from an autonomy system of the marine vessel, virtual boundary data comprising a shape and size of the virtual boundary.
- 11 . A method for visualizing a virtual boundary in a scene that includes a non-virtual marine vessel, the method comprising: receiving information indicative of a boundary position at which to present a virtual boundary within an extended reality (XR) scene with respect to the marine vessel; presenting, using a display device, the XR scene including the virtual boundary based on the boundary position, wherein a size and view of the virtual boundary in the XR scene is based on a viewing perspective at which the XR scene is viewed and an orientation of the viewing perspective with respect to a water surface in a marine environment of the marine vessel such that the virtual boundary appears to be on the water surface in the XR scene at the boundary position; and updating the XR scene as the marine vessel moves such that the virtual boundary appears to remain on the water surface in the XR scene.
- 12 . The method of claim 11 , further comprising: receiving virtual boundary data; determining that the virtual boundary is at least partially within the field of view of a first camera of a plurality of cameras, wherein the plurality of cameras includes at least the first camera and a second camera, wherein each of the plurality of cameras is configured to be mounted to the marine vessel with an associated field of view of an environment of the marine vessel, and wherein each of the plurality of cameras is associated with a three-dimensional camera coordinate system; determining a position of the virtual boundary within the camera coordinate system associated with the first camera; rendering a two-dimensional image of the virtual boundary based on the virtual boundary data, the position of the virtual boundary within the camera coordinate system associated with the first camera, and a viewing perspective of the first camera; receiving image data from the first camera; generating a view of the XR scene based on the image data and the two-dimensional image of the virtual boundary, such that the virtual boundary appears to be present in the view of the XR scene at the boundary position; and presenting, using the display device, the view of the XR scene.
- 13 . The method of claim 12 , further comprising: determining that the marine vessel has moved with respect to the position of the virtual boundary within the camera coordinate system associated with the first camera; determining an updated position of the virtual boundary within the camera coordinate system associated with the first camera based on movement of the marine vessel; rendering a second two-dimensional image of the virtual boundary based on the updated position of the virtual boundary and an updated viewing perspective of the first camera; receiving additional image data from the first camera; generating an updated view of the XR scene based on the additional image data and the second two-dimensional image of the virtual boundary, such that the virtual boundary appears to be present in the updated view of the XR scene at the boundary position; and presenting, using the display device, the updated view of the XR scene.
- 14 . The method of claim 12 , wherein the display device comprises a multi-function display of the marine vessel.
- 15 . The method of claim 11 , further comprising: generating, using a real-time graphics engine, a digital twin of the marine vessel, wherein the digital twin comprises a plurality of virtual cameras, including at least a first virtual camera and a second virtual camera, wherein each of the plurality of virtual cameras is positioned with respect to the digital twin to have an associated field of view of a virtual environment of the digital twin of the marine vessel that corresponds to a field of view associated with one of a plurality of cameras associated with the marine vessel, wherein the plurality of cameras includes at least the first camera and a second camera, wherein each of the plurality of cameras is configured to be mounted to the marine vessel with an associated field of view of an environment of the marine vessel, and wherein each of the plurality of cameras is associated with a three-dimensional camera coordinate system; receiving virtual boundary data; placing the virtual boundary in the virtual environment with a virtual boundary position corresponding to the boundary position; determining that the virtual boundary is at least partially within the field of view of the first virtual camera; rendering, using the real-time graphics engine, a two-dimensional image of the virtual environment from a point of view of the first virtual camera based on the virtual boundary data; receiving image data from the first camera; generating a view of the XR scene based on the image data and the two-dimensional image of the virtual environment, such that the virtual boundary appears to be present in the view of the XR scene at the boundary position; and presenting, using the display device, the view of the XR scene.
- 16 . The method of claim 11 , wherein the display device comprises a display of a wearable extended reality device, and wherein the wearable extended reality device further comprises a head tracking device in communication with at least one hardware processor.
- 17 . The method of claim 16 , further comprising: receiving virtual boundary data; determining a position and orientation of the wearable extended reality device with respect to the marine vessel based at least in part on data from the head tracking device; determining a position and orientation of a virtual object in a reference coordinate system associated with the wearable extended reality device; determining that the virtual boundary position is at least partially within a field of view of the wearable extended reality device based on the boundary position; rendering an image of the virtual boundary based on the virtual boundary data, the position and orientation of the virtual object within the reference coordinate system, and the position and orientation of the wearable extended reality device; and presenting, using the display device, the rendered image of the virtual boundary in a position on the display device such that the virtual boundary appears to be present in the view of the XR scene at the boundary position.
- 18 . The method of claim 11 , wherein the display device comprises a display of a heads-up display (HUD) device integrated into a windshield of the marine vessel.
- 19 . The method of claim 18 , further comprising: receiving virtual boundary data; determining the viewing perspective from which a user is viewing the HUD; determining a position and orientation of the virtual boundary in a reference coordinate system associated with the HUD and the viewing perspective of the user; determining that the virtual boundary is at least partially within a field of view of the user via the HUD; rendering an image of the virtual boundary based on the position of the virtual boundary within the reference coordinate system; and presenting, using the display device, the rendered image of the virtual boundary in a position on the HUD such that the virtual boundary appears to the user to be present in the view of the XR scene at the boundary position.
- 20 . The method of claim 11 , further comprising receiving, from an autonomy system of the marine vessel, virtual boundary data comprising a shape and size of the virtual boundary.
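Claims 2, 3, 12, and 13 describe determining the position of the virtual boundary within a camera's three-dimensional coordinate system and rendering a two-dimensional image of it from that camera's viewing perspective. The sketch below is illustrative only and is not the patented implementation: it transforms boundary points defined on the water plane in a vessel-fixed frame into a camera frame and applies a simple pinhole projection (the function name, frame conventions, and intrinsics parameters are assumptions for illustration).

```python
# Illustrative sketch (not the patented implementation): projecting points of a
# virtual boundary, defined on the water plane in a vessel-fixed frame, into the
# 2-D image of a vessel-mounted camera using a simple pinhole model.
import numpy as np

def project_boundary(points_vessel, R_cam, t_cam, fx, fy, cx, cy):
    """Transform vessel-frame boundary points into the camera coordinate
    system (R_cam columns = camera axes in the vessel frame, t_cam = camera
    mounting position), then apply a pinhole projection with intrinsics
    fx, fy, cx, cy. Points behind the camera are dropped."""
    pts = np.asarray(points_vessel, dtype=float)      # (N, 3) vessel-frame points
    p_cam = (pts - t_cam) @ R_cam                     # vessel frame -> camera frame: R^T (p - t)
    p = p_cam[p_cam[:, 2] > 0]                        # keep points in front of the camera
    u = fx * p[:, 0] / p[:, 2] + cx                   # pinhole projection to pixel coords
    v = fy * p[:, 1] / p[:, 2] + cy
    return np.stack([u, v], axis=1)                   # (M, 2) pixel positions
```

Re-running this projection with an updated boundary position as the vessel moves, as in claims 3 and 13, keeps the rendered boundary registered to the live camera image.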
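Claims 2, 5, 12, and 15 then generate a view of the XR scene from the camera's image data and the rendered two-dimensional image of the virtual boundary. The claims do not prescribe a blend mode; the sketch below assumes standard per-pixel alpha compositing as one way such a combined view could be formed.

```python
# Illustrative sketch (standard alpha compositing, assumed here; the claims do
# not prescribe a blend mode): combining a live camera frame with a rendered
# 2-D image of the virtual boundary to form one view of the XR scene.
import numpy as np

def composite_xr_view(frame, overlay_rgb, overlay_alpha):
    """frame, overlay_rgb: (H, W, 3) float arrays in [0, 1];
    overlay_alpha: (H, W) coverage mask in [0, 1] (0 where no boundary
    pixel was rendered). Returns the blended XR view."""
    a = overlay_alpha[..., None]            # add channel axis to broadcast over RGB
    return a * overlay_rgb + (1.0 - a) * frame
```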
Description
FIELD
The present disclosure generally relates to systems and methods for visualizing a virtual boundary of a marine vessel in an extended reality scene that includes a non-virtual marine vessel.
BACKGROUND
The following U.S. Patents are incorporated herein by reference, in their entirety:
U.S. Pat. No. 9,927,520 discloses a method of detecting a collision of a marine vessel, including using distance sensors to determine whether an object is within a predefined distance of the marine vessel, and determining a direction of the object with respect to the marine vessel. The method further includes receiving a propulsion control input at a propulsion control input device, and determining whether execution of the propulsion control input will result in any portion of the marine vessel moving toward the object. A collision warning is then generated.
U.S. Pat. No. 11,373,537 discloses a propulsion control system on a marine vessel that includes at least one propulsion device configured to propel the marine vessel, at least one input device manipulatable to provide user control input to control a movement direction and velocity of the marine vessel, at least one proximity sensor system configured to generate proximity measurements describing a proximity of an object with respect to the marine vessel, and a controller. The controller is configured to limit user input authority over propulsion output in the direction of the object by the at least one propulsion device based on the proximity measurements so as to maintain the marine vessel at least a buffer distance from the object, and then to suspend the maintenance of the buffer distance from the object upon receipt of a user-generated instruction to do so. Upon receipt of a user control input via the user input device to move the marine vessel in the direction of the object, the controller controls the at least one propulsion device based on the user control input such that the marine vessel approaches and impacts the object.
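The '520 patent above determines whether execution of a propulsion control input will move any portion of the vessel toward a detected object. One simple way to sketch that check, assuming a vessel-frame surge/sway command and an angular tolerance (both the helper and the half-angle criterion are assumptions, not taken from the patent text):

```python
# Illustrative sketch (assumed helper, not from the patent text): deciding
# whether a commanded motion direction moves the vessel toward a detected
# object, in the spirit of the collision-warning scheme of U.S. Pat. No. 9,927,520.
import math

def moves_toward(object_bearing_deg, cmd_surge, cmd_sway, half_angle_deg=45.0):
    """object_bearing_deg: direction of the object relative to the bow.
    cmd_surge / cmd_sway: commanded motion in the vessel frame (forward, starboard).
    Returns True if the commanded motion direction lies within
    half_angle_deg of the object's bearing."""
    if cmd_surge == 0 and cmd_sway == 0:
        return False                                   # no motion commanded
    cmd_bearing = math.degrees(math.atan2(cmd_sway, cmd_surge))
    diff = (object_bearing_deg - cmd_bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(diff) <= half_angle_deg
```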
U.S. Pat. No. 11,403,955 discloses a propulsion control system on a marine vessel that includes at least one propulsion device configured to propel the marine vessel and at least one proximity sensor system configured to generate proximity measurements describing a proximity of an object with respect to the marine vessel. The system further includes a controller configured to receive proximity measurements, access a preset buffer distance, and calculate a velocity limit in a direction of the object for the marine vessel based on the proximity measurements and the preset buffer distance so as to progressively decrease the velocity limit as the marine vessel approaches the preset buffer distance from the object.
U.S. patent application Ser. No. 18/302,602 discloses a system configured to assist a user in identifying an object in an area outside a marine vessel. An image sensor is configured to collect imaging data for the area outside the marine vessel. A display device is configured to generate a display on a windshield assembly. The windshield assembly comprises a windshield and a frame adjacent to at least one side of the windshield. A control system is configured to analyze the imaging data to identify an object within the area outside the marine vessel and to control the display device to generate a display to visually indicate the object on the windshield assembly, where the display assists the user positioned at a helm of the marine vessel in identifying the object through the windshield.
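The '955 patent above progressively decreases a velocity limit as the vessel closes on a preset buffer distance from the object. The patent summary does not specify the exact control law; the sketch below assumes a simple linear ramp purely for illustration.

```python
# Illustrative sketch (linear ramp assumed; the patent does not specify the
# exact law): a velocity limit toward an object that decreases to zero as the
# vessel approaches a preset buffer distance, per U.S. Pat. No. 11,403,955.
def velocity_limit(distance_m, buffer_m, v_max, ramp_m):
    """Full speed beyond buffer_m + ramp_m; linearly decreasing across the
    ramp; zero at or inside the buffer distance."""
    if distance_m <= buffer_m:
        return 0.0                                    # at/inside buffer: no closing velocity
    if distance_m >= buffer_m + ramp_m:
        return v_max                                  # far away: unrestricted
    return v_max * (distance_m - buffer_m) / ramp_m   # linear taper in between
```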
SUMMARY
In accordance with some embodiments of the disclosed subject matter, a system for visualizing a virtual boundary in a scene that includes a non-virtual marine vessel is provided, the system comprising: a display device; and one or more hardware processors configured to: receive information indicative of a boundary position at which to present a virtual boundary within an extended reality (XR) scene with respect to the marine vessel; present, using the display device, the XR scene including the virtual boundary based on the boundary position, wherein a size and view of the virtual boundary in the XR scene is based on a viewing perspective at which the XR scene is viewed; and update the XR scene as the marine vessel moves. In some embodiments, the system further comprises: a plurality of cameras, including at least a first camera and a second camera, wherein each of the plurality of cameras is configured to be mounted to the marine vessel with an associated field of view of an environment of the marine vessel, and wherein each of the plurality of cameras is associated with a three-dimensional camera coordinate system; wherein the one or more hardware processors are further configured to: receive virtual boundary data; determine that the virtual boundary is at least partially within the field of view of the first camera; determine a position of the virtual boundary within the camera coordinate system associated with the first camera; render a two-dimens