
EP-4739975-A1 - USER INTERFACE AND TARGETING SYSTEM

EP 4739975 A1

Abstract

A targeting system uses data streamed from multiple user interfaces (10). The targeting system comprises a digital representation (56) of a real-world environment (14) in which the multiple user interfaces (10) are being used by users (12). Data is obtained from the multiple user interfaces (10), wherein the data comprises, for each interface: location data giving the location of the user interface (10), orientation data giving a vector of a target axis (26) of the user interface, image data from a camera device (30) pointing in a direction that extends along the target axis (26), and an activation state of a trigger device (32) of the user interface (10). The digital representation (56) is kept continuously updated with the locations of the multiple user interfaces (10) and the vectors of the target axes (26) as the user interfaces (10) are moved by their users (12). The image data is used to determine the placement of real-world target objects, including the placement of users (12) of the user interfaces (10) and the placement of intervening objects in the real world. A simulated shot (21, 22, 23, 24) is fired by a first user (12) by activation of the trigger device (32) of their user interface (10). The targeting system determines if a path (52) of the simulated shot (21, 22, 23, 24) has passed within a certain distance of a second user (12) of a second user interface (10). This is done based on at least two of: the image data from the first user interface (10), the vector of the target axis (26) of the first user interface (10), and a relative location of the first and second user interfaces (10). The targeting system also simulates the effect of any intervening objects (16, 18) on the path (52) of the simulated shot (21, 22, 23, 24) based on penetration ability for the simulated shot (21, 22, 23, 24) and on determination of the properties of the real-world intervening objects (16, 18). The second user (12) receives feedback, e.g. audible feedback, if the path (52) of the simulated shot (21, 22, 23, 24) is determined to have passed within a certain distance of the second user (12).
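The near-miss test described in the abstract (whether the shot path passes within a certain distance of the second user) can be illustrated geometrically. The publication does not disclose a formula, so the following is a minimal sketch assuming straight-line shot paths in Cartesian coordinates; the function names and the 0.5 m default threshold are illustrative, not taken from the application:

```python
import math

def distance_to_ray(origin, direction, point):
    """Perpendicular distance from `point` to the ray starting at `origin`
    along unit vector `direction`; infinite if the point lies behind the ray."""
    # Vector from the shot origin to the candidate target
    v = [p - o for p, o in zip(point, origin)]
    # Projection length of v onto the shot direction
    t = sum(a * b for a, b in zip(v, direction))
    if t < 0:
        return math.inf  # target is behind the shooter
    # Closest point on the ray, then the Euclidean distance to it
    closest = [o + t * d for o, d in zip(origin, direction)]
    return math.dist(point, closest)

def near_miss(shooter_pos, target_axis, user_pos, certain_distance=0.5):
    """True if the simulated shot path passes within `certain_distance`
    of `user_pos`. `target_axis` need not be normalised."""
    norm = math.sqrt(sum(d * d for d in target_axis))
    unit = [d / norm for d in target_axis]
    return distance_to_ray(shooter_pos, unit, user_pos) <= certain_distance
```

In the full system this geometric result would only be one input, fused with the image data and relative location data as the claims describe.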

Inventors

  • TVERSLAND, Kjetil
  • FIDJE, Jahn, Thomas
  • ANDERSEN, Per-Arne

Assignees

  • Newbringer AS

Dates

Publication Date
2026-05-13
Application Date
2024-07-04

Claims (20)

  1. A method of operating a targeting system using data streamed from multiple user interfaces, the targeting system comprising a digital representation of a real-world environment in which the multiple user interfaces are being used; wherein the method includes: receiving data from the multiple user interfaces, wherein the data comprises, for each interface: location data giving the location of the user interface, orientation data giving a vector of a target axis of the user interface, image data from a camera device pointing in a direction that extends along the target axis, and an activation state of a trigger device of the user interface; updating the digital representation with the locations of the multiple user interfaces and the vectors of the target axes as the user interfaces are moved by respective users thereof; using the image data to determine the placement of real-world target objects including the users of the user interfaces as well as using the image data and/or the digital representation to determine the placement of intervening objects in the real world; registering an activation of a trigger device by a first user of a first user interface, wherein activation of the trigger device indicates that a simulated shot has been fired; determining if a path of the simulated shot has passed within a certain distance of a second user of a second user interface based on at least two of: the image data from the first user interface, the vector of the target axis of the first user interface, and a relative location of the first and second user interfaces based on the location data; simulating the effect of any intervening objects on the path of the simulated shot based on an assessment of penetration ability for the simulated shot and based on determination of the properties of the real-world intervening objects; and providing feedback to the second user if the path of the simulated shot is determined to have passed within a certain distance of the second user.
  2. A targeting system configured to receive and use data streamed from multiple user interfaces, wherein the data comprises, for each interface: location data giving the location of the user interface, orientation data giving a vector of a target axis of the user interface, image data from a camera device pointing in a direction that extends along the target axis, and an activation state of a trigger device of the user interface; wherein the targeting system comprises: a digital representation of a real-world environment in which the multiple user interfaces are being used; a digital twin sub-system for updating the digital representation with the locations of the multiple user interfaces and the vectors of the target axes as the user interfaces are moved by respective users thereof; an object tracking sub-system configured to use the image data to determine the placement of real-world target objects including the users of the user interfaces as well as to use the image data and/or the digital representation to determine the placement of intervening objects in the real world; a simulated shot sub-system configured to: register an activation of a trigger device by a first user of a first user interface, wherein activation of the trigger device indicates that a simulated shot has been fired; determine if a path of the simulated shot has passed within a certain distance of a second user of a second user interface based on at least two of: the image data from the first user interface, the vector of the target axis of the first user interface, and a relative location of the first and second user interfaces based on the location data; and simulate the effect of any intervening objects on the path of the simulated shot based on an assessment of penetration ability for the simulated shot and based on determination of the properties of the real-world intervening objects; and a user feedback sub-system for providing feedback to the second user if the path of the simulated shot is determined to have passed within the certain distance of the second user.
  3. A method or system as claimed in claim 1 or 2, wherein determining if the path of the simulated shot has passed within a certain distance of a user is based on all of the image data, the vector of the target axis of the first user interface, and the relative location of the first and second user interfaces.
  4. A method or system as claimed in claim 1, 2 or 3, wherein if it is determined that the path of the simulated shot will pass through an intervening object such that the second user is not visible in image data from the first user interface then the location of the second user is determined based on the relative location of the first and second user interfaces, image data from the second user interface, and optionally image data from further user interfaces in which the second user is visible.
  5. A method or system as claimed in any preceding claim, wherein the quality of the image data is assessed before determining if a simulated shot has passed within a certain distance of a user, and wherein in the event of potential inaccuracies in the image data then the location of the path of the simulated shot relative to the second user is determined based on the target axis vector and the relative location without use of the image data.
  6. A method or system as claimed in any preceding claim, wherein the location data allows the location of the user interface to be known to an accuracy of 20 cm or better.
  7. A method or system as claimed in any preceding claim, wherein the feedback to the second user includes audible feedback, and wherein the content of the audible feedback is determined based on one or more of: the simulated weapon and/or projectile type, the distance of the path of the simulated shot away from the user, the location of the path of the simulated shot relative to the user, and/or features of the surrounding environment.
  8. A method or system as claimed in any preceding claim, wherein each user interface provides audible feedback for simulated shots that are not aimed at the respective user but that strike objects within an audible range, including sounds relating to impacts of simulated shots fired by the respective user from their own user interface.
  9. A method or system as claimed in any preceding claim, wherein registering an activation of a trigger device includes registering a timestamp for the activation, with this timestamp then being used to identify relevant data for determining if the path of the simulated shot has passed within a certain distance of the second user.
  10. A method or system as claimed in any preceding claim, wherein the length of the certain distance is determined based on simulated shots that would be detectable events in the real world, and wherein the length of the certain distance varies depending on the nature of the simulated weapon and/or on the environment surrounding the user.
  11. A method or system as claimed in any preceding claim, wherein the image data is used to determine relevant features of the surrounding environment and/or for identification of the users of the user interfaces.
  12. A method or system as claimed in any preceding claim, wherein the image data from all of the user interfaces is used to determine the placement of real-world target objects in the field of view of the camera device, and wherein the digital representation of the real-world environment is used for mapping of the real-world target objects as well as other features of the surrounding environment.
  13. A method or system as claimed in any preceding claim, wherein the determination of the placement of target objects that are users of other user interfaces includes not only location but also the form of the users of other user interfaces in the image data in order that a determination of a simulated shot making a hit on such a user can include detail of the location of the hit on that user’s body.
  14. A method or system as claimed in any preceding claim, wherein in addition to visible light the camera device uses thermal imaging.
  15. A method or system as claimed in any preceding claim, wherein the user interface includes a smartphone that is used to provide the camera device and/or for data transmission.
  16. A computer programme product comprising instructions that, when executed, will configure a computer system to operate in accordance with the method of any of claims 1 or 3 to 15.
  17. A user interface for a targeting system as claimed in any of claims 2 to 15, the user interface comprising: a target axis; an orientation sensing system for determining a vector of the target axis; a positioning system for determining the location of the user interface; an image handling system for receiving image data from a camera device pointing in a direction that extends along the target axis, wherein the image data allows for determination of placement of real-world target objects including the placement of users of other user interfaces and the placement of intervening objects in the real world; a trigger device for activation to indicate when the user of the user interface intends to fire a simulated shot; and a user feedback device for providing feedback to the user if a path of a simulated shot fired via another user interface is determined to have passed within a certain distance of the user.
  18. A user interface as claimed in claim 17, wherein the positioning system comprises a Global Positioning System, GPS, device with Real-time Kinematic Positioning, RTK.
  19. A user interface as claimed in claim 17 or 18, wherein the positioning system uses inertial positioning data obtained via sensors of the orientation sensing system.
  20. A targeting system as claimed in any of claims 2 to 15, the targeting system comprising multiple user interfaces as claimed in any of claims 17 to 19.
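Claim 1 requires simulating the effect of intervening objects on the shot path based on an assessed penetration ability and the determined properties of those objects. The application discloses no specific cost model, so the following sketch is entirely illustrative: it assumes each object on the path consumes penetration in proportion to a hypothetical thickness-times-resistance cost, with all field names and units invented for the example:

```python
def shot_reaches_target(intervening, penetration_budget):
    """Walk the objects lying on the shot path, in order from the shooter.

    Each object consumes penetration ability according to an assumed cost of
    thickness (m) times a material resistance factor. Returns a tuple
    (reaches_target, remaining_budget). Illustrative model only.
    """
    remaining = penetration_budget
    for obj in intervening:
        cost = obj["thickness_m"] * obj["resistance"]  # assumed cost model
        remaining -= cost
        if remaining <= 0:
            # Shot is stopped inside this object
            return False, 0.0
    return True, remaining

# Hypothetical examples: a thin plywood panel versus a concrete wall
plywood = [{"thickness_m": 0.02, "resistance": 10.0}]
concrete = [{"thickness_m": 0.2, "resistance": 50.0}]
```

A real implementation would derive the per-object properties from the image data and/or the digital representation, as the claims describe, rather than from hand-entered values.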

Description

USER INTERFACE AND TARGETING SYSTEM

The present invention relates to a user interface, a targeting system using multiple user interfaces, and a method of operating the targeting system. Computer programme products are provided for implementing the method. The targeting system uses data from the multiple user interfaces, which may each be a simulated weapon and/or may be a part of a weapons training system.

It is known to use simulated weapons or weapons training systems for various purposes, such as for leisure activities or for training/assessment purposes. This may involve training/assessment in the use of weapons or training in strategy via “wargames”. User interfaces with similar capabilities may be used in leisure games such as laser tag. It is commonplace for such user interfaces to use line-of-sight targeting systems like lasers or infrared beams. This then requires a target device such as a reflector or receiver of some sort in order that it can be determined whether a targeted object has been hit.

Various types of simulated weapon have been developed alongside the increased usage of smart devices (internet-of-things) and smartphones. These simulated weapons involve a gun-like body that is provided with a smartphone cradle, and they are used with a game that operates on the smartphone. An example of this type of toy gun is disclosed in CN203483841U. Such systems can make use of the smartphone camera along with augmented reality to provide virtual targets overlaid on a real-world image and/or to identify real-world objects to be the targets within the game. The user can aim at the target by pointing the gun, with its orientation being detected using motion sensors on the gun or on the smartphone. A normal smartphone GPS can be used to identify location in order to allow for location-specific game features or tracking of the user.
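The orientation sensing mentioned above (motion sensors on the gun or smartphone) ultimately yields a direction for the target axis. As a minimal sketch of that step, the target-axis unit vector can be derived from yaw and pitch angles; the angle convention (yaw about the vertical axis, pitch positive upwards) and function name are illustrative assumptions, not details from the publication:

```python
import math

def target_axis_vector(yaw_deg, pitch_deg):
    """Unit vector of the weapon's target axis from yaw/pitch angles.

    Assumed convention: x points along yaw 0, yaw rotates about the
    vertical z axis, and positive pitch tilts the axis upwards.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```

In practice the raw sensor stream would be filtered (e.g. by sensor fusion of gyroscope, accelerometer, and magnetometer readings) before producing such a vector, but the geometry of the final step is as above.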
It has also been proposed to link such systems with other devices in order to provide an improved gaming experience, such as by means of Bluetooth-connected headgear or clothing as disclosed in CN109126118A.

An example of a weapons training system for military use is described in US20150354922A1. This discloses a laser targeting system using modified firearms shooting blanks rather than live rounds. A vest is worn with several target devices that can emit light and sound when hit. Via the use of TENS (Transcutaneous Electrical Nerve Stimulation) units in the vests, the system can give more realistic conditions in terms of stress and the reaction to stress.

Another example of a known targeting system is found in US 2014/178841 A1, which describes a laser-less targeting system where a hit/miss is determined making use of the orientation/position of the weapon alongside image data from a camera that is aligned with the target axis of the weapon. The path of the bullet is calculated using ballistics formulae, and the image data enables an assessment of whether the target is visible at the point where the bullet will strike.

It will however be appreciated that a need remains for improvements in user interfaces and targeting systems for such purposes. It would be an advantage to provide a system that could better simulate a real-world scenario and/or to provide hardware elements for such a system.
Viewed from a first aspect, the present invention provides a method of operating a targeting system using data streamed from multiple user interfaces, the targeting system comprising a digital representation of a real-world environment in which the multiple user interfaces are being used; wherein the method includes: receiving data from the multiple user interfaces, wherein the data comprises, for each interface: location data giving the location of the user interface, orientation data giving a vector of a target axis of the user interface, image data from a camera device pointing in a direction that extends along the target axis, and an activation state of a trigger device of the user interface; updating the digital representation with the locations of the multiple user interfaces and the vectors of the target axes as the user interfaces are moved by respective users thereof; using the image data to determine the placement of real-world target objects including the users of the user interfaces as well as using the image data and/or the digital representation to determine the placement of intervening objects in the real world; registering an activation of a trigger device by a first user of a first user interface, wherein activation of the trigger device indicates that a simulated shot has been fired; determining if a path of the simulated shot has passed within a certain distance of a second user of a second user interface based on at least two of: the image data from the first user interface, the vector of the target axis of the first user interface, and a relative location of the first and second user interfaces based on the location data; simulating the effect of any intervening object