EP-4266258-B1 - INTERACTIVE PROJECTION INPUT/OUTPUT DEVICE


Inventors

  • GAO, YONG

Dates

Publication Date
2026-05-06
Application Date
2021-12-06

Claims (15)

  1. An interactive projection input and output device (10), comprising: a projecting module (110), configured to display a screen mirroring content from an external device (20) in a projection region of a projection picture, the projection region corresponding to the external device (20); an operation capturing module (120), configured to: form an operation capture scope corresponding to the projection region; establish a first coordinate system within the operation capture scope; capture a user operation behavior occurring within the operation capture scope; and determine a first coordinate of a user operation point corresponding to the user operation behavior in the first coordinate system; an instruction generating module (130), configured to: obtain a relative position relationship between the user operation point and the projection region according to the first coordinate of the user operation point and regional position information of the projection region in the first coordinate system; and generate a position information instruction according to the relative position relationship; and a wireless transmission module (140), configured to: receive screen mirroring information from the external device (20) through a downlink wireless connection, wherein the screen mirroring information comprises the screen mirroring content; and send the position information instruction to the external device (20) through an uplink wireless connection, such that an operating system of the external device (20) calls an underlying driver program to execute the position information instruction, so as to achieve a control operation in response to the user operation behavior and thus operate the external device (20); the screen mirroring information further comprises an operating system type of the external device (20), and at least one of picture vertex coordinates or a screen resolution of the external device (20); wherein the picture vertex coordinates or the screen resolution is used to instruct the instruction generating module (130) to: generate a second coordinate of a virtual operation point to be displayed by the external device (20) according to the picture vertex coordinates or the screen resolution, and the relative position relationship, such that the relative position relationship is satisfied between the virtual operation point and a screen of the external device (20); and generate the position information instruction identifiable by the operating system of the external device (20) and for performing the control operation at the virtual operation point according to the second coordinate; and wherein the operating system type is used to instruct the instruction generating module (130) to generate the position information instruction matched with the operating system type according to the relative position relationship.
  2. The device of claim 1, wherein the projecting module (110) comprises: a projection function assembly (1101), configured to project the screen mirroring content; and a projection controller (1102), configured to: adjust a position, an angle and/or a size of the projection region in response to receiving a projection adjustment instruction from the instruction generating module (130); and perform spectral correction favorable for eyesight protection according to optoelectronic information contained in the screen mirroring content.
  3. The device of claim 1, wherein the operation capturing module (120) comprises an operation detector (1201) which is disposed to form the operation capture scope corresponding to the projection region, and the operation detector (1201) is configured to: capture the user operation behavior occurring within the operation capture scope according to detected environmental physical signals; determine the first coordinate of the user operation point corresponding to the user operation behavior in the first coordinate system; and identify and capture the regional position information of the projection region in the first coordinate system.
  4. The device of claim 1, wherein the operation capturing module (120) comprises: a detection signal generator (1202), configured to form the operation capture scope corresponding to the projection region by sending detection signals; and an operation detector (1201), configured to: capture the user operation behavior occurring within the operation capture scope by detecting the detection signals; determine the first coordinate of the user operation point corresponding to the user operation behavior in the first coordinate system; and identify and capture the regional position information of the projection region in the first coordinate system.
  5. The device of claim 1, wherein the instruction generating module (130) comprises an application specific integrated circuit chip having a hardware computing circuit formed thereon by burning.
  6. The device of claim 1, wherein the screen mirroring information further comprises at least one of an eccentric distance, a horizontal and vertical screen orientation, a data rate, or a frame rate; wherein, the horizontal and vertical screen orientation is used to instruct the instruction generating module (130) to adjust the relative position relationship; the data rate and the frame rate are used jointly to calculate the screen resolution and the horizontal and vertical screen orientation; the eccentric distance is used to correct the second coordinate of the virtual operation point.
  7. The device of claim 1, wherein, the external device (20) is plural in number, the projecting module (110) is specifically configured to respectively project screen mirroring contents from a plurality of external devices to projection regions of the projection picture corresponding to the plurality of external devices; the instruction generating module (130) is specifically configured to: determine a specific projection region where the user operation point is located, according to the first coordinate of the user operation point and regional position information of each of the projection regions in the first coordinate system; and generate a position information instruction for a specific external device corresponding to the specific projection region according to the relative position relationship between the user operation point and the specific projection region; the wireless transmission module (140) is specifically configured to, through wireless connections respectively established with the plurality of external devices, receive screen mirroring information of the plurality of external devices, wherein the screen mirroring information comprises the screen mirroring contents; and send the position information instruction for the specific external device from the instruction generating module (130) to the specific external device.
  8. The device of claim 7, wherein in response to identifying a gesture operation corresponding to a file sharing function, the instruction generating module (130) is further configured to: determine an external device (20) as a sharing end and position information of a shared file to be displayed in the external device (20) as the sharing end, based on a start point of the gesture operation, determine an external device (20) as a receiving end based on a swipe direction or endpoint of the gesture operation, and generate a file sharing position information instruction, wherein the file sharing position information instruction is used to share the shared file from the external device (20) as the sharing end to the external device (20) as the receiving end.
  9. The device of claim 8, wherein the wireless transmission module (140) is further configured to: send the file sharing position information instruction to the external device (20) as the sharing end, such that the external device (20) as the sharing end sends the shared file to the external device (20) as the receiving end through a wireless connection established with the external device (20) as the receiving end.
  10. The device of claim 8, wherein the wireless transmission module (140) is further configured to: send the file sharing position information instruction to the external device (20) as the sharing end to cause the external device (20) as the sharing end to send the shared file to the wireless transmission module (140) through a wireless connection established with the wireless transmission module (140), such that the wireless transmission module (140) sends the shared file to the external device (20) as the receiving end through a wireless connection established with the external device (20) as the receiving end.
  11. The device of claim 1, wherein the downlink wireless connection established by the wireless transmission module (140) with the external device (20) comprises one or more of router, Bluetooth, NFC, UWB, ZigBee, LiFi, WiFi, NB-IOT, eMTC, LoRa, tuner, GNSS, LNB, cellular 3G, cellular 4G, cellular 5G, cellular 6G, or its own wireless hotspot.
  12. The device of claim 1, wherein the uplink wireless connection established by the wireless transmission module (140) with the external device (20) comprises one or more of Bluetooth, BLE, NFC, router, WiFi, NB-IOT, eMTC, LoRa, tuner, GNSS, LNB, Z-Wave, cellular 3G, cellular 4G, cellular 5G, cellular 6G or wireless USB.
  13. The device of claim 1, further comprising: a camera module (150), configured to acquire image information; and an audio module (160), configured to acquire audio information and play audio information received by the wireless transmission module (140) from the external device (20).
  14. The device of claim 1, wherein the projecting module (110) comprises an eyesight protection filter, and the eyesight protection filter comprises an eyesight protection filter lens or an eyesight protection filter film to reduce the density of high-energy blue light and/or a stroboscopic effect harmful to eyesight.
  15. The device of claim 1, wherein the device is integrated into glasses; and the projecting module (110) is configured to project the projection picture to retinas of human eyes or to lenses of the glasses or to air.
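The coordinate mapping at the heart of claim 1 — from a captured point in the first coordinate system, through the relative position relationship with the projection region, to a second coordinate on the external device's screen — can be sketched as follows. This is a minimal illustration only; the function and parameter names are hypothetical and do not appear in the patent, and refinements such as the eccentric-distance correction of claim 6 are omitted:

```python
def map_operation_point(point, region_origin, region_size, screen_res):
    """Map a user operation point captured in the first coordinate system
    to a virtual operation point (second coordinate) on the external
    device's screen, preserving the relative position relationship."""
    # Relative position of the point within the projection region (0..1).
    rel_x = (point[0] - region_origin[0]) / region_size[0]
    rel_y = (point[1] - region_origin[1]) / region_size[1]
    if not (0.0 <= rel_x <= 1.0 and 0.0 <= rel_y <= 1.0):
        return None  # outside the projection region: no instruction generated
    # Second coordinate, scaled to the device's reported screen resolution.
    return (round(rel_x * (screen_res[0] - 1)),
            round(rel_y * (screen_res[1] - 1)))
```

A position information instruction would then wrap this second coordinate in whatever event format matches the reported operating system type of the external device, per the final wherein clause of claim 1.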

Description

TECHNICAL FIELD

The present disclosure relates to interactive projection systems, and in particular to an interactive projection input and output device.

BACKGROUND

An interactive projection system, which combines mixed virtual reality technology with motion capture technology, is a further development of virtual reality technology; its core is an intelligent recognition and control system based on infrared image acquisition and infrared laser positioning. Specifically, real-time video data is captured by an infrared camera and sent to an image processing unit, where dynamic background modeling is performed on the video data stream by a specific algorithm to separate a target from the background, determine position information of the target, and convert that position information into control signals. Based on these control signals, computers and other intelligent multimedia systems can be controlled. For example, using embedded laser projection technology, a touch projection device can project a virtual keyboard; the virtual projection device is connected to a PC terminal on which corresponding software is installed, so that the PC terminal can be operated through the virtual keyboard, replacing the traditional keyboard. However, in the related art, a virtual keyboard or virtual mouse cannot transmit absolute coordinates; that is, a virtual touch screen cannot be implemented. Further, the traditional touch projection device has its own CPU and operating system and is thus not a peripheral device in the strict sense. When the touch projection device controls another external intelligent device, the other device must have a dedicated APP or software installed. Therefore, the traditional touch projection device cannot achieve general coordinated control across systems and devices.
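The background-modeling step described above can be illustrated with a deliberately simplified frame-differencing sketch. The names are hypothetical and this is not the patent's method; real interactive projection systems use adaptive background models rather than a single static reference frame:

```python
def segment_target(background, frame, threshold=30):
    """Separate a moving target from a static background by per-pixel
    differencing (a simple stand-in for the dynamic background modeling
    used in infrared-camera interactive projection systems)."""
    # Foreground mask: True where the frame differs enough from the background.
    mask = [[abs(f - b) > threshold for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]
    # Centroid of the foreground pixels serves as the target position
    # that would be converted into control signals.
    pts = [(x, y) for y, row in enumerate(mask)
           for x, on in enumerate(row) if on]
    if not pts:
        return mask, None
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return mask, (cx, cy)
```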
US2018011586A1 discloses a multi-touch display panel that includes: a display panel configured to display an image according to image data; a multi-touch panel arranged over the display panel and configured to generate touch data; and a communication module configured to communicate with a remote device. The remote device includes a display panel and a touch screen, and the communication module is further configured to receive the image data from the remote device and to provide the touch data to the remote device. US2016328021A1 discloses a glass-type (eye-glass-type) terminal capable of selecting and executing a wider variety of functions even without a separate input device. US2016363774A1 discloses a display control method.

SUMMARY

In order to overcome the problems existing in the related art, the present invention provides an interactive projection input and output device. The interactive projection input and output device includes: a projecting module, configured to display a screen mirroring content from an external device in a projection region of a projection picture corresponding to the external device; an operation capturing module, configured to: form an operation capture scope corresponding to the projection region; establish a first coordinate system within the operation capture scope; capture a user operation behavior occurring within the operation capture scope; and determine a first coordinate of a user operation point corresponding to the user operation behavior in the first coordinate system; an instruction generating module, configured to: obtain a relative position relationship between the user operation point and the projection region according to the first coordinate of the user operation point and regional position information of the projection region in the first coordinate system; and generate a position information instruction according to the relative position relationship; and a wireless transmission module, configured
to: receive screen mirroring information from the external device through a downlink wireless connection, wherein the screen mirroring information comprises the screen mirroring content; and send the position information instruction to the external device through an uplink wireless connection, such that an operating system of the external device calls an underlying driver program to execute the position information instruction, so as to achieve a control operation in response to the user operation behavior and thus operate the external device; the screen mirroring information further comprises an operating system type of the external device, and at least one of picture vertex coordinates or a screen resolution of the external device; wherein the picture vertex coordinates or the screen resolution is used to instruct the instruction generating module to: generate a second coordinate of a virtual operation point to be displayed by the external device according to the picture vertex coordinates or the screen resolution, and the relative position relationship, such that the relative position relationship is satisfied between the virtual operation point and a screen of the external device.