CN-122018700-A - Single-person first-view-angle interaction system and method based on photoconductive film

CN 122018700 A

Abstract

The invention discloses a single-person first-person-view interaction system and method based on a light guide film. The system modulates high-frequency light through a micro-nano grating array to form a reflected light pattern that can be collected after corneal reflection; an image sensor collects the pattern at 120 Hz or more and resolves the head trajectory with an accuracy of 0.05 pixel or less; the first-person-view picture and the trajectory are synchronously encoded into a space-time capsule; and the playback end adjusts the viewing angle of the displayed picture in real time based on the head motion trajectory. The grating array is aligned with the screen pixels by photolithography and is the hardware necessary for realizing high-precision, low-latency interaction under mobile-device constraints. The system can be used in scenarios such as medical teaching, family memories, vlogs, and remote interaction.

Inventors

  • CHANG LE

Assignees

  • 常乐

Dates

Publication Date
2026-05-12
Application Date
2026-03-19

Claims (10)

  1. A single-person first-person-view interaction system based on a light guide film, characterized by comprising a light guide film, an image sensor, a head trajectory calculation unit, a first-person-view data acquisition unit, a space-time capsule generation unit, and a first-person capsule view conversion unit, wherein: the light guide film is attached to the surface of a screen and comprises a micro-nano grating array, which modulates light signals to form a reflected light pattern that can be acquired by the image sensor, the phase change of the reflected light pattern corresponding to head motion displacement; an edge light source is integrated at the edge of the light guide film and emits high-frequency modulated light at a frequency ≥ 1 MHz; the image sensor is arranged on the acquisition-end device and acquires the reflected light pattern reflected by the user's cornea at a sampling frequency ≥ 120 Hz; the head trajectory calculation unit calculates the collector's head motion trajectory from the phase change of the reflected light pattern acquired by the image sensor, with a calculation accuracy ≤ 0.05 pixel; the first-person-view data acquisition unit records the collector's first-person-view picture in real time through a camera; the space-time capsule generation unit synchronously encodes the first-person-view picture and the head trajectory data at the millisecond level to generate first-person capsule view data; and the first-person capsule view conversion unit is arranged on the playback-end device, which receives and parses the capsule view data.
  2. The system of claim 1, wherein the micro-nano grating array of the light guide film has a period of 200-500 nm and is aligned with the screen pixels in hardware by a photolithographic process combined with reticle alignment, with an alignment error < 0.01 pixel.
  3. The system of claim 1, wherein the head trajectory calculation unit calculates the head movement displacement Δx by analyzing the phase change ΔΦ of the reflected light pattern, combined with the system calibration factor K (in mm/radian), according to the formula Δx = K × ΔΦ.
  4. The system of claim 1, wherein the edge light source modulation frequency is ≥ 1 MHz and the image sensor sampling frequency is ≥ 120 Hz.
  5. The system of claim 1, further comprising a gaze direction calibration unit for calculating the collector's gaze direction from the corneal reflection characteristics in the reflected light pattern collected by the image sensor, combined with the optical modulation characteristics of the micro-nano grating array.
  6. The system of claim 1, further comprising an AI profile generation unit for conducting a real-time conversation with the user, based on an AI profile model generated by offline training on the collector's data.
  7. The system of claim 1, further comprising an emotion feedback unit for synchronously outputting emotional state information according to the collector's heartbeat and respiration data, voice tone characteristics, or the 10-20 Hz high-frequency micro-motion components (extracted by band-pass filtering) in the reflected light pattern.
  8. A single-person first-person-view interaction method based on a light guide film, applied to the system of any one of claims 1-7, characterized by comprising the steps of: S1, the edge light source emits high-frequency modulated light at ≥ 1 MHz, which the micro-nano grating array of the light guide film modulates into a reflected light pattern; S2, the image sensor collects the reflected light pattern reflected by the user's cornea at a sampling frequency ≥ 120 Hz; S3, the head motion trajectory is resolved from the phase change of the reflected light pattern, with an accuracy ≤ 0.05 pixel; S4, a camera records the collector's first-person-view picture in real time; S5, the first-person-view picture and the head trajectory data are encoded into a space-time capsule with millisecond synchronization via a unified timestamp; S6, the playback end parses the data and adjusts the viewing angle of the displayed picture in real time based on the head motion trajectory.
  9. The method of claim 8, wherein step S3 solves for the displacement by the formula Δx = K × ΔΦ, ΔΦ being the phase change and K being the factory calibration factor in mm/radian.
  10. A computer-readable storage medium storing a computer program which, when executed by a processor, performs the method of any one of claims 8-9.
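For illustration only, the displacement formula of claims 3 and 9 and the timestamped encoding of step S5 can be sketched in Python. The function and class names, the calibration value, and the record layout are hypothetical: the patent specifies the formula Δx = K × ΔΦ and millisecond-level synchronization, but no concrete capsule wire format.

```python
# Illustrative sketch of claims 3/9 and step S5 of claim 8. All names and the
# calibration value are made up for this example; the patent does not define
# a capsule data format.
from dataclasses import dataclass
import json

K_MM_PER_RAD = 0.8  # hypothetical factory calibration factor K (mm/radian)

def head_displacement_mm(delta_phi_rad: float) -> float:
    """Claims 3/9: convert a reflected-light phase change to displacement,
    delta_x = K * delta_phi."""
    return K_MM_PER_RAD * delta_phi_rad

@dataclass
class CapsuleSample:
    """One space-time-capsule record: a video frame reference and the head
    trajectory sample sharing its millisecond timestamp (step S5)."""
    timestamp_ms: int
    frame_id: int
    head_dx_mm: float

sample = CapsuleSample(timestamp_ms=1_000, frame_id=42,
                       head_dx_mm=head_displacement_mm(0.25))
print(json.dumps(sample.__dict__))
```

A real encoder would attach such a record to every captured frame so the playback end can re-project the picture from the synchronized head trajectory.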

Description

Single-person first-person-view interaction system and method based on a light guide film

Citation of Related Applications

This application incorporates by reference the entire technical content (specification, claims, and abstract) of the following prior applications filed by the applicant on the same day or earlier: patent application No. 2026103269949, filed 2026-03-17, entitled "Control method and control system of an unlicensed eyeball-tracking touch terminal based on a light guide interaction layer"; and patent application No. 2026102814971, filed 2026-03-10, entitled "Unlicensed AI interaction implementation method based on a light guide toughened film".

Technical Field

The invention relates to the technical fields of human-machine interaction, virtual reality, and artificial intelligence, and in particular to a single-person first-person-view interaction system and method based on a light guide film, suitable for smart devices with touch screens such as smartphones, tablets, and computers.

Background

Existing short-video platforms only let users passively watch content shot by a third party: they cannot change the viewing angle based on the user's head movement, cannot support real-time conversation with the people in the video, and cannot convey the photographer's current emotional state. Although existing VR/AR technology delivers immersive experiences, most content is manually modeled, first-person recordings of real people are lacking, and special hardware is required. In the prior art, light guide films are used only for screen display enhancement or physical touch control, not for head motion trajectory acquisition and view transformation.

Disclosure of Invention

1. Technical Problem
The invention aims to achieve, without adding a dedicated sensor: high-precision acquisition of the head motion trajectory; real-time view conversion of the playback-end picture based on that trajectory, with the light guide film as the hardware structure necessary to realize these functions; compatible operation under different hardware configurations; and low latency and high precision of the view conversion.

2. Technical Solution

2.1 Hardware structure of the light guide film

The light guide film is a multilayer composite optical structure comprising a micro-nano grating array, a light guide layer, an optical waveguide layer, and an upper MgF2 anti-reflection coating. The micro-nano grating array has a period of 200-500 nm: this range achieves the optimal signal-to-noise ratio under the power-consumption constraints of a mobile device, since periods below 200 nm are easily disturbed by ambient light, while above 500 nm the phase-resolving precision is insufficient. The array is aligned with the screen pixels by a photolithographic process combined with a reticle, with an alignment error < 0.01 pixel, and modulates the incident light so that micro-movements of the head are converted into phase changes of the reflected light. Edge micro-LED light sources are integrated around the film body and emit high-frequency modulated light at ≥ 1 MHz; the optical waveguide layer transmits this light uniformly across the whole film surface. A temperature compensation layer has a thermal expansion coefficient matched to the screen glass, keeping the grating's optical characteristics stable from -10 °C to 60 °C, and the upper MgF2 anti-reflection coating reduces ambient-light interference.
2.2 Principle of head motion trajectory acquisition

The high-frequency modulated light emitted by the edge micro-LED light sources is transmitted uniformly across the light guide film surface through the light guide layer and, after modulation by the micro-nano grating array, forms a reflected light pattern with a specific spatial frequency. Here the reflected light pattern refers to the optical signal reflected by the user's cornea, which is acquired in real time by an image sensor (e.g. the front camera) at a sampling frequency ≥ 120 Hz. When the user's head rotates, the position of the eyeball relative to the screen changes, so the reflected light pattern acquired by the image sensor undergoes a phase change, which is extracted by applying a Fourier transform and phase calculation to the pattern. The head trajectory calculation unit then combines the system calibration coefficient K and calculates the head movement displacement Δx according to the formula Δx = K × ΔΦ. K is a s
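The phase-extraction step of section 2.2 can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation: the 1-D fringe signal, its spatial frequency, the frame size, and the calibration value K are all assumptions made for the example. Because the grating imposes a known spatial frequency on the pattern, the phase shift between two frames can be read off the FFT bin at that frequency.

```python
# Minimal sketch (illustrative assumptions, not the patent's implementation)
# of section 2.2: Fourier transform + phase calculation on a periodic
# reflected-light pattern, then delta_x = K * delta_phi (claim 3).
import numpy as np

def pattern_phase(frame: np.ndarray, cycles: int) -> float:
    """Phase (radians) of the fringe component with `cycles` periods per frame."""
    spectrum = np.fft.rfft(frame)
    return float(np.angle(spectrum[cycles]))

n, cycles = 256, 8                                     # samples, fringe periods
x = np.arange(n)
frame_a = np.cos(2 * np.pi * cycles * x / n)           # reference frame
frame_b = np.cos(2 * np.pi * cycles * x / n + 0.3)     # fringe shifted by 0.3 rad

delta_phi = pattern_phase(frame_b, cycles) - pattern_phase(frame_a, cycles)
k_mm_per_rad = 0.8                                     # hypothetical factor K
delta_x = k_mm_per_rad * delta_phi                     # delta_x = K * delta_phi
print(round(delta_phi, 3))                             # → 0.3
```

A production pipeline would run this per frame at the ≥ 120 Hz sampling rate on 2-D sensor images, but the bin-phase comparison is the same idea.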