CN-121971844-A - Golf intelligent glasses system and information processing method thereof
Abstract
The invention discloses a golf intelligent glasses system comprising a sensor assembly, a processing module, and a display assembly. The sensor assembly comprises at least a camera, an Inertial Measurement Unit (IMU), a millimeter-wave radar, and a microphone array. The processing module is communicatively connected to the sensor assembly and performs environment sensing, interaction recognition, and golf motion analysis; it comprises a multi-modal depth estimation unit that fuses visual and inertial data to generate environmental depth information, and a golf AI analysis unit that generates a shot strategy. The display assembly is communicatively connected to the processing module and comprises an optical waveguide lens and a drive unit, the drive unit being configured to drive the optical waveguide lens to display Augmented Reality (AR) guidance information to the wearer and dynamic visual effects to the external environment. The invention realizes laser-free environment perception and personalized shot-strategy generation, improves training efficiency and user experience through AR guidance, double-sided display, and natural interaction, and expands the professional applications of intelligent wearable devices.
Inventors
- SONG ZIQI
- SONG SHUQI
- SONG ZIJIAN
Assignees
- 北京量子厚德医疗科技有限公司
Dates
- Publication Date: 2026-05-05
- Application Date: 2025-12-17
Claims (10)
- 1. A golf smart eyeglass system, comprising: a sensor assembly comprising at least a camera, an inertial measurement unit (IMU), a millimeter-wave radar, and a microphone array; a processing module communicatively connected to the sensor assembly and configured to perform environment sensing, interaction recognition, and golf motion analysis, the processing module comprising a multi-modal depth estimation unit for fusing visual and inertial data to generate environmental depth information and a golf AI analysis unit for generating a shot strategy; and a display assembly communicatively coupled to the processing module, comprising an optical waveguide lens and a drive unit configured to drive the optical waveguide lens to display augmented reality (AR) guidance information to a wearer and to display dynamic visual effects to the external environment.
- 2. The golf smart eyewear system of claim 1, wherein the multi-modal depth estimation unit is configured to obtain a fused depth value by adaptively weighted fusion of a first depth value from a monocular depth estimation network, a second depth value from a simultaneous localization and mapping (SLAM) system, and a third depth value from inertial measurement unit (IMU) motion prediction.
- 3. The golf smart eyewear system of claim 2, wherein the adaptive weighted fusion is calculated as: D_fused = α·D_mono + β·D_slam + γ·D_imu; wherein D_fused is the fused depth value, D_mono is the first depth value, D_slam is the second depth value, D_imu is the third depth value, and α, β, γ are real-time dynamic weights corresponding to D_mono, D_slam, and D_imu, respectively.
- 4. The golf smart eyeglass system of claim 1, wherein the golf AI analysis unit is configured to predict a shot landing point based on the environmental depth information and to generate the shot strategy, the landing-point prediction being calculated as: P_pred = P_ball + (v0²·sin(2θ0)/g)·ê_target + Δ_wind + Δ_slope + Δ_user; wherein P_pred is the predicted landing-point coordinate, P_ball is the golf ball coordinate, v0 is the initial shot speed, θ0 is the launch angle, g is the gravitational acceleration, ê_target is the target-direction unit vector, Δ_wind is the wind correction, Δ_slope is the slope correction, and Δ_user is the personalized deviation correction based on the user's historical data.
- 5. The golf smart eyewear system of claim 1, wherein the processing module further comprises an interaction analysis unit; the interaction analysis unit is configured to identify nodding actions based on data from the inertial measurement unit (IMU) and to trigger corresponding control instructions based on the identified number of consecutive nods.
- 6. The golf smart eyewear system of claim 5, wherein the interaction analysis unit is further configured to identify a preset gesture based on data from the millimeter-wave radar and/or a preset voiceprint event based on data from the microphone array.
- 7. The golf smart eyewear system of claim 1, wherein the display assembly further comprises a virtual filter processing unit; the virtual filter processing unit is configured to generate a dynamic color-change effect on the AR guidance information and/or the dynamic visual effects by superimposing a programmable virtual filter layer and adjusting its parameters.
- 8. The golf smart eyewear system of claim 1, wherein the processing module further comprises a mode management unit; the mode management unit is configured to switch to a golf specialty mode upon detecting a golf scene feature; in the golf specialty mode, the multi-modal depth estimation unit and the golf AI analysis unit are activated and the display assembly is configured to display the AR guidance information related to golf assistance.
- 9. The golf smart eyewear system of claim 8, wherein the mode management unit is further configured to control the display assembly to adjust the transparency of the AR guidance information and to initiate a specific dynamic visual effect when, in the golf specialty mode, the wearer is detected to be in a social gaze scene.
- 10. An information processing method applied to the smart glasses system as claimed in any one of claims 1 to 9, comprising the steps of: collecting data by the sensor assembly; generating environmental depth information through multi-modal data fusion based on the collected data; performing golf motion analysis based on the environmental depth information to generate a shot strategy; displaying, through the display assembly, augmented reality (AR) guidance information based on the shot strategy to the wearer and displaying dynamic visual effects to the external environment; and identifying the user's interaction intention based on analysis of the data and executing corresponding control.
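The nod-based interaction of claims 5 and 6 can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the pitch-rate threshold, the maximum gap between nods, the synthetic IMU signal, and the command mapping are all assumptions introduced for illustration.

```python
import numpy as np

def detect_nods(pitch_rate, t, rate_thresh=1.0, max_gap=0.8):
    """Count consecutive nods from IMU pitch angular-rate samples.

    A nod is taken to be a downward pitch spike whose rate exceeds
    `rate_thresh` (rad/s); nods separated by more than `max_gap`
    seconds start a new sequence, so only the last run is counted.
    """
    above = pitch_rate > rate_thresh
    # rising edges of the thresholded signal mark nod onsets
    onsets = t[1:][above[1:] & ~above[:-1]]
    if len(onsets) == 0:
        return 0
    count = 1
    for prev, cur in zip(onsets[:-1], onsets[1:]):
        count = count + 1 if cur - prev <= max_gap else 1
    return count

# hypothetical mapping from nod count to a control instruction
COMMANDS = {1: "confirm", 2: "next_hole", 3: "toggle_ar"}

# synthetic signal: two nods 0.5 s apart within a 3 s window
t = np.linspace(0, 3, 300)
rate = np.zeros_like(t)
rate[(t > 0.5) & (t < 0.6)] = 2.0   # first nod
rate[(t > 1.0) & (t < 1.1)] = 2.0   # second nod within the gap window
n = detect_nods(rate, t)
print(n, COMMANDS.get(n, "none"))   # prints "2 next_hole"
```

A real device would run this incrementally over a sliding window of IMU samples rather than on a complete recording.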
Description
Technical Field

The invention relates to the technical field of intelligent wearable devices and augmented reality display, and in particular to a golf intelligent glasses system and an information processing method thereof.

Background

Mainstream golf aids currently on the market rely mainly on single-function devices, such as sensors that measure swing biomechanics only, or independently operated laser rangefinders. Because these functions are split across devices, no real-time, comprehensive strategy suggestion can be provided for the actual shot scene (ball position, target, slope, and so on), and the user must switch between different tools, making operation cumbersome and the experience discontinuous. While consumer-grade Augmented Reality (AR) smart glasses have been applied in some areas, they lack dedicated algorithms, AR content, and interaction logic optimized for golf. In a golf scene, existing AR glasses suffer from missing professional analysis functions, insufficient registration accuracy between visual guidance and the real environment, and an inability to effectively fuse environment perception with motion analysis.

In addition, existing devices are limited in interaction and visual experience. Physical keys or touchpads are inconvenient to operate during play, and the visual appearance of ordinary or smart-glasses lenses is typically fixed, so it cannot be dynamically adjusted or personalized according to the sport scene, ambient light, or user preference. A comprehensive solution integrating professional analysis, real-time guidance, natural interaction, and adaptive display is therefore needed.
Disclosure of Invention

In view of the above technical problems in the related art, the present invention provides a golf intelligent glasses system and an information processing method thereof that overcome the above shortcomings of the prior art. To achieve this technical purpose, the technical scheme of the invention is realized as follows. A golf smart eyeglass system comprises: a sensor assembly comprising at least a camera, an inertial measurement unit (IMU), a millimeter-wave radar, and a microphone array; a processing module communicatively connected to the sensor assembly and configured to perform environment sensing, interaction recognition, and golf motion analysis, the processing module comprising a multi-modal depth estimation unit for fusing visual and inertial data to generate environmental depth information and a golf AI analysis unit for generating a shot strategy; and a display assembly communicatively coupled to the processing module, comprising an optical waveguide lens and a drive unit configured to drive the optical waveguide lens to display augmented reality (AR) guidance information to a wearer and to display dynamic visual effects to the external environment.

Further, the multi-modal depth estimation unit is configured to obtain a fused depth value by adaptively weighted fusion of a first depth value from a monocular depth estimation network, a second depth value from a simultaneous localization and mapping (SLAM) system, and a third depth value from IMU motion prediction. The adaptive weighted fusion is calculated as:

D_fused = α·D_mono + β·D_slam + γ·D_imu

wherein D_fused is the fused depth value, D_mono is the first depth value, D_slam is the second depth value, D_imu is the third depth value, and α, β, γ are real-time dynamic weights corresponding to D_mono, D_slam, and D_imu, respectively.
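The adaptively weighted fusion described above can be sketched minimally as follows. The derivation of the weights from per-source confidence scores, normalized to sum to one, is an assumption for illustration; the text states only that α, β, γ are real-time dynamic weights.

```python
import numpy as np

def fuse_depth(d_mono, d_slam, d_imu, c_mono, c_slam, c_imu):
    """Adaptively weighted fusion of three depth estimates.

    The weights alpha, beta, gamma are derived here from assumed
    per-source confidence scores and normalized to sum to one.
    """
    conf = np.array([c_mono, c_slam, c_imu], dtype=float)
    alpha, beta, gamma = conf / conf.sum()
    return alpha * d_mono + beta * d_slam + gamma * d_imu

# e.g. SLAM is confident (textured scene), IMU prediction is drifting
print(fuse_depth(5.2, 5.0, 5.6, c_mono=0.3, c_slam=0.6, c_imu=0.1))  # ≈ 5.12
```

In practice the confidence scores would themselves be updated per frame, e.g. from SLAM tracking quality and IMU drift estimates, which is what makes the weights "real-time dynamic".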
Further, the golf AI analysis unit is configured to predict a shot landing point based on the environmental depth information and to generate the shot strategy, the landing-point prediction being calculated as:

P_pred = P_ball + (v0²·sin(2θ0)/g)·ê_target + Δ_wind + Δ_slope + Δ_user

wherein P_pred is the predicted landing-point coordinate, P_ball is the golf ball coordinate, v0 is the initial shot speed, θ0 is the launch angle, g is the gravitational acceleration, ê_target is the target-direction unit vector, Δ_wind is the wind correction, Δ_slope is the slope correction, and Δ_user is the personalized deviation correction based on the user's historical data.

Further, the processing module further comprises an interaction analysis unit; the interaction analysis unit is configured to identify nodding actions based on data from the inertial measurement unit (IMU) and to trigger corresponding control instructions based on the identified number of consecutive nods. Further, the interaction analysis unit is further configured to identify a preset gesture based on data from the millimeter-wave radar and/or to identify a preset voiceprint event
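On flat ground, with the corrections added as vectors, the landing-point prediction above reduces to the standard projectile-range formula. A minimal sketch follows; the function name, the use of 2-D ground coordinates, and the default g = 9.81 m/s² are assumptions, and the correction terms are passed in precomputed rather than modeled.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def predict_landing(p_ball, v0, theta0, target_dir,
                    d_wind=None, d_slope=None, d_user=None):
    """Predict the shot landing point:
    P_pred = P_ball + (v0^2 * sin(2*theta0) / g) * e_target
             + d_wind + d_slope + d_user
    """
    e = np.asarray(target_dir, dtype=float)
    e = e / np.linalg.norm(e)            # ensure a unit vector
    carry = (v0 ** 2) * np.sin(2 * theta0) / G  # projectile range
    p = np.asarray(p_ball, dtype=float) + carry * e
    for d in (d_wind, d_slope, d_user):  # additive corrections
        if d is not None:
            p = p + np.asarray(d, dtype=float)
    return p

# 50 m/s at a 30-degree launch angle toward +x: carry ≈ 220.7 m
print(predict_landing([0.0, 0.0], 50.0, np.deg2rad(30), [1.0, 0.0]))
```

The Δ_user term would come from a per-user model fitted to historical shot deviations, which the sketch simply accepts as an input vector.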