
CN-122027754-A - Studio virtual and real picture synchronous tracking and fusion system

CN 122027754 A

Abstract

The invention relates to the technical field of broadcast television and virtual-real fusion, and particularly discloses a studio virtual-real picture synchronous tracking and fusion system comprising a multi-sensor fusion tracking module, a full-IP synchronization control module, a virtual-real fusion rendering module and a multi-program shared scheduling platform. The multi-sensor fusion tracking module fuses data from multiple sensors and outputs high-precision six-degree-of-freedom pose information; the full-IP synchronization control module keeps synchronization errors within 10 ms based on PTP synchronization and dynamic delay compensation; the virtual-real fusion rendering module combines AR/MR rendering, digital twin and artificial intelligence to achieve seamless fusion of virtual and real pictures; and the multi-program shared scheduling platform realizes resource virtualization and flexible multiplexing. The invention overcomes the tracking defects of a single sensor, improves virtual-real synchronization precision and visual expressiveness, reduces operation and maintenance costs, raises the intelligence level of news production, and is suitable for scenes such as live broadcasting in a studio.

Inventors

  • WANG ZHENGXIN
  • DAI HONGBIN
  • ZHANG LIYING

Assignees

  • Ningbo Radio and Television Group (宁波广播电视集团)

Dates

Publication Date
2026-05-12
Application Date
2026-03-28

Claims (9)

  1. A studio virtual-real picture synchronous tracking and fusion system, characterized in that it comprises: a multi-sensor fusion tracking module for acquiring multiple kinds of sensing data of the camera and the target object in the studio and outputting six-degree-of-freedom pose information of the camera and the target object through a multi-source data fusion algorithm; a full-IP synchronization control module for providing a unified time reference and realizing synchronous control of tracking data and video signals based on a dynamic delay compensation mechanism; a virtual-real fusion rendering module for generating a virtual scene in real time according to the pose information and fusing and outputting the virtual scene with the camera's live picture; and a multi-program shared scheduling platform for virtualizing and dynamically scheduling the studio hardware resources, supporting multiple programs sharing the same system resources.
  2. The studio virtual-real picture synchronous tracking and fusion system according to claim 1, wherein the multi-sensor fusion tracking module comprises: a photoelectric sensor for capturing infrared reflective marker points on the camera and the target object; an infrared sensor for providing auxiliary positioning information under low-light conditions; an image recognition sensor for extracting the contour and feature points of the target object through a visual recognition algorithm; and a point cloud sensor for constructing a three-dimensional point cloud model of the studio's interior space and providing spatial geometric constraints.
  3. The studio virtual-real picture synchronous tracking and fusion system according to claim 2, wherein the multi-sensor fusion tracking module fuses the data of the photoelectric sensor, the infrared sensor, the image recognition sensor and the point cloud sensor using a Kalman filtering algorithm, and automatically switches to a tracking mode based on fusion of the point cloud sensor and the image recognition sensor when the photoelectric sensor signal is occluded.
  4. The studio virtual-real picture synchronous tracking and fusion system according to claim 3, wherein the full-IP synchronization control module comprises: a PTP high-precision time synchronization unit, built on the IEEE 1588 protocol, for providing a unified time reference; a delay compensation unit for measuring in real time the link delay of each stage of tracking data acquisition, transmission, rendering and video capture, and for dynamically adjusting the virtual-scene rendering time to achieve virtual-real picture synchronization; and an all-IP signal scheduling unit that uniformly encapsulates video, audio and tracking data into IP streams according to the SMPTE ST 2110 standard, realizing flexible routing and resource-pooled scheduling of signals.
  5. The studio virtual-real picture synchronous tracking and fusion system according to claim 4, wherein the delay compensation unit adopts a unified timestamp synchronization and dynamic delay compensation mechanism to construct a closed-loop compensation model, so that the synchronization error between the virtual picture and the real picture is controlled within 10 ms.
  6. The studio virtual-real picture synchronous tracking and fusion system according to claim 5, wherein the virtual-real fusion rendering module comprises: an AR/MR real-time rendering engine for generating in real time a virtual scene matched to the camera's viewing angle according to the pose information; a digital twin space expansion unit for constructing a digital twin model of the studio's interior space and of external landmark spaces, and for realizing projection mapping and visual fusion between the live picture and the virtual twin space through a twin matching algorithm; and an artificial intelligence fusion unit for locally deploying a large language model and integrating voice and video enhancement interfaces to form an integrated artificial intelligence workflow.
  7. The studio virtual-real picture synchronous tracking and fusion system according to claim 6, wherein the digital twin space expansion unit dynamically calculates the projection mapping relation between the real picture and the virtual twin space according to the camera's real-time pose and lens parameters, and achieves accurate matching of the real scene and the digital twin space through superimposed control of the naked-eye view and the visual focus of an ultra-high-definition camera.
  8. The studio virtual-real picture synchronous tracking and fusion system according to claim 7, wherein the large language model locally deployed by the artificial intelligence fusion unit comprises one or more of DeepSeek, Doubao, Kimi and Gemini Flash 2.0, and integrates the NVIDIA Broadcast software API for voice enhancement, video-call quality improvement, intelligent auditing and intelligent output of small-screen content during live broadcast.
  9. The studio virtual-real picture synchronous tracking and fusion system according to claim 8, wherein the multi-program shared scheduling platform comprises: a resource virtualization unit for abstracting the cameras, rendering servers and tracking equipment of the studio into a virtual resource pool; a program arrangement unit for supporting dynamic application for studio resources by multiple news programs in a time-slice or event-driven manner, realizing rapid switching and multiplexing of the same studio system among different programs; and an operation assurance unit for guaranteeing system stability under multi-program, high-frequency use through system-level redundancy design and real-time monitoring.
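The projection mapping in claims 6 and 7 amounts to re-projecting virtual-scene points through the camera's tracked pose and lens parameters so that graphics align with the real picture. The following is a minimal, hypothetical pinhole-camera sketch of that step; the yaw-only rotation and all numeric values are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the projection step in claims 6-7: project a
# virtual-scene point into pixel coordinates using the camera's tracked
# pose and lens focal length. Pinhole model; all numbers are assumptions.

import math

def project(point, cam_pos, cam_yaw_deg, focal_px, cx, cy):
    """Project a 3-D world point to pixels for a camera rotated about Y."""
    # World -> camera coordinates: translate, then rotate by -yaw.
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    a = math.radians(-cam_yaw_deg)
    xc = x * math.cos(a) + z * math.sin(a)
    zc = -x * math.sin(a) + z * math.cos(a)
    if zc <= 0:
        return None                      # point is behind the camera
    u = cx + focal_px * xc / zc          # pinhole projection
    v = cy - focal_px * y / zc
    return u, v

# A point straight ahead of an un-rotated camera lands at the image centre.
print(project((0, 0, 5), (0, 0, 0), 0, 1500, 960, 540))  # (960.0, 540.0)
```

In a real system the tracking module would supply the full 6-DoF pose (rotation matrix plus translation) and the lens unit would supply calibrated intrinsics including distortion, but the mapping from pose and lens parameters to pixels follows the same pattern.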

Description

Studio virtual and real picture synchronous tracking and fusion system

Technical Field

The invention relates to the technical field of broadcast television and virtual-real fusion, and in particular to a studio virtual-real picture synchronous tracking and fusion system.

Background

With the development of media technology, news studios place ever higher requirements on the flexibility, real-time performance and visual expressiveness of program production. Virtual-real fusion technology, especially Augmented Reality (AR) and Mixed Reality (MR), has become an important means of improving the visual presentation of news programs. In the prior art, the traditional virtual studio system based on the SDI architecture relies on a single sensor to track the positions of the camera and the presenter. In practical application this scheme has the following technical problems. Insufficient tracking precision: a single sensor is easily disturbed by factors such as changes in on-site lighting and occlusion by personnel, so that tracking signals are lost or precision degrades, impairing the alignment of virtual and real pictures. Large synchronization error: the lack of a unified time reference between virtual scene rendering and real camera shooting causes misalignment, jitter or delay of virtual and real pictures during dynamic shooting, a problem especially prominent in multi-camera switching or fast-moving scenes. Poor system expansibility: the point-to-point connection mode of SDI makes system resource scheduling rigid, the same studio system can hardly serve multiple programs, equipment utilization is low, and operation and maintenance costs are high.
Moreover, existing systems are mostly limited to simple virtual background replacement; they cannot achieve deep fusion of the real studio scene with external digital spaces and lack an immersive visual experience. Therefore, how to construct a studio system with high precision, low delay, high expansibility and deep virtual-real fusion capability has become a technical problem to be solved in the field.

Disclosure of Invention

The technical problem the invention aims to solve is to provide a studio virtual-real picture synchronous tracking and fusion system that realizes high-precision pose tracking of the camera and the target object, achieves precise synchronous fusion of virtual and real pictures, supports multiple programs sharing system resources, and improves studio production efficiency and flexibility. To solve this technical problem, the technical scheme provided by the invention is a studio virtual-real picture synchronous tracking and fusion system comprising a multi-sensor fusion tracking module, a full-IP synchronization control module, a virtual-real fusion rendering module and a multi-program shared scheduling platform.
The multi-sensor fusion tracking module comprises: a photoelectric sensor, arranged at the top of the studio, for capturing infrared reflective marker points on the camera and the presenter; an infrared sensor for providing auxiliary positioning information under low-light conditions; an image recognition sensor, deployed around the camera and the presenter, which extracts the contour and feature points of the target object in real time through a visual recognition algorithm; and a point cloud sensor, deployed at key positions of the studio, for constructing a three-dimensional point cloud model of the studio's interior space and providing spatial geometric constraints. The multi-sensor fusion tracking module fuses the data of the photoelectric sensor, the infrared sensor, the image recognition sensor and the point cloud sensor through a Kalman filtering algorithm and outputs six-degree-of-freedom pose information of the camera and the presenter. When the photoelectric or infrared signals are occluded, the system automatically switches to a tracking mode based on fusion of the point cloud and image recognition sensors, ensuring tracking continuity and positioning accuracy. The positioning precision of the camera and the target object reaches +/-1 mm.
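As a rough illustration of the fusion idea described above (a scalar caricature, not the patent's actual 6-DoF algorithm), a single Kalman update can blend a precise optical reading with a coarser infrared reading, with a fallback when the optical path is occluded. The sensor roles and noise variances below are assumptions chosen for illustration:

```python
# Minimal 1-D Kalman-style fusion sketch (illustrative only; the patent
# fuses full 6-DoF poses from four sensor types). Noise variances are
# assumed values, not taken from the patent.

def fuse(estimate, est_var, measurement, meas_var):
    """One Kalman update: blend a prior estimate with a new measurement."""
    k = est_var / (est_var + meas_var)            # Kalman gain
    new_est = estimate + k * (measurement - estimate)
    new_var = (1.0 - k) * est_var
    return new_est, new_var

def track_position(optical, infrared, occluded=False):
    """Fuse optical and infrared readings; fall back when optical is blocked."""
    est, var = infrared, 4.0                      # start from the coarser sensor
    if not occluded:
        est, var = fuse(est, var, optical, 0.01)  # optical is far more precise
    return est

print(track_position(optical=1.000, infrared=1.050))                 # close to 1.000
print(track_position(optical=1.000, infrared=1.050, occluded=True))  # 1.05
```

The key property the patent relies on is visible even in this toy version: while the precise sensor is available its low noise dominates the fused estimate, and when it is occluded the system degrades gracefully to the remaining sensors instead of losing track.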
The full-IP synchronization control module comprises: a PTP high-precision time synchronization unit, built on the IEEE 1588 protocol, which provides a unified time reference for the cameras, rendering servers, tracking module and broadcast server in the system; and a delay compensation unit which, based on a unified timestamp synchronization and dynamic delay compensation mechanism, measures in real time the link delay of each stage of tracking data acquisition, transmission, rendering and video capture, constructs a closed-loop compensation model, and dynamically adjusts the virtual-scene rendering time to control the synchronization error between the virtual picture and the real picture within 10 ms.
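The closed-loop compensation described here can be reduced to budget arithmetic over PTP-timestamped link delays: sum the measured per-stage latencies, advance the render clock by that amount, and check the residual offset against the 10 ms target. A minimal sketch, with all latency figures assumed for illustration:

```python
# Sketch of the delay-compensation bookkeeping: each link's latency is
# measured against the shared PTP time base, and virtual-scene rendering
# is advanced so it lands on the same presentation instant as the camera
# frame. All latency figures below are illustrative assumptions.

SYNC_BUDGET_MS = 10.0  # target from the patent: error within 10 ms

def render_offset_ms(link_delays_ms):
    """Total pipeline delay the renderer must compensate for."""
    return sum(link_delays_ms.values())

def in_sync(residual_error_ms):
    """True when the remaining virtual/real offset is inside the budget."""
    return abs(residual_error_ms) <= SYNC_BUDGET_MS

delays = {"tracking": 4.2, "transport": 1.8, "render": 8.5, "capture": 6.0}
offset = render_offset_ms(delays)   # total delay to compensate (20.5 ms)
residual = offset - 18.0            # e.g. after compensating 18.0 ms
print(offset, in_sync(residual))    # → 20.5 True
```

The "closed loop" of the patent would re-measure these delays continuously and feed the residual back into the render-time adjustment, so that drift in any link is corrected before it exceeds the budget.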