CN-120455644-B - Naked eye 3D imaging evaluation method and system based on feedback analysis
Abstract
The invention discloses a naked eye 3D imaging evaluation method and system based on feedback analysis, relating to the technical field of 3D display. An exclusive network is established between the cloud and the 3D terminals, and initial display data is stored and converted into naked eye 3D content in real time. The feedback analysis works in two ways: on one hand, an interaction matrix is generated from user interaction behavior and differences in operating characteristics are analyzed; on the other hand, a binocular vision simulation technique uses a moving pixel window to analyze the color histogram distributions of the left and right views and quantify an imaging weight. The system performs feature decomposition on the interaction matrices, establishes content priority groups in combination with the imaging weights, and dynamically adjusts the cloud caching strategy to improve 3D display efficiency. The invention effectively improves user experience and imaging quality, improves the picture interaction efficiency of naked eye 3D display devices, and improves the multi-user display efficiency of the 3D display end.
Inventors
- SONG JIANJUN
- ZHU FEILONG
Assignees
- 深圳市新长铖科技有限公司
Dates
- Publication Date: 2026-05-05
- Application Date: 2025-05-14
Claims (7)
- 1. A naked eye 3D imaging evaluation method based on feedback analysis, characterized by comprising the following steps: S101, establishing an exclusive network connection between a cloud end and a 3D display end, and storing initial display data at the cloud end; S102, a user interacting through the 3D display end, the cloud end converting the initial display data into naked eye 3D content, displaying the naked eye 3D content through the 3D display end, collecting interaction information of the user for each item of initial display data within an interaction period, and generating an interaction matrix from the interaction information; S103, based on the initial display data, simulating, through a camera module and according to the viewing-distance state of the user, the left and right image data of the 3D display end as acquired by the two eyes, moving a preset pixel matrix as a sliding window over the left and right image data, calculating a color-value histogram of the preset pixel matrix at each movement, analyzing the binocular visual difference between the left and right image data from the histogram feedback, and setting an imaging weight based on the visual-difference evaluation; S104, performing feature decomposition on the interaction matrices, evaluating differences between the interaction matrices, classifying the initial display data based on the differences, and setting priority groups for the initial display data by combining the classification result with the imaging weights; S105, performing cache setting on the initial display data at the cloud end according to the priority groups to generate a 3D display cache scheme; wherein S104 specifically comprises: performing feature decomposition on all interaction matrices to obtain eigenvalues and eigenvectors, and calculating the difference between interaction matrices based on the eigenvalues and eigenvectors of each interaction matrix to obtain a matrix difference value for every pair of interaction matrices; grouping all interaction matrices based on the matrix difference values such that the maximum matrix difference value within a group does not exceed a preset threshold, obtaining a plurality of groups of interaction matrices; mapping the groups of interaction matrices onto a classification of the initial display data to obtain a plurality of groups of display data; and calculating the mean imaging weight of the initial display data in each group of display data, and setting the priority of each group based on that mean to obtain the priority groups.
- 2. The naked eye 3D imaging evaluation method based on feedback analysis according to claim 1, wherein S101 specifically comprises: establishing exclusive network connections between the cloud end and a plurality of 3D display ends, acquiring user interaction information in real time through the 3D display ends, transmitting the user interaction information to the cloud end for storage, and storing the initial display data at the cloud end.
- 3. The naked eye 3D imaging evaluation method based on feedback analysis according to claim 1, wherein S102 specifically comprises: setting an interaction period and dividing it into a plurality of time nodes; within the interaction period, identifying through the 3D display end the initial display data involved in the user's interaction and collecting the corresponding interaction information, the interaction information comprising the interaction frequency, the number of browsing displays, and the browsing duration at each time node for the initial display data; and constructing the interaction matrix with the time nodes as the first dimension and the interaction information as the second dimension.
- 4. The naked eye 3D imaging evaluation method based on feedback analysis according to claim 1, wherein S103 specifically comprises: acquiring the interpupillary distance and the viewing distance of the user, arranging two camera devices in the camera module, and capturing image data of the 3D display end to obtain left and right image data; setting a 3×3 pixel matrix as a moving window, sliding the window over the left image data, calculating a color histogram for the pixel matrix at each movement and extracting color features from the histogram, the movement covering the whole of the left image data when complete; forming a left image feature set from the color features extracted at each movement; extracting color features of the right image in the same way to form a right image feature set; selecting the color features extracted at each corresponding movement from the left and right image feature sets and calculating feature differences to obtain a plurality of difference values; and averaging the difference values to obtain a visual difference value, and setting the imaging weight based on the visual difference value.
- 5. The naked eye 3D imaging evaluation method based on feedback analysis according to claim 1, wherein S105 specifically comprises: performing cache setting on the initial display data at the cloud end according to the priority groups to generate a 3D display cache scheme; in the 3D display cache scheme, initial display data of the first priority is converted in real time by the cloud end and stored in a cache list, and initial display data of the second priority is preloaded and set as a real-time queue task.
- 6. A naked eye 3D imaging evaluation system based on feedback analysis, characterized by comprising a memory and a processor, wherein the memory stores a feedback-analysis-based naked eye 3D imaging evaluation program which, when executed by the processor, implements the steps of the naked eye 3D imaging evaluation method based on feedback analysis according to claim 1.
- 7. A computer-readable storage medium storing a feedback-analysis-based naked eye 3D imaging evaluation program which, when executed by a processor, implements the steps of the feedback-analysis-based naked eye 3D imaging evaluation method according to any one of claims 1 to 5.
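The eigendecomposition-based grouping of interaction matrices in claim 1 (step S104) can be sketched as follows. The claims do not fix a concrete difference formula, so the metric below (Euclidean distance of eigenvalue spectra plus a dominant-eigenvector similarity term) and the greedy grouping strategy are illustrative assumptions:

```python
import numpy as np

def matrix_difference(m1, m2):
    """Illustrative difference between two interaction matrices,
    combining eigenvalue and dominant-eigenvector distances."""
    # Symmetrize via M^T M so eigenvalues are real and non-negative
    g1, g2 = m1.T @ m1, m2.T @ m2
    w1, v1 = np.linalg.eigh(g1)
    w2, v2 = np.linalg.eigh(g2)
    eig_dist = np.linalg.norm(w1 - w2)
    # 1 - |cos similarity| of the dominant eigenvectors
    vec_dist = 1.0 - abs(np.dot(v1[:, -1], v2[:, -1]))
    return eig_dist + vec_dist

def group_matrices(matrices, max_diff):
    """Greedy grouping so that the largest pairwise difference
    within any group does not exceed the preset threshold."""
    groups = []
    for idx, m in enumerate(matrices):
        placed = False
        for g in groups:
            if all(matrix_difference(m, matrices[j]) <= max_diff for j in g):
                g.append(idx)
                placed = True
                break
        if not placed:
            groups.append([idx])
    return groups
```

Mapping each resulting group of matrices back to its initial display data then yields the groups of display data from which the priority groups are derived.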
Description
Naked eye 3D imaging evaluation method and system based on feedback analysis

Technical Field

The invention relates to the field of naked eye 3D imaging, and in particular to a naked eye 3D imaging evaluation method and system based on feedback analysis.

Background

Naked eye 3D technology generally uses parallax barriers, lenticular lenses, or other optical elements to separate the left- and right-eye images, presenting a different picture to each eye so as to achieve a 3D effect without glasses. Naked eye 3D display has seen development and application in advertising screens, televisions, digital photo frames, game machines, and the like. However, on naked eye 3D display platforms such as those in exhibition halls, multiple users analyze and view 3D video simultaneously and the volume of naked eye 3D content is large, so excessive latency readily occurs during conversion. The prior art struggles to achieve smooth, synchronized imaging across multiple terminals, giving a poor experience under multi-point user access; it also lacks interactive display and feedback analysis for users, making it difficult to dynamically adjust the transmission and conversion of 3D content.

Disclosure of Invention

The invention overcomes the defects of the prior art and provides a naked eye 3D imaging evaluation method and system based on feedback analysis.
The first aspect of the invention provides a naked eye 3D imaging evaluation method based on feedback analysis, comprising the following steps: S101, establishing an exclusive network connection between a cloud end and a 3D display end, and storing initial display data at the cloud end; S102, a user interacting through the 3D display end, the cloud end converting the initial display data into naked eye 3D content, displaying the naked eye 3D content through the 3D display end, collecting interaction information of the user for each item of initial display data within an interaction period, and generating an interaction matrix from the interaction information; S103, based on the initial display data, simulating, through a camera module and according to the viewing-distance state of the user, the left and right image data of the 3D display end as acquired by the two eyes, moving a preset pixel matrix as a sliding window over the left and right image data, calculating a color-value histogram of the preset pixel matrix at each movement, analyzing the binocular visual difference between the left and right image data from the histogram feedback, and setting an imaging weight based on the visual-difference evaluation; S104, performing feature decomposition on the interaction matrices, evaluating differences between the interaction matrices, classifying the initial display data based on the differences, and setting priority groups for the initial display data by combining the classification result with the imaging weights; S105, performing cache setting on the initial display data at the cloud end according to the priority groups to generate a 3D display cache scheme.
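The priority-group-driven cache setting of S105 can be sketched as follows. The group layout, weight mapping, and the names `cache_list`, `preload_queue`, and `on_demand` are illustrative assumptions; the patent only specifies that first-priority data is converted in real time into a cache list and second-priority data is preloaded as a queue task:

```python
from collections import deque

def build_cache_scheme(groups, weights):
    """groups: lists of content ids per display-data group;
    weights: id -> imaging weight. Rank groups by mean imaging
    weight and derive an illustrative 3D display cache scheme."""
    ranked = sorted(
        groups,
        key=lambda g: sum(weights[i] for i in g) / len(g),
        reverse=True,
    )
    return {
        # first priority: converted in real time, held in the cache list
        "cache_list": list(ranked[0]) if ranked else [],
        # second priority: preloaded as a real-time queue task
        "preload_queue": deque(ranked[1]) if len(ranked) > 1 else deque(),
        # remaining groups: converted only on demand (assumed policy)
        "on_demand": [i for g in ranked[2:] for i in g],
    }
```

For example, three groups with mean imaging weights 0.2, 0.9, and 0.5 would place the 0.9 group in the cache list and the 0.5 group in the preload queue.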
In this scheme, S101 specifically comprises: establishing exclusive network connections between the cloud end and a plurality of 3D display ends, acquiring user interaction information in real time through the 3D display ends, transmitting the user interaction information to the cloud end for storage, and storing the initial display data at the cloud end. In this scheme, S102 specifically comprises: setting an interaction period and dividing it into a plurality of time nodes; within the interaction period, identifying through the 3D display end the initial display data involved in the user's interaction and collecting the corresponding interaction information, the interaction information comprising the interaction frequency, the number of browsing displays, and the browsing duration at each time node for the initial display data; and constructing the interaction matrix with the time nodes as the first dimension and the interaction information as the second dimension. In this scheme, S103 specifically comprises: acquiring the interpupillary distance and the viewing distance of the user, arranging two camera devices in the camera module, and capturing image data of the 3D display end to obtain left and right image data; setting a 3×3 pixel matrix as a moving window, sliding the window over the left image data, calculating a color histogram for the pixel matrix at each movement and extracting color features from the histogram, the movement covering the whole of the left image data when complete; forming a left image feature set from the color features extracted at each movement; extracting color features of the right image to form a right image feature set; selecting the color features extracted at each corresponding movement from the left and right image feature sets to perform feature
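The moving-window histogram comparison of S103 can be sketched as follows. For simplicity the sketch uses single-channel (grayscale) images, an 8-bin histogram, a Euclidean histogram distance, and an exponential weight mapping; all of these parameters are illustrative assumptions, as the patent fixes only the 3×3 window and the averaging of per-window differences:

```python
import numpy as np

def window_histograms(image, win=3, bins=8):
    """Slide a win x win window over a single-channel image and
    collect a normalized color-value histogram per position."""
    h, w = image.shape
    feats = []
    for y in range(h - win + 1):
        for x in range(w - win + 1):
            patch = image[y:y + win, x:x + win]
            hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
            feats.append(hist / hist.sum())
    return np.array(feats)

def visual_difference(left, right, win=3, bins=8):
    """Average per-window histogram distance between the
    left and right view images."""
    fl = window_histograms(left, win, bins)
    fr = window_histograms(right, win, bins)
    return np.linalg.norm(fl - fr, axis=1).mean()

def imaging_weight(vd, scale=1.0):
    """Map a visual-difference value to a weight in [0, 1):
    larger binocular disparity -> larger weight (assumed mapping)."""
    return 1.0 - np.exp(-scale * vd)
```

Identical left and right views yield a visual difference of zero and hence a zero imaging weight; views that differ in many windows yield a larger weight.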