CN-122023622-A - Scene rendering method and device, storage medium and electronic equipment
Abstract
The application discloses a scene rendering method, a scene rendering device, a storage medium and an electronic device. The method comprises: predicting, based on historical movement data of a target user in a target scene, a first probability that the target user moves along a target path and a second probability that the target user interacts with scene objects in the target scene; determining rendering weights of the scene objects included in the target scene; determining a dynamic heat map of the target scene, wherein the dynamic heat map is used for representing the activity of the target scene; and rendering the target scene based on the first probability, the second probability, the rendering weights and the dynamic heat map. The application solves the problem of low scene rendering efficiency and thereby achieves the effect of improving scene rendering efficiency.
Inventors
- LIN JIANG
- LIU BIN
- WU HUFA
- DU SHUFENG
- LIN XIAO
- YAO CHUNLONG
- YE PENGCHENG
- YE MENGXUAN
- ZHOU MIAO
- ZHOU WEI
- YAN GAIHONG
- TANG XIAOQING
Assignees
- 浙江大华技术股份有限公司
Dates
- Publication Date
- 20260512
- Application Date
- 20251231
Claims (20)
- 1. A method of rendering a scene, comprising: predicting a first probability that a target user moves along a target path and a second probability that the target user interacts with a scene object in a target scene, based on historical movement data of the target user in the target scene; determining a rendering weight of the scene object included in the target scene; determining a dynamic heat map of the target scene, wherein the dynamic heat map is used for representing the activity of the target scene; and rendering the target scene based on the first probability, the second probability, the rendering weight and the dynamic heat map.
- 2. The method of claim 1, wherein determining the rendering weight of the scene object included in the target scene comprises: assigning an initial weight to the scene object based on attributes of the scene object; determining a bounding box volume of the bounding box of each scene object, wherein the bounding box is used for framing the scene object; and determining the rendering weight based on the initial weight and the bounding box volume.
- 3. The method of claim 2, wherein determining the rendering weight based on the initial weight and the bounding box volume comprises: determining a maximum bounding box volume included in the bounding box volumes; determining, for each of the scene objects, a first ratio of the bounding box volume of the scene object to the maximum bounding box volume; determining a first product of the first ratio and a first preset parameter; determining a first sum of the first product and a second preset parameter; determining a second product of the first sum and the initial weight to obtain an updated weight; and determining the rendering weight based on the updated weight.
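The size-normalized weight update of claims 2 and 3 can be sketched in Python as follows. The parameter names `alpha` (the first preset parameter) and `beta` (the second preset parameter) and their values are illustrative assumptions; the claims do not fix them.

```python
def updated_weight(initial_weight, bbox_volume, max_bbox_volume,
                   alpha=0.5, beta=0.5):
    """Scale an object's initial weight by its relative bounding-box size.

    alpha and beta stand in for the first and second preset parameters of
    claim 3; their actual values are not specified in the claims.
    """
    ratio = bbox_volume / max_bbox_volume            # first ratio
    return (alpha * ratio + beta) * initial_weight   # second product
```

With these placeholder parameters, an object whose bounding box equals the maximum volume keeps its initial weight, while smaller objects are scaled down toward `beta * initial_weight`.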
- 4. The method of claim 3, wherein determining the rendering weight based on the updated weight comprises: in a case that the scene object is located in any one of the target paths, increasing the updated weight by a target value to obtain the rendering weight.
- 5. The method of claim 3, wherein determining the rendering weight based on the updated weight comprises: determining a first object based on the updated weight, wherein the updated weight of the first object is greater than or equal to a first preset threshold; determining first distances between the scene object and the first object, and determining a second distance, included in the first distances, that is less than a second preset threshold; determining a second object, included in the first object, corresponding to the second distance; determining a third object with the largest updated weight included in the second object; determining a third distance of the third object included in the second distance; determining a third product of a first attenuation coefficient and the third distance; determining a fourth product of the third product and a first constant; determining a first exponent value with the natural base as the base and the fourth product as the exponent; determining a fifth product of the first exponent value and a second attenuation coefficient; and determining a second sum of the fifth product and the updated weight as the rendering weight.
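The proximity-based weight propagation of claim 5 can be sketched as below. The coefficient values are illustrative, and the first constant `c1` is assumed to be negative so that the exponential term decays with distance; the claim itself leaves these values open.

```python
import math

def propagated_weight(update_weight, dist_to_anchor,
                      decay1=1.0, decay2=0.5, c1=-1.0):
    """Boost an object's weight based on proximity to a high-weight anchor.

    decay1 and decay2 stand in for the first and second attenuation
    coefficients of claim 5, and c1 for the first constant; taking c1
    negative yields exponential falloff with distance.
    """
    exponent = c1 * decay1 * dist_to_anchor              # fourth product
    return update_weight + decay2 * math.exp(exponent)   # second sum
```

An object adjacent to a high-weight anchor receives nearly the full `decay2` boost, while distant objects receive almost none.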
- 6. The method of claim 1, wherein determining the dynamic heat map of the target scene comprises: dividing the target scene to obtain a plurality of first unit grids; determining a user density of each of the first unit grids; determining a target frequency at which a target event occurs in each of the first unit grids; determining an initial dynamic heat map of each of the first unit grids based on the user density and the target frequency; and fusing the rendering weight with the initial dynamic heat map to obtain the dynamic heat map.
- 7. The method of claim 6, wherein determining the user density of each of the first unit grids comprises: determining a first number of users included in the first unit grid; determining a second ratio of the first number to a target number, wherein the target number is the maximum number of users that can be accommodated in the first unit grid; and determining the minimum of the second ratio and a second constant as the user density.
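The user-density computation of claim 7 reduces to a clamped occupancy ratio. A minimal sketch, assuming the second constant is 1 so that density is capped at full occupancy:

```python
def user_density(user_count, capacity, cap=1.0):
    """Normalized occupancy of a grid cell, clamped at the second
    constant (claim 7); cap=1.0 is an assumed value."""
    return min(user_count / capacity, cap)
```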
- 8. The method of claim 6, wherein determining the target frequency at which the target event occurs in each of the first unit grids comprises: determining a target number of times the target event occurs in each first unit grid within a preset time period; determining an event weight of each first unit grid based on the target number of times; determining the average of the plurality of event weights to obtain a weight average; determining a sixth product of the duration corresponding to the preset time period and the weight average; and for each of the first unit grids, determining a third ratio of the event weight of the first unit grid to the sixth product, and determining the third ratio as the target frequency of the first unit grid.
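The frequency normalization of claim 8 can be sketched as follows. Taking the event weight to be the raw event count is an assumption; the claim only says the weight is determined "based on" the count.

```python
def target_frequencies(event_counts, period_seconds):
    """Per-cell event frequency normalized by the mean event weight over
    the period (claim 8). The event weight is assumed to equal the event
    count, which the claim leaves open."""
    weights = list(event_counts)
    mean_w = sum(weights) / len(weights)   # weight average
    denom = period_seconds * mean_w        # sixth product
    return [w / denom for w in weights]    # third ratio per cell
```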
- 9. The method of claim 6, wherein determining the initial dynamic heat map of each of the first unit grids based on the user density and the target frequency comprises: determining a seventh product of a mixing coefficient and the user density; determining a first difference between a third constant and the mixing coefficient; determining an eighth product of the first difference and the target frequency; determining a third sum of the seventh product and the eighth product as an original dynamic heat map; and determining the initial dynamic heat map based on the original dynamic heat map.
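Claim 9 blends density and frequency into a single heat value. A sketch, assuming the third constant is 1 so the two terms form a convex combination weighted by the mixing coefficient `mix`:

```python
def original_heat(density, frequency, mix=0.6):
    """Blend user density and event frequency into the original dynamic
    heat value (claim 9). mix stands in for the mixing coefficient; the
    third constant is assumed to be 1."""
    return mix * density + (1.0 - mix) * frequency
```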
- 10. The method of claim 9, wherein determining the initial dynamic heat map based on the original dynamic heat map comprises: determining a decay rate of the original dynamic heat map; determining a ninth product of the decay rate, a preset duration and a fourth constant; determining a second exponent value with the natural base as the base and the ninth product as the exponent; and determining a tenth product of the original dynamic heat map and the second exponent value as the initial dynamic heat map.
- 11. The method of claim 10, wherein determining the decay rate of the original dynamic heat map comprises: determining a historical frequency at which the target event occurs in each of the first unit grids within a historical time period; determining a maximum frequency included in the historical frequencies; determining a fourth ratio of the historical frequency to the maximum frequency; determining a second difference between a fifth constant and the fourth ratio; and determining an eleventh product of a third attenuation coefficient and the second difference as the decay rate.
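The temporal decay of claims 10 and 11 can be sketched together. The fourth constant is assumed to be negative so heat shrinks over time, and the fifth constant is assumed to be 1; `k` is a placeholder value for the third attenuation coefficient.

```python
import math

def decay_rate(historical_freq, max_freq, k=0.1):
    """Claim 11: cells whose historical event frequency is far below the
    maximum decay faster. k stands in for the third attenuation
    coefficient; the fifth constant is assumed to be 1."""
    return k * (1.0 - historical_freq / max_freq)

def decayed_heat(original_heat, rate, duration, c=-1.0):
    """Claim 10: exponential temporal decay of the original heat value.
    The fourth constant c is assumed to be -1."""
    return original_heat * math.exp(rate * duration * c)
```

Under these assumptions the hottest cells (frequency equal to the maximum) have decay rate zero and retain their heat.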
- 12. The method of claim 6, wherein fusing the rendering weight with the initial dynamic heat map to obtain the dynamic heat map comprises: determining a rendering weight map of the first unit grids based on the rendering weight; adjusting the initial dynamic heat map to obtain a first heat map, and adjusting the rendering weight map to obtain a target weight map, wherein the resolution of the first heat map is the same as that of the target weight map; performing a target operation on each pixel point pair to determine a heat value of the pixel point pair, wherein the pixel point pair comprises a first pixel point in the first heat map and a second pixel point in the target weight map, and the position of the first pixel point in the first heat map is the same as the position of the second pixel point in the target weight map; and determining a map comprising the heat values as the dynamic heat map; wherein the target operation comprises: determining a first value of the first pixel point and a second value of the second pixel point; in a case that the first value is greater than or equal to an activation threshold and the second value is less than a third preset threshold, determining a twelfth product of the first value and the second value as the heat value; in a case that the first value is greater than or equal to the activation threshold and the second value is greater than or equal to the third preset threshold, determining a third difference between a sixth constant and a preset parameter, determining the twelfth product of the first value and the second value, and determining a third exponent value with the twelfth product as the base and the third difference as the exponent, as the heat value; in a case that the first value is less than the activation threshold and the second value is less than the third preset threshold, determining a thirteenth product of the second value and the activation threshold as the heat value; and in a case that the first value is less than the activation threshold and the second value is greater than or equal to the third preset threshold, determining a fourth difference between a seventh constant and the preset parameter, determining the thirteenth product of the second value and the activation threshold, and determining a fourth exponent value with the thirteenth product as the base and the fourth difference as the exponent, as the heat value.
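The four-branch target operation of claim 12 can be sketched as a per-pixel function. The threshold and constant values below are illustrative placeholders, and the sixth and seventh constants are both assumed to be 1:

```python
def fused_heat(h, w, activation=0.5, w_threshold=0.8, preset=0.2):
    """Per-pixel fusion of a heat value h and a rendering weight w
    (claim 12). activation, w_threshold and preset stand in for the
    activation threshold, third preset threshold and preset parameter;
    the sixth and seventh constants are assumed to be 1."""
    if h >= activation and w < w_threshold:
        return h * w                               # twelfth product
    if h >= activation and w >= w_threshold:
        return (h * w) ** (1.0 - preset)           # third exponent value
    if h < activation and w < w_threshold:
        return w * activation                      # thirteenth product
    return (w * activation) ** (1.0 - preset)      # fourth exponent value
```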
- 13. The method of claim 1, wherein rendering the target scene based on the first probability, the second probability, the rendering weight and the dynamic heat map comprises: dividing the bounding box of the scene object in a preset plane to obtain a plurality of second unit grids; determining an initial score of each second unit grid, determining a target heat value of the second unit grid from the dynamic heat map, and determining a fourteenth product of the initial score, the target heat value and the area of the second unit grid as an initial priority score; and rendering the target scene based on the plurality of initial priority scores.
- 14. The method of claim 13, wherein determining the initial score of the second unit grid comprises: in a case that any sub-path of the target path is included in the second unit grid, determining a third probability of the sub-path included in the first probability, and determining the third probability as the initial score; and in a case that no sub-path of the target path is included in the second unit grid, determining a preset score as the initial score.
- 15. The method of claim 13, wherein rendering the target scene based on the plurality of initial priority scores comprises: determining a fourth sum of the plurality of initial priority scores; determining a fifteenth product of the fourth sum and the rendering weight; determining a sixteenth product of the second probability and a scaling factor; determining a hyperbolic tangent function value of the sixteenth product; determining a seventeenth product of the hyperbolic tangent function value and an enhancement coefficient; determining a fifth sum of the seventeenth product and an eighth constant; determining an eighteenth product of the fifth sum and the fifteenth product as an updated priority score; and rendering the target scene based on the updated priority score.
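The interaction-aware priority update of claim 15 can be sketched as follows. `scale`, `boost` and `c8` are placeholder values for the scaling factor, enhancement coefficient and eighth constant; taking `c8 = 1` makes the tanh term a pure multiplicative boost on top of the base score.

```python
import math

def updated_priority(initial_scores, rendering_weight, interact_prob,
                     scale=5.0, boost=0.5, c8=1.0):
    """Claim 15: scale the weighted sum of per-cell scores by an
    interaction boost driven by a hyperbolic tangent. scale, boost and
    c8 stand in for the scaling factor, enhancement coefficient and
    eighth constant; their values are assumptions."""
    base = sum(initial_scores) * rendering_weight          # fifteenth product
    gain = boost * math.tanh(interact_prob * scale) + c8   # fifth sum
    return base * gain                                     # eighteenth product
```

The tanh saturates, so even a very high interaction probability cannot inflate the priority beyond `(boost + c8)` times the base score.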
- 16. The method of claim 15, wherein rendering the target scene based on the updated priority score comprises: determining the maximum value included in the updated priority scores of the plurality of scene objects to obtain a maximum priority score; determining the minimum value included in the updated priority scores of the plurality of scene objects to obtain a minimum priority score; determining a fifth difference between the updated priority score and the minimum priority score, determining a sixth difference between the maximum priority score and the minimum priority score, determining a sixth sum of the sixth difference and a ninth constant, and determining a fifth ratio of the fifth difference to the sixth sum as a target priority score; and rendering the target scene based on the target priority score.
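Claim 16 is a min-max normalization. A sketch, assuming the ninth constant is a small epsilon guarding against division by zero when all scores are equal:

```python
def normalized_priority(score, min_score, max_score, eps=1e-6):
    """Min-max normalization of an updated priority score (claim 16).
    eps stands in for the ninth constant; treating it as a small
    stabilizer is an assumption."""
    return (score - min_score) / (max_score - min_score + eps)
```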
- 17. The method of claim 16, wherein rendering the target scene based on the target priority score comprises: executing a first-level rendering scheme on the scene object in a case that the target priority score is greater than a fourth preset threshold; executing a second-level rendering scheme on the scene object in a case that the target priority score is greater than a fifth preset threshold and less than the fourth preset threshold, wherein the fifth preset threshold is less than the fourth preset threshold; executing a third-level rendering scheme on the scene object in a case that the target priority score is greater than a sixth preset threshold and less than the fifth preset threshold, wherein the sixth preset threshold is less than the fifth preset threshold; and executing a fourth-level rendering scheme on the scene object in a case that the target priority score is less than the sixth preset threshold.
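The tiered level-of-detail selection of claim 17 can be sketched as a simple threshold cascade. The threshold values are illustrative placeholders for the fourth, fifth and sixth preset thresholds:

```python
def lod_level(score, t_high=0.75, t_mid=0.5, t_low=0.25):
    """Map a normalized target priority score to one of the four
    rendering levels of claim 17. Threshold values are assumptions."""
    if score > t_high:
        return 1   # first-level (highest-detail) rendering scheme
    if score > t_mid:
        return 2   # second-level rendering scheme
    if score > t_low:
        return 3   # third-level rendering scheme
    return 4       # fourth-level (lowest-detail) rendering scheme
```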
- 18. A scene rendering apparatus, comprising: a prediction module, configured to predict a first probability that a target user moves along a target path and a second probability that the target user interacts with a scene object in a target scene, based on historical movement data of the target user in the target scene; a first determining module, configured to determine a rendering weight of the scene object included in the target scene; a second determining module, configured to determine a dynamic heat map of the target scene, wherein the dynamic heat map is used for representing the activity of the target scene; and a rendering module, configured to render the target scene based on the first probability, the second probability, the rendering weight and the dynamic heat map.
- 19. A computer program product comprising computer programs/instructions which, when executed by a processor, implement the steps of the method of any of claims 1 to 17.
- 20. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of any of claims 1 to 17.
Description
Scene rendering method and device, storage medium and electronic equipment

Technical Field

The embodiments of the application relate to the field of computers, and in particular to a scene rendering method and device, a storage medium and electronic equipment.

Background

In the related art, real-time rendering of a very large-scale scene requires level-of-detail techniques, since such a scene contains tens of thousands of high-precision models, far beyond the real-time carrying capability of graphics hardware. However, conventional level-of-detail techniques mostly depend on simple geometric rules to make decisions, such as the Euclidean distance between an object and the camera, or the proportion the object occupies in screen space, and cannot distinguish the differing value of objects in the scene. This causes key visual elements to be degraded while non-key elements occupy resources, so that scene rendering efficiency is low. Accordingly, the related art has the technical problem of low scene rendering efficiency, and no effective solution to this problem has yet been proposed.

Disclosure of Invention

The embodiments of the application provide a scene rendering method and device, a storage medium and electronic equipment, to at least solve the technical problem of low scene rendering efficiency in the related art.
According to one aspect of the embodiments of the application, a scene rendering method is provided, which comprises the steps of predicting a first probability that a target user moves along a target path and a second probability that the target user interacts with a scene object in a target scene based on historical movement data of the target user in the target scene, determining a rendering weight of the scene object included in the target scene, determining a dynamic heat map of the target scene, wherein the dynamic heat map is used for representing the activity of the target scene, and rendering the target scene based on the first probability, the second probability, the rendering weight and the dynamic heat map.

In one exemplary embodiment, determining the rendering weight of the scene object included in the target scene includes assigning an initial weight to the scene object based on attributes of the scene object, determining a bounding box volume of the bounding box of each scene object, the bounding box being used to frame the scene object, and determining the rendering weight based on the initial weight and the bounding box volume.

In one exemplary embodiment, determining the rendering weight based on the initial weight and the bounding box volume includes determining a maximum bounding box volume included in the bounding box volumes, and performing, for each of the scene objects: determining a first ratio of the bounding box volume of the scene object to the maximum bounding box volume, determining a first product of the first ratio and a first preset parameter, determining a first sum of the first product and a second preset parameter, determining a second product of the first sum and the initial weight to obtain an updated weight, and determining the rendering weight based on the updated weight.
In one exemplary embodiment, determining the rendering weight based on the updated weight includes increasing the updated weight by a target value if the scene object is located in any one of the target paths, resulting in the rendering weight.

In one exemplary embodiment, determining the rendering weight based on the updated weight includes determining a first object based on the updated weight, wherein the updated weight of the first object is greater than or equal to a first preset threshold, determining first distances between the scene object and the first object, determining a second distance included in the first distances that is less than a second preset threshold, determining a second object corresponding to the second distance included in the first object, determining a third object included in the second object that has the greatest updated weight, determining a third distance of the third object included in the second distance, determining a third product of a first attenuation coefficient and the third distance, determining a fourth product of the third product and a first constant, determining a first exponent value with the natural base as the base and the fourth product as the exponent, determining a fifth product of the first exponent value and a second attenuation coefficient, and determining a second sum of the fifth product and the updated weight as the rendering weight.

In one exemplary embodiment, determining the dynamic heat map of the target scene comprises dividing the target scene to obtain a plurality of first unit grids, determining the user density of each first unit grid, determining the target frequency of a target event in e