CN-122027821-A - Automatic association retrieval method and device for monitoring track and event video
Abstract
The application relates to the field of computer technology, and in particular to a method and device for automatically associating and retrieving a monitoring track and an event video. The method comprises: receiving a track sequence uploaded by mobile monitoring equipment and event video clips generated by sensor triggering; for each event video clip, matching, in the track sequence of the corresponding mobile monitoring equipment, the associated track data that coincides with or is nearest to the clip in time; calculating a representative geographic position of the clip through a preset algorithm based on the associated track data, and generating a semantic tag for the clip; constructing a space-time event database from the semantic tags, the representative geographic positions, and the video time ranges; and, in response to a semantic search request initiated by a client, querying the space-time event database based on the query conditions contained in the request and returning information on the matched event video clips. The application helps improve the utilization efficiency of monitoring data.
Inventors
- LIANG GUANGCAI
- MAO JIAFU
- CHENG XUWEI
- YANG YONG
Assignees
- 深圳市途强物联科技有限公司
Dates
- Publication Date
- 20260512
- Application Date
- 20260123
Claims (10)
- 1. An automatic association retrieval method for a monitoring track and an event video, characterized by comprising the following steps: receiving a track sequence uploaded by mobile monitoring equipment and event video clips generated by sensor triggering, wherein each event video clip is associated with an event trigger type and a corresponding video time range; for each event video clip, matching, in the track sequence of the corresponding mobile monitoring equipment and according to the associated video time range, the associated track data that coincides with or is nearest to the clip in time; calculating a representative geographic position of the corresponding event video clip through a preset algorithm based on the matched associated track data, and generating a semantic tag for the corresponding event video clip according to the representative geographic position and the associated event trigger type; constructing a space-time event database according to the semantic tags, the representative geographic positions, and the video time ranges; and, in response to a semantic search request initiated by a client, querying the space-time event database based on the query conditions contained in the semantic search request and returning information on the matched event video clips.
- 2. The automatic association retrieval method for a monitoring track and an event video according to claim 1, wherein calculating, based on the matched associated track data, a representative geographic position of the corresponding event video clip through a preset algorithm, and generating a semantic tag for the corresponding event video clip according to the representative geographic position and the associated event trigger type, comprises: obtaining the number of valid track points and the video time span from the matched associated track data and the corresponding video time range; when the video time span is greater than a preset time threshold and the number of valid track points is not less than a preset number threshold, calculating the mean of the longitude and latitude coordinates of all track points in the valid track point set using a center point algorithm, to obtain the representative geographic position of the corresponding event video clip; when the video time span is not greater than the preset time threshold or the number of valid track points is less than the preset number threshold, obtaining, by linear interpolation based on the valid track points adjacent to the video start moment before and after it, an accurate geographic position corresponding to the video start moment, and taking that accurate geographic position as the representative geographic position of the corresponding event video clip; and generating the semantic tag of the corresponding event video clip according to the representative geographic position and the associated event trigger type.
- 3. The automatic association retrieval method for a monitoring track and an event video according to claim 2, wherein the preset algorithm further comprises an abnormal track point correction mechanism: when the position offset between consecutive track points in the valid track point set exceeds a preset offset threshold, the offset track is corrected against the road topology information of a preset electronic map.
- 4. The automatic association retrieval method for a monitoring track and an event video according to claim 1, further comprising: analyzing the track sequence to identify stay points of the mobile monitoring equipment; and, if the representative geographic position of an event video clip falls within a preset range of a stay point, adding the identification information of that stay point to the semantic tag of the corresponding event video clip.
- 5. The automatic association retrieval method for a monitoring track and an event video according to claim 1, further comprising: responding to a historical time period track playback request sent by a user through the client, wherein the request comprises a retrieval time period; retrieving and extracting from the space-time event database all event video clips occurring within the retrieval time period, and combining them in timeline order to generate a video abstract comprising only the key moments; and returning the video abstract together with the movement track data of the corresponding time period to the client, so as to realize synchronous display of the movement track and the event video clips.
- 6. The automatic association retrieval method for a monitoring track and an event video according to claim 5, wherein the client is integrated with an electronic map, and the electronic map is used to highlight the event occurrence position points corresponding to the event video clips and the movement track of the corresponding time period; and wherein realizing synchronous display of the movement track and the event video clips comprises: when the playback time reaches the start time node of any event video clip, playing that event video clip in the interface in a pop-up window or picture-in-picture mode.
- 7. The automatic association retrieval method for a monitoring track and an event video according to claim 1, wherein constructing a space-time event database according to the semantic tags, the representative geographic positions, and the video time ranges comprises: packaging the semantic tags, the representative geographic positions, and the video time ranges, together with the event trigger types and equipment identifiers, into structured data entries; establishing a multidimensional index based on the structured data entries; and constructing the space-time event database on a distributed storage architecture based on the multidimensional index and the structured data entries.
- 8. An automatic association retrieval device for a monitoring track and an event video, comprising: a data receiving module for receiving the track sequence uploaded by mobile monitoring equipment and the event video clips generated by sensor triggering, wherein each event video clip is associated with an event trigger type and a corresponding video time range; a track data matching module for matching, for each event video clip and according to the associated video time range, the associated track data that coincides with or is nearest to the clip in time in the track sequence of the corresponding mobile monitoring equipment; a semantic tag generation module for calculating the representative geographic position of the corresponding event video clip through a preset algorithm based on the matched associated track data, and generating the semantic tag of the corresponding event video clip according to the representative geographic position and the associated event trigger type; a database construction module for constructing a space-time event database according to the semantic tags, the representative geographic positions, and the video time ranges; and a data query module for responding to a semantic search request initiated by the client, querying the space-time event database based on the query conditions contained in the semantic search request, and returning information on the matched event video clips.
- 9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
- 10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
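As a rough illustration of the structured data entries, multidimensional index, and semantic query described in claims 1 and 7, the following Python sketch uses an in-memory SQLite table as a stand-in for the distributed space-time event store. The table layout, column names, sample tags, and the keyword-plus-bounding-box-plus-time-window query are illustrative assumptions, not part of the application.

```python
import sqlite3

# Stand-in for the space-time event database: one structured entry per
# event video clip (claim 7), with separate indexes on the semantic tag,
# the representative geographic position, and the video time range.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE events (
    device_id TEXT, trigger_type TEXT, tag TEXT,
    lat REAL, lon REAL, start_ts REAL, end_ts REAL)""")
conn.execute("CREATE INDEX idx_tag ON events(tag)")
conn.execute("CREATE INDEX idx_geo ON events(lat, lon)")
conn.execute("CREATE INDEX idx_time ON events(start_ts, end_ts)")

rows = [  # hypothetical entries for one device
    ("dev01", "collision", "collision near Shennan Blvd",
     22.54, 114.06, 1000.0, 1030.0),
    ("dev01", "harsh_brake", "harsh braking near Binhe Ave",
     22.52, 114.02, 2000.0, 2015.0),
]
conn.executemany("INSERT INTO events VALUES (?,?,?,?,?,?,?)", rows)

def semantic_search(keyword, bbox, t0, t1):
    """Combine a tag keyword, a bounding box (min_lat, min_lon,
    max_lat, max_lon), and a time window, as in the query step of
    claim 1; return matching clip records."""
    cur = conn.execute(
        """SELECT device_id, tag, start_ts FROM events
           WHERE tag LIKE ? AND lat BETWEEN ? AND ?
             AND lon BETWEEN ? AND ?
             AND start_ts >= ? AND end_ts <= ?""",
        (f"%{keyword}%", bbox[0], bbox[2], bbox[1], bbox[3], t0, t1))
    return cur.fetchall()
```

A real deployment would replace the single table with the distributed storage architecture of claim 7, but the shape of the entry and of the combined query is the same.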
Description
Automatic association retrieval method and device for monitoring track and event video
Technical Field
The application relates to the field of computer technology, and in particular to a method and device for automatically associating and retrieving a monitoring track and an event video.
Background
With the development of Internet of Things technology, mobile monitoring devices (such as vehicle-mounted driving recorders and portable monitoring terminals) are now commonly provided with track tracking and event-triggered video recording functions. However, the large amount of data these devices generate poses great challenges to effective use. First, the data is siloed: track data and event videos are stored independently, so users must manually compare video timestamps with a track diagram to determine where an event occurred, an operation that is cumbersome and inefficient. Second, the retrieval mode is primitive: existing systems only support coarse retrieval by time or device ID, forcing users to manually browse large amounts of data to screen out target information. Third, event videos lack effective semantic description: video files are usually named in the form "device ID_timestamp", contain no key information such as geographic position or event type, and are therefore difficult for a search engine to understand and index. Fourth, extracting key video clips is time-consuming: over a long movement (such as an 8-hour drive), the key event videos may occupy only a few minutes, and the prior art lacks effective means to rapidly locate and extract these clips from a long track.
Therefore, a technical scheme is needed that can break down data silos, realize automatic association between tracks and event videos, and support efficient intelligent retrieval, so as to solve the problem of low utilization efficiency of existing monitoring data.
Disclosure of Invention
Accordingly, it is necessary to provide a method and device for automatically associating and retrieving a monitoring track and an event video that can improve the utilization efficiency of monitoring data. In a first aspect, the application provides an automatic association retrieval method for a monitoring track and an event video, the method comprising: receiving a track sequence uploaded by mobile monitoring equipment and event video clips generated by sensor triggering, wherein each event video clip is associated with an event trigger type and a corresponding video time range; for each event video clip, matching, in the track sequence of the corresponding mobile monitoring equipment and according to the associated video time range, the associated track data that coincides with or is nearest to the clip in time; calculating a representative geographic position of the corresponding event video clip through a preset algorithm based on the matched associated track data, and generating a semantic tag for the corresponding event video clip according to the representative geographic position and the associated event trigger type; constructing a space-time event database according to the semantic tags, the representative geographic positions, and the video time ranges; and, in response to a semantic search request initiated by a client, querying the space-time event database based on the query conditions contained in the semantic search request and returning information on the matched event video clips.
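The matching step above (finding the track data that coincides with or is nearest to a clip in time) can be sketched in Python as follows. The names `TrackPoint`, `EventClip`, and `match_track_points`, the epoch-second timestamps, and the 30-second fallback window are illustrative assumptions, not terms from the application.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackPoint:
    timestamp: float  # seconds since epoch, as uploaded by the device
    lat: float
    lon: float

@dataclass
class EventClip:
    start: float      # video time range [start, end]
    end: float
    trigger_type: str

def match_track_points(clip: EventClip, track: List[TrackPoint],
                       max_gap: float = 30.0) -> List[TrackPoint]:
    """Return the track points whose timestamps fall inside the clip's
    video time range (the 'coincident' case); if none do, fall back to
    the single nearest point, provided it is within max_gap seconds."""
    overlapping = [p for p in track if clip.start <= p.timestamp <= clip.end]
    if overlapping:
        return overlapping
    # Nearest-in-time fallback, measured against either end of the clip.
    nearest = min(track,
                  key=lambda p: min(abs(p.timestamp - clip.start),
                                    abs(p.timestamp - clip.end)),
                  default=None)
    if nearest is not None and min(abs(nearest.timestamp - clip.start),
                                   abs(nearest.timestamp - clip.end)) <= max_gap:
        return [nearest]
    return []
```

A track sampled every few seconds will normally hit the coincident branch; the fallback covers sparse uploads where the clip falls between two GPS fixes.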
In one embodiment, calculating, based on the matched associated track data, the representative geographic position of the corresponding event video clip through a preset algorithm, and generating the semantic tag of the corresponding event video clip according to the representative geographic position and the associated event trigger type, includes: obtaining the number of valid track points and the video time span from the matched associated track data and the corresponding video time range; when the video time span is greater than a preset time threshold and the number of valid track points is not less than a preset number threshold, calculating the mean of the longitude and latitude coordinates of all track points in the valid track point set using a center point algorithm, to obtain the representative geographic position of the corresponding event video clip; when the video time span is not greater than the preset time threshold or the number of valid track points is less than the preset number threshold, obtaining, by linear interpolation based on the valid track points adjacent to the video start moment, an accurate geographic position corresponding to the video start moment, and taking that accurate geographic position as the representative geographic position of the corresponding event video clip.
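The two branches of this embodiment (coordinate averaging for long, well-sampled clips; linear interpolation at the video start moment otherwise) can be sketched as follows. The threshold values, the `TrackPoint` tuple, and the function name are illustrative assumptions; the application leaves the thresholds as preset parameters.

```python
from collections import namedtuple

TrackPoint = namedtuple("TrackPoint", "timestamp lat lon")

def representative_position(points, clip_start, clip_end,
                            time_threshold=10.0, min_points=3):
    """Branch 1: for a clip longer than time_threshold with at least
    min_points valid track points, return the mean of the lat/lon
    coordinates (the center point algorithm).
    Branch 2: otherwise, linearly interpolate the position at the
    video start moment from the adjacent valid points before/after it."""
    span = clip_end - clip_start
    if span > time_threshold and len(points) >= min_points:
        lat = sum(p.lat for p in points) / len(points)
        lon = sum(p.lon for p in points) / len(points)
        return lat, lon
    before = max((p for p in points if p.timestamp <= clip_start),
                 key=lambda p: p.timestamp, default=None)
    after = min((p for p in points if p.timestamp >= clip_start),
                key=lambda p: p.timestamp, default=None)
    if before is None or after is None:
        fallback = before or after
        return (fallback.lat, fallback.lon) if fallback else None
    if after.timestamp == before.timestamp:
        return before.lat, before.lon
    # Fraction of the way from `before` to `after` at the start moment.
    t = (clip_start - before.timestamp) / (after.timestamp - before.timestamp)
    return (before.lat + t * (after.lat - before.lat),
            before.lon + t * (after.lon - before.lon))
```

The averaging branch deliberately trades precision for robustness on long clips, while the interpolation branch recovers a point-accurate position when the clip is too short for an average to be meaningful.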