JP-2026076381-A - Data analysis apparatus and method

JP 2026076381 A

Abstract

[Problem] To provide a data analysis device that makes it easy to suppress interruptions in a data sequence obtained by detecting an object at a site over time. [Solution] The data analysis device (2) of this disclosure controls generation of a data sequence representing, in time series, results of detecting an object at a site. The data analysis device includes a display unit (23) that displays information, an input unit (22) that receives user operations, and a control unit (20) that controls a data management unit that manages the detection results of the object and generates the data sequence. The control unit causes the display unit to display information including a range in which the ends of multiple data sequences are adjacent to each other at the site (S1), receives, via the input unit, a user operation that adjusts a parameter indicating the condition under which the multiple data sequences are connected to each other in the adjacent range (S3), and controls the data management unit so that the adjusted parameter is applied in the adjacent range in accordance with the user operation on the input unit (S4). [Selection Diagram] Figure 5
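As a rough illustration of the mechanism described in the abstract, the sketch below models a data sequence as a time-ordered track of detections and connects two sequences when the end of one and the start of the other satisfy user-adjustable conditions. This is a minimal Python model, not from the disclosure; the `Track` type and the parameter names `max_gap_s` and `max_dist` are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Track:
    """A data sequence: time-ordered (t, x, y) detections of one object."""
    points: list  # list of (t, x, y) tuples


def can_connect(a: Track, b: Track, max_gap_s: float, max_dist: float) -> bool:
    """Decide whether track b continues track a, using user-adjustable
    parameters: the allowed time gap and the allowed spatial distance
    between the end of a and the start of b."""
    t1, x1, y1 = a.points[-1]
    t2, x2, y2 = b.points[0]
    gap = t2 - t1
    dist = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    return 0 < gap <= max_gap_s and dist <= max_dist


def connect(a: Track, b: Track) -> Track:
    """Merge two fragments into one continuous data sequence."""
    return Track(points=a.points + b.points)


# Example: two fragments of one object's path with a 2-second detection gap.
a = Track([(0.0, 1.0, 1.0), (1.0, 2.0, 1.0)])
b = Track([(3.0, 2.5, 1.2), (4.0, 3.0, 1.5)])
if can_connect(a, b, max_gap_s=5.0, max_dist=2.0):
    merged = connect(a, b)
    print(len(merged.points))  # 4
```

Loosening `max_gap_s` or `max_dist` corresponds to the user operation of adjusting the connection parameter; the same pair of fragments stays interrupted under stricter values.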

Inventors

  • 市村 大治郎
  • 秦 秀彦
  • 伊藤 智祥

Assignees

  • パナソニックIPマネジメント株式会社 (Panasonic IP Management Co., Ltd.)

Dates

Publication Date
2026-05-11
Application Date
2026-02-24
Priority Date
2022-01-07

Claims (10)

  1. A data analysis device that controls generation of a data sequence representing a timeline showing, in chronological order, results of detecting an object at a site over time, the data analysis device comprising: a display unit that displays information; an input unit that receives user operations; and a control unit that controls a data management unit that manages the detection results of the object and generates the data sequence, wherein the control unit causes the display unit to display information including a range in which ends of a plurality of timelines, each represented by a data sequence, are adjacent to each other at the site, receives, via the input unit, a user operation that adjusts a parameter indicating a condition under which the plurality of timelines are connected to each other in the adjacent range, and controls the data management unit so that the adjusted parameter is applied in the adjacent range in accordance with the user operation on the input unit.
  2. The data analysis device according to claim 1, wherein the control unit receives, via the input unit, a user operation that specifies a requirement under which the adjusted parameter applies in the adjacent range, and controls the data management unit so that the adjusted parameter is not applied when the specified requirement is not satisfied and is applied when the specified requirement is satisfied.
  3. The data analysis device according to claim 1 or 2, wherein the control unit detects the adjacent range as a candidate for an interruption in the timeline based on the data sequence generated by the data management unit, and indicates the detected range in the displayed information.
  4. The data analysis device according to claim 1 or 2, wherein the control unit receives, via the input unit, a user operation that specifies the adjacent range in the displayed information, and causes the display unit to display detailed information on a timeline whose end is located within the specified range.
  5. The data analysis device according to claim 1 or 2, further comprising a storage unit that stores pattern information including a plurality of patterns in which the object is not detected at the site, wherein the control unit receives, via the input unit, a user operation that selects one pattern from the plurality of patterns indicated by the pattern information, and causes the display unit to display information prompting the user to adjust the parameter according to the selected pattern.
  6. The data analysis device according to claim 1 or 2, wherein the data management unit manages detection results of an image recognition model that detects the object based on captured images of the site, and generates the data sequence by sequentially connecting the detection results of the image recognition model based on the parameter.
  7. The data analysis device according to claim 1 or 2, further comprising a communication unit that communicates data with the data management unit, wherein the control unit receives the data sequence from the data management unit via the communication unit, and transmits the adjusted parameter to the data management unit via the communication unit.
  8. The data analysis device according to claim 1 or 2, wherein the timeline indicates a detection result of a worker performing a predetermined task at the site.
  9. A data analysis method that controls generation of a data sequence representing a timeline showing, in chronological order, results of detecting an object at a site over time, the data sequence being generated by a data management unit that manages the detection results of the object, the method comprising the steps of: displaying, on a display unit, information including a range in which ends of a plurality of timelines, each represented by a data sequence, are adjacent to each other at the site; receiving, via an input unit, a user operation that adjusts a parameter indicating a condition under which the plurality of timelines are connected to each other in the adjacent range; and controlling the data management unit so that the adjusted parameter is applied in the adjacent range in accordance with the user operation on the input unit.
  10. A program for causing a control unit of a computer to execute the data analysis method according to claim 9.
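The detection of adjacent ranges as interruption candidates in claim 3 could be sketched as a scan over timeline endpoints, flagging pairs whose end and start fall within an adjacency window for the user interface to highlight. This is an illustrative reading, assuming each timeline is a time-ordered list of (t, x, y) detections; the function name and window thresholds are hypothetical, not from the claims.

```python
def find_break_candidates(tracks, window_s=5.0, window_dist=3.0):
    """Flag pairs of timelines whose end/start points fall within an
    adjacency window; these are candidate interruption points that a
    display could highlight for the user (thresholds are illustrative).

    tracks: list of timelines, each a time-ordered list of (t, x, y).
    Returns a list of (i, j, gap, dist): timeline i may continue as j.
    """
    candidates = []
    for i, a in enumerate(tracks):
        for j, b in enumerate(tracks):
            if i == j:
                continue
            t1, x1, y1 = a[-1]   # end of timeline i
            t2, x2, y2 = b[0]    # start of timeline j
            gap = t2 - t1
            dist = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            if 0 < gap <= window_s and dist <= window_dist:
                candidates.append((i, j, gap, dist))
    return candidates


tracks = [
    [(0.0, 0.0, 0.0), (2.0, 1.0, 0.0)],    # timeline 0 ends at t=2
    [(3.0, 1.2, 0.1), (5.0, 2.0, 0.5)],    # timeline 1 starts nearby at t=3
    [(10.0, 9.0, 9.0), (12.0, 9.5, 9.0)],  # timeline 2: far away in space/time
]
print([(i, j) for i, j, _, _ in find_break_candidates(tracks)])  # [(0, 1)]
```

Only timelines 0 and 1 are flagged; the per-pair gap and distance could populate the detailed information of claim 4.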

Description

This disclosure relates to a data analysis device and method.

Patent Document 1 discloses an image analysis device that tracks a person across a series of image frames captured by a camera. The device detects a person region in each image frame and outputs a score; outputs a score for the change in the person region between each pair of image frames; recognizes a specific person from each person region and outputs a score; and outputs a score for the change in person recognition within the person region. Using all of these scores, the device assigns a movement path ID and associates it with a person ID for each person region in the image frames. This enables person tracking that is robust against occlusion between persons.

[Patent Document 1] Japanese Patent Publication No. 2020-09166

Brief description of the drawings:

  • A diagram showing an overview of the movement flow analysis system according to Embodiment 1 of this disclosure.
  • A block diagram illustrating the configuration of the data analysis device in the movement flow analysis system.
  • A block diagram illustrating the configuration of the movement flow management server in the movement flow analysis system.
  • A diagram illustrating interruptions in movement paths in the movement flow analysis system.
  • A flowchart illustrating the operation of the data analysis device in the movement flow analysis system.
  • A flowchart illustrating the process of visualizing interrupted data in the data analysis device.
  • A diagram showing an example of the display screen for movement flow analysis in the data analysis device.
  • A diagram showing an example of how detailed information is displayed in the data analysis device.
  • A flowchart illustrating parameter adjustment processing in the data analysis device.
  • A diagram illustrating an interruption pattern table in the data analysis device.
  • A diagram showing an example of the display during parameter adjustment processing in the data analysis device.
  • A diagram illustrating parameter adjustment information in the data analysis device.
  • A diagram illustrating a modification of the data analysis device.
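The multi-score tracking attributed above to Patent Document 1 can be loosely sketched as a frame-to-frame association that combines several similarity scores and links each current person region to the best-matching previous one. Everything below (the function names, the IoU score, the threshold, the greedy matching) is an assumption for illustration, not the cited device's actual method.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes: one possible
    score for the change in a person region between frames."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0


def associate(prev_regions, cur_regions, score_fns, threshold=0.5):
    """Greedy association: average multiple similarity scores and link
    each current region to the best-scoring unused previous region whose
    combined score exceeds a threshold. Returns {cur_index: prev_index},
    i.e., which movement path each current detection continues."""
    links, used = {}, set()
    for ci, cur in enumerate(cur_regions):
        best, best_s = None, threshold
        for pi, prev in enumerate(prev_regions):
            if pi in used:
                continue
            s = sum(f(prev, cur) for f in score_fns) / len(score_fns)
            if s > best_s:
                best, best_s = pi, s
        if best is not None:
            links[ci] = best
            used.add(best)
    return links


prev = [(0, 0, 10, 10)]                       # one person region in frame t-1
cur = [(1, 1, 11, 11), (50, 50, 60, 60)]      # two regions in frame t
print(associate(prev, cur, [iou]))            # {0: 0}
```

A real tracker would add appearance and recognition scores to `score_fns`, which is where robustness to occlusion would come from.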
The embodiments will be described in detail below with reference to the drawings as appropriate. However, unnecessarily detailed explanations may be omitted. For example, detailed explanations of already well-known matters and redundant explanations of substantially identical configurations may be omitted. This is to avoid unnecessary verbosity and to facilitate understanding by those skilled in the art. The applicant provides the accompanying drawings and the following description so that a person skilled in the art can fully understand this disclosure, and does not intend for them to limit the subject matter described in the claims.

(Embodiment 1)

Embodiment 1 of this disclosure will be described below with reference to the drawings.

1. Configuration

The configuration of the system using the data analysis device according to Embodiment 1 will be explained with reference to Figure 1.

1-1. About the Movement Flow Analysis System

Figure 1 shows an overview of the movement flow analysis system 1 according to this embodiment. As shown in Figure 1, the system 1 comprises, for example, a plurality of cameras 11, a data analysis device 2, and a movement flow management server 3. Each of the devices 11, 2, and 3 in the system 1 is connected to a communication network 13, such as a LAN, a WAN, or the Internet, and is capable of data communication.

The system 1 stores information such as the movement path, i.e., the trajectory, of each worker W in a workplace 10, such as a factory, logistics warehouse, or store, for analysis purposes. The system 1 is applicable to data analysis performed by a user 15, such as a manager or analyst of the workplace 10, to analyze the time allocation or work efficiency of individual workers W in the workplace 10. In such data analysis, it is useful to obtain the movement path of each worker W as a continuous, uninterrupted sequence from beginning to end.
The data analysis device 2 of this embodiment makes it easier to obtain continuous, uninterrupted movement paths in the movement flow analysis system 1 by reflecting the analysis performed by the user 15.

In the system 1, the cameras 11 are arranged such that, for example, different areas of the workplace 10 are included in their captured images. The number of cameras 11 in the system 1 is not limited to a plurality; there may be just one. The cameras 11 are connected to the communication network 13 and transmit, for example, video data D0 of the captured images of the workplace 10 to the movement flow management server 3. The cameras 11 may be, for example, omnidirectional cameras or box cameras.

The movement flow management server 3 is a server device that stores and manages information such as the video data D0 resulting from imaging by each camera 11 and movement data D1 indicating various movement paths based on the video data D0. The configuration of the movement flow management server 3 will be described later.

1-2. Configuration of the Data Analysis De