US-12626513-B2 - Information processing apparatus, information processing method, and non-transitory storage medium

Abstract

An information processing apparatus (10) according to the present invention includes: an acquisition unit (11) that acquires an image captured at a site where a trouble happens; an information generation unit (12) that detects a person from the image, and also generates information relevant to the detected person, based on the image; and an output unit (13) that outputs the information relevant to the detected person.

Inventors

  • Ryo Kawai
  • Noboru Yoshida
  • Tingting Dong
  • Satoshi Yamazaki
  • Jianquan Liu
  • Naoki Shindou
  • Karen Stephen
  • Yuta Namiki
  • Youhei Sasaki

Assignees

  • NEC CORPORATION

Dates

Publication Date
2026-05-12
Application Date
2022-01-13

Claims (20)

  1. An information processing apparatus comprising: at least one memory configured to store one or more instructions; and at least one processor configured to execute the one or more instructions to: acquire an image captured at a site where a trouble happens; detect a person from the image, and also generate information relevant to the detected person, based on the image; and output the information relevant to the detected person, wherein the at least one processor is further configured to execute the one or more instructions to: detect a rescue target person and an object in a periphery of the rescue target person from the image; generate information relevant to the rescue target person including a relative relationship between a worker and the object based on the image and a floor map of the site, the information indicating a position of the object by a direction and a distance based on a current position of the worker; and output the information relevant to the rescue target person.
  2. The information processing apparatus according to claim 1, wherein the information relevant to the detected person includes positional information of the detected person.
  3. The information processing apparatus according to claim 1, wherein the information relevant to the detected person indicates at least one of a number of persons in each of a plurality of poses, a number of persons doing each of a plurality of movements, a number of persons in each age group, and a number of persons of each gender.
  4. The information processing apparatus according to claim 1, wherein the information relevant to the detected person indicates at least one of a number and positional information of persons who satisfy a condition defined by using at least one of a pose, a movement, an age group, and gender.
  5. The information processing apparatus according to claim 1, wherein the at least one processor is further configured to execute the one or more instructions to: in a case where the detected person is plural, determine priority of rescue for a plurality of the detected persons, based on a pose, a movement, an age group, and gender, and output the priority.
  6. The information processing apparatus according to claim 1, wherein the at least one processor is further configured to execute the one or more instructions to: generate information relevant to an environment of the site; and output the information relevant to the environment of the site, wherein the information relevant to the environment of the site indicates at least one of a current temperature of the site, a trend of temporal change in temperature of the site, a current smoke state of the site, a trend of temporal change in smoke of the site, a current state of a plurality of pieces of equipment installed at the site, and a trend of temporal change in state of a plurality of pieces of equipment installed at the site.
  7. The information processing apparatus according to claim 6, wherein the information relevant to the environment of the site includes information relevant to a local environment of the site.
  8. The information processing apparatus according to claim 7, wherein the at least one processor is further configured to execute the one or more instructions to: in a case where the detected person is plural, determine priority of rescue for a plurality of the detected persons, based on the information relevant to the local environment of the site and positional information of the detected persons; and output the priority.
  9. The information processing apparatus according to claim 7, wherein the at least one processor is further configured to execute the one or more instructions to: compute an evacuation route, based on the information relevant to the local environment of the site and a map of the site; and output the evacuation route.
  10. The information processing apparatus according to claim 7, wherein the at least one processor is further configured to execute the one or more instructions to: generate danger level information indicating a danger level of each area in the site, based on the information relevant to the local environment of the site; and output the danger level information.
  11. The information processing apparatus according to claim 1, wherein the at least one processor is further configured to execute the one or more instructions to: determine a worker who rescues the detected person, based on positional information of each of a plurality of workers engaged in a rescue operation at the site and positional information of the detected person; and output information indicating the worker who rescues the detected person.
  12. The information processing apparatus according to claim 1, wherein the at least one processor is further configured to execute the one or more instructions to: collate a face image of the detected person with a face image of a person preliminarily registered in a database, and update safety information of the person registered in the database.
  13. The information processing apparatus according to claim 1, wherein the at least one processor is further configured to execute the one or more instructions to acquire the image captured by a worker terminal held by a worker engaged in a rescue operation at the site.
  14. An information processing method comprising, executing by a computer: acquiring an image captured at a site where a trouble happens; detecting a person from the image, and also generating information relevant to the detected person, based on the image; and outputting the information relevant to the detected person; detecting a rescue target person and an object in a periphery of the rescue target person from the image; generating information relevant to the rescue target person including a relative relationship between a worker and the object based on the image and a floor map of the site, the information indicating a position of the object by a direction and a distance based on a current position of the worker; and outputting the information relevant to the rescue target person.
  15. The information processing method according to claim 14, wherein the information relevant to the detected person includes positional information of the detected person.
  16. The information processing method according to claim 14, wherein the information relevant to the detected person indicates at least one of a number of persons in each of a plurality of poses, a number of persons doing each of a plurality of movements, a number of persons in each age group, and a number of persons of each gender.
  17. The information processing method according to claim 14, wherein the information relevant to the detected person indicates at least one of a number and positional information of persons who satisfy a condition defined by using at least one of a pose, a movement, an age group, and gender.
  18. A non-transitory storage medium storing a program causing a computer to: acquire an image captured at a site where a trouble happens; detect a person from the image, and also generate information relevant to the detected person, based on the image; output the information relevant to the detected person; detect a rescue target person and an object in a periphery of the rescue target person from the image; generate information relevant to the rescue target person including a relative relationship between a worker and the object based on the image and a floor map of the site, the information indicating a position of the object by a direction and a distance based on a current position of the worker; and output the information relevant to the rescue target person.
  19. The non-transitory storage medium according to claim 18, wherein the information relevant to the detected person includes positional information of the detected person.
  20. The non-transitory storage medium according to claim 18, wherein the information relevant to the detected person indicates at least one of a number of persons in each of a plurality of poses, a number of persons doing each of a plurality of movements, a number of persons in each age group, and a number of persons of each gender.
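The independent claims describe indicating an object's position "by a direction and a distance based on a current position of the worker". The following is a minimal sketch of that kind of computation, assuming map-aligned (x, y) coordinates in metres taken from a floor map; the function name, the 8-sector compass scheme, and the coordinate convention (north along +y, bearing clockwise from north) are illustrative assumptions, not details from the patent.

```python
import math

# Hypothetical directions for the claimed "relative relationship":
# express an object's position as a distance and a coarse compass
# direction from the worker's current position on the floor map.
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def relative_position(worker_xy, object_xy):
    """Return (direction sector, distance in metres) of object from worker."""
    dx = object_xy[0] - worker_xy[0]
    dy = object_xy[1] - worker_xy[1]
    distance = math.hypot(dx, dy)
    # Bearing measured clockwise from map north (+y axis), in [0, 360).
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    sector = DIRECTIONS[round(bearing / 45) % 8]
    return sector, distance

# e.g. an object 3 m east and 4 m north of the worker:
# relative_position((0, 0), (3, 4)) -> ("NE", 5.0)
```

A worker terminal could then render this as, for example, "extinguisher: 5 m to the northeast", which matches the claims' direction-plus-distance presentation.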

Description

This application is a National Stage Entry of PCT/JP2022/000862 filed on Jan. 13, 2022, the contents of all of which are incorporated herein by reference, in their entirety.

TECHNICAL FIELD

The present invention relates to an information processing apparatus, an information processing method, and a program.

BACKGROUND ART

Techniques relevant to the present invention are disclosed in Patent Documents 1 to 4 and Non-Patent Document 1.

Patent Document 1 discloses a technique for assisting a fire fighting operation and the like by a fire fighter by using an eyeglasses-type wearable terminal. Specifically, it is disclosed that a layout of a structure and the like, a vital sign, and the like are displayed via the eyeglasses-type wearable terminal.

Patent Document 2 discloses a technique for computing a feature value for each of a plurality of key points of a human body included in an image, searching, based on the computed feature value, for an image including a human body whose pose is similar or whose movement is similar, and classifying the pose and the movement by grouping similar poses and similar movements.

Patent Document 3 discloses a technique for assisting a rescue operation, based on positional information of a portable terminal carried by a person in need of rescue and by each rescue member.

Patent Document 4 discloses a technique for detecting an emergency in a public facility, a building, a vehicle, and a transportation network.

Non-Patent Document 1 discloses a technique relevant to skeleton estimation of a person.

RELATED DOCUMENT

Patent Document 1: Japanese Patent Application Publication (Translation of PCT Application) No. 2015-504616
Patent Document 2: International Patent Publication No. WO 2021/084677
Patent Document 3: Japanese Patent Application Publication No. 2018-142338
Patent Document 4: Japanese Patent Application Publication (Translation of PCT Application) No. 2019-534488

Non-Patent Document 1: Zhe Cao, Tomas Simon, Shih-En Wei, Yaser Sheikh, "Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields", The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017, pp. 7291-7299

DISCLOSURE OF THE INVENTION

Technical Problem

An example problem of the present invention is to provide a new technique for assisting a worker engaged in a rescue operation at a site where a trouble happens.

Solution to Problem

According to an example aspect of the present invention, an information processing apparatus is provided, including: an acquisition unit that acquires an image captured at a site where a trouble happens; an information generation unit that detects a person from the image, and also generates information relevant to the detected person, based on the image; and an output unit that outputs the information relevant to the detected person.

Further, according to an example aspect of the present invention, an information processing method is provided, including executing, by a computer: an acquisition step of acquiring an image captured at a site where a trouble happens; an information generation step of detecting a person from the image, and also generating information relevant to the detected person, based on the image; and an output step of outputting the information relevant to the detected person.

Further, according to an example aspect of the present invention, a program is provided, causing a computer to function as: an acquisition unit that acquires an image captured at a site where a trouble happens; an information generation unit that detects a person from the image, and also generates information relevant to the detected person, based on the image; and an output unit that outputs the information relevant to the detected person.
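The three units named in the solution above (acquisition unit, information generation unit, output unit) can be sketched structurally as follows. This is a minimal illustration only: the class and method names are assumptions, and the detector is a stub standing in for a real person-detection model.

```python
# Structural sketch of the claimed three-unit pipeline. The detection
# step is a placeholder callable; a real system would plug in an actual
# person detector (e.g. a pose-estimation or object-detection model).

class AcquisitionUnit:
    def acquire(self, source):
        """Return an image captured at the site (here: passed through)."""
        return source

class InformationGenerationUnit:
    def __init__(self, detector):
        self.detector = detector  # callable: image -> list of detections

    def generate(self, image):
        persons = self.detector(image)
        # "Information relevant to the detected person", e.g. a head count.
        return {"persons": persons, "count": len(persons)}

class OutputUnit:
    def output(self, info):
        return f"{info['count']} person(s) detected"

def run_pipeline(source, detector):
    image = AcquisitionUnit().acquire(source)
    info = InformationGenerationUnit(detector).generate(image)
    return OutputUnit().output(info)

# e.g. with a stub detector that "finds" two persons:
# run_pipeline("frame-001", lambda img: ["p1", "p2"])
# -> "2 person(s) detected"
```

The split into three objects mirrors the acquisition/generation/output separation of the example aspects; each unit can be replaced independently (for instance, swapping the output unit for one that pushes results to a worker terminal).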
ADVANTAGEOUS EFFECTS OF INVENTION

According to an example aspect of the present invention, a technique for assisting a worker engaged in a rescue operation at a site where a trouble happens is provided.

BRIEF DESCRIPTION OF THE DRAWINGS

The above-described object, other objects, features, and advantages are further clarified by the favorable example embodiments described below and the accompanying drawings.

FIG. 1 is a diagram illustrating one example of a functional block diagram of an assistance system.
FIG. 2 is a diagram illustrating one example of a hardware configuration of an apparatus.
FIG. 3 is a diagram illustrating one example of a functional block diagram of an information processing apparatus.
FIG. 4 is a diagram for describing processing by an information generation unit.
FIG. 5 is a flowchart illustrating one example of a flow of processing by the information processing apparatus.
FIG. 6 is a diagram illustrating another example of the functional block diagram of the assistance system.
FIG. 7 is a diagram schematically illustrating one example of information output from the information processing apparatus.
FIG. 8 is a diagram schematically illustrating another ex