
JP-7855873-B2 - Delivery drones and delivery methods using delivery drones

JP 7855873 B2

Inventors

  • 恩田 正宏

Assignees

  • JVCKENWOOD Corporation (株式会社JVCケンウッド)

Dates

Publication Date
2026-05-11
Application Date
2022-02-25

Claims (6)

  1. A delivery drone for delivering an item, comprising: an imaging unit that captures video of the surroundings of the drone; a face detection unit that, when the drone reaches a predetermined range from the delivery location of the item, detects a human face from the video captured by the imaging unit and from video obtained from the person who ordered the item; a face determination unit that compares the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determines whether or not they match; and a release position determination unit that, when the face determination unit determines that the face of the person detected from the video captured by the imaging unit matches the face of the person detected from the video obtained from the person who ordered the item, determines the position of the face of the person detected from the video captured by the imaging unit as the release position of the item, wherein the face detection unit detects human faces from the video captured by the imaging unit and from the video obtained from the person who ordered the item, and also detects the facial expression of each detected face, the face determination unit compares the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determines whether both the faces and the facial expressions match, and the release position determination unit determines the position of the face of the person detected from the video captured by the imaging unit as the release position of the item when the face determination unit determines that the face and facial expression of the person detected from the video captured by the imaging unit match the face and facial expression of the person detected from the video obtained from the person who ordered the item.
  2. A delivery drone for delivering an item, comprising: an imaging unit that captures video of the surroundings of the drone; a face detection unit that, when the drone reaches a predetermined range from the delivery location of the item, detects a human face from the video captured by the imaging unit and from video obtained from the person who ordered the item; a face determination unit that compares the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determines whether or not they match; and a release position determination unit that, when the face determination unit determines that the faces match, determines the position of the face of the person detected from the video captured by the imaging unit as the release position of the item, wherein the face determination unit compares the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determines whether the faces match and whether the changes in facial expression match, and the release position determination unit determines the position of the face of the person detected from the video captured by the imaging unit as the release position of the item when the face determination unit determines that the face and the changes in facial expression of the person detected from the video captured by the imaging unit match those of the person detected from the video obtained from the person who ordered the item.
  3. The delivery drone according to claim 1 or 2, wherein the face determination unit compares the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determines whether they match, when the time at which the video obtained from the person who ordered the item was captured is the same as, or within a predetermined period of, the time at which the video captured by the imaging unit was captured.
  4. The delivery drone according to any one of claims 1 to 3, further comprising: a flight control unit that controls the flight of the drone; and a release control unit that controls the release of the item, wherein the flight control unit flies the drone to the release position determined by the release position determination unit, and the release control unit releases the item at the release position.
  5. A delivery method using a delivery drone, implemented by a goods delivery system, the method comprising: a face detection step of, when the drone reaches a predetermined range from the delivery location of an item, detecting a human face from video captured by an imaging unit that captures video of the surroundings of the drone and from video obtained from the person who ordered the item; a face determination step of comparing the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determining whether or not they match; and a release position determination step of, when it is determined that the faces match, determining the position of the face of the person detected from the video captured by the imaging unit as the release position of the item, wherein in the face detection step, a human face is detected from the video captured by the imaging unit and from the video obtained from the person who ordered the item, and the facial expression of each detected face is also detected, in the face determination step, it is determined whether both the faces and the facial expressions match, and in the release position determination step, the position of the face of the person detected from the video captured by the imaging unit is determined as the release position of the item when it is determined in the face determination step that the face and facial expression of the person detected from the video captured by the imaging unit match the face and facial expression of the person detected from the video obtained from the person who ordered the item.
  6. A delivery method using a delivery drone, implemented by a goods delivery system, the method comprising: a face detection step of, when the drone reaches a predetermined range from the delivery location of an item, detecting a human face from video captured by an imaging unit that captures video of the surroundings of the drone and from video obtained from the person who ordered the item; a face determination step of comparing the face of the person detected from the video captured by the imaging unit with the face of the person detected from the video obtained from the person who ordered the item, and determining whether or not they match; and a release position determination step of, when it is determined that the faces match, determining the position of the face of the person detected from the video captured by the imaging unit as the release position of the item, wherein in the face determination step, it is determined whether the faces match and whether the changes in facial expression match, and in the release position determination step, the position of the face of the person detected from the video captured by the imaging unit is determined as the release position of the item when it is determined in the face determination step that the changes in the face and facial expression of the person detected from the video captured by the imaging unit match the changes in the face and facial expression of the person detected from the video obtained from the person who ordered the item.
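The core logic recited in claims 1 and 5 (a face match combined with a facial-expression match gating the release position) can be illustrated with a minimal sketch. Everything below is hypothetical and not taken from the patent: the class names, the use of an embedding distance, and the 0.6 threshold are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DetectedFace:
    embedding: Tuple[float, ...]   # face feature vector (hypothetical representation)
    expression: str                # e.g. "neutral", "smile"
    position: Tuple[float, float]  # pixel coordinates in the drone's video frame

def faces_match(a: DetectedFace, b: DetectedFace, threshold: float = 0.6) -> bool:
    """Claim 1's condition: faces are close in feature space AND expressions agree."""
    dist = sum((x - y) ** 2 for x, y in zip(a.embedding, b.embedding)) ** 0.5
    return dist < threshold and a.expression == b.expression

def release_position(camera_face: DetectedFace,
                     orderer_face: DetectedFace) -> Optional[Tuple[float, float]]:
    """Release position determination unit: return the position of the face
    detected in the drone's video only when face and expression both match."""
    if faces_match(camera_face, orderer_face):
        return camera_face.position
    return None
```

Claims 2 and 6 would differ only in that the comparison tracks *changes* in expression over time rather than a single expression per frame.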

Description

The present invention relates to a delivery drone and a delivery method using a delivery drone.

A technology for generating landing candidate location information from images captured by unmanned aerial vehicles, so-called drones, has been disclosed (see, for example, Patent Document 1). A technology for identifying a destination by comparing captured images with destination images has also been disclosed (see, for example, Patent Document 2).

Patent Document 1: Japanese Patent Publication No. 2020-057225
Patent Document 2: International Publication No. 2020/012632

Figure 1 is a schematic diagram of the goods delivery system according to the first embodiment.
Figure 2 is a schematic diagram of the drone.
Figure 3 is a block diagram showing an example configuration of a drone according to the first embodiment.
Figure 4 is a block diagram showing an example of the configuration of an item delivery control device.
Figure 5 is a block diagram showing an example of the configuration of a terminal device.
Figure 6 shows an example of the processing flow in a drone for an item delivery system according to the first embodiment.
Figure 7 shows an example of the processing flow in a drone for an item delivery system according to the second embodiment.
Figure 8 is a block diagram showing an example configuration of a drone according to the second embodiment.
Figure 9 shows an example of the processing flow in a drone for an item delivery system according to the third embodiment.
Figure 10 shows an example of the processing flow in a drone for an item delivery system according to the fourth embodiment.
Figure 11 shows an example of the processing flow in a drone for an item delivery system according to the fifth embodiment.

The following describes in detail embodiments of the delivery drone (hereinafter referred to as "drone") according to the present invention with reference to the attached drawings. However, the present invention is not limited to the following embodiments.
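Each claim is triggered "when the drone reaches a predetermined range from the delivery location of the item". As a purely illustrative sketch of such a trigger (the 50 m radius, the function name, and the use of a haversine distance over GNSS coordinates are assumptions, not specified by the patent), a range check might look like:

```python
import math

def within_delivery_range(drone_lat: float, drone_lon: float,
                          dest_lat: float, dest_lon: float,
                          radius_m: float = 50.0) -> bool:
    """Haversine great-circle distance check: returns True once the drone is
    within `radius_m` of the delivery location (the claims' 'predetermined
    range'), at which point face detection would begin."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(drone_lat), math.radians(dest_lat)
    dp = math.radians(dest_lat - drone_lat)
    dl = math.radians(dest_lon - drone_lon)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m
```

Once this check passes, the face detection unit would start scanning the imaging unit's video for the orderer's face, per the claims above.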
[First Embodiment] (Goods delivery system) Figure 1 is a schematic diagram of the goods delivery system 1 according to the first embodiment. The goods delivery system 1 is a system that delivers goods using a drone 10. The goods delivery system 1 includes a drone 10 and a goods delivery control device 50. In this embodiment, the goods delivery system 1 comprises a drone 10, a goods delivery control device 50, and a terminal device 70. The drone 10, the goods delivery control device 50, and the terminal device 70 can communicate information with one another via a network.

(Drone) Figure 2 is a schematic diagram of the drone 10. The drone 10 includes a drive unit 15 for flight and a release mechanism 16 for releasing an item from a held position. Figure 2 shows the state in which the item 100 is being held. The drone 10 is an unmanned aerial vehicle that delivers an item to a destination, using the destination location information of the item as the destination. The drone 10 may, for example, fly autonomously to the destination, may fly to the destination under the control of an operator via a remote controller, or may fly to the destination by combining autonomous flight and operator control. In the following description, the drone 10 is described as flying autonomously to the destination. The drone 10 flies to its destination with the goods ordered by the customer loaded onto its body. The goods are held, for example, by an arm extending from the body; the method by which the drone 10 holds the goods is not limited to this, and any known method may be used. Upon reaching the destination, the drone 10 determines the item handover location and performs control to release the goods either above the handover location or after landing at the handover location. The goods may be transported by the drone 10 from, for example, the point of origin or a delivery company's office to the destination.
Alternatively, the goods may be transported by vehicle from, for example, the delivery company's office to a point a few kilometers to a few tens of meters short of the destination, and then transported by the drone 10 from there to the destination.

Figure 3 is a block diagram showing an example configuration of the drone 10. The drone 10 comprises a GNSS (Global Navigation Satellite System) receiving unit 11, a sensor 12, a camera (imaging unit) 13, a drive unit 15, a release mechanism 16, a communication unit 19, and a control unit 20. The drone 10 can communicate information with the goods delivery control device 50 via a network using the communication unit 19, which is controlled by the communication control unit 29. The GNSS receiving unit 11 receives GNSS signals from GNSS satellites and outputs the received GNSS signals to the position information acquisition unit 21 of the control unit 20. The GNSS receiving unit 11 is composed of, for example, a GNSS receiving circuit and an antenna capable of receiving GNSS signals. The sensor 12 is a sensor that detects objects in the vicinity of the drone 10. Sensor 12 is