
EP-4742218-A1 - FLIGHT PATH GENERATION PLATFORM, UAV AND METHOD ADAPTED FOR SUPPORTING AERIAL LIVE VIDEO BROADCASTING

EP 4742218 A1

Abstract

Flight path generation platform and unmanned aerial vehicle (UAV). The platform generates and displays a visual representation of a three-dimensional (3D) model of an environment on a display. The platform generates a visual representation of a flight path of the UAV superimposed on the visual representation of the 3D model of the environment. The platform generates the flight path of the UAV based on the visual representation of the flight path. The flight path is loaded and stored in a memory of the UAV. The UAV processes the flight path to generate control commands sent to flying components of the UAV, the control commands controlling operations of the flying components so that the UAV follows a trajectory according to the flight path. The UAV transmits a video generated by a video camera of the UAV via a wireless communication interface of the UAV.

Inventors

  • GUFFARTH, Daniel
  • KRUPA, Miroslav

Assignees

  • ZEITGEIST VENTURES GmbH

Dates

Publication Date
2026-05-13
Application Date
2025-11-10

Claims (15)

  1. A flight path generation platform (100), comprising: a display (150); a user interface (140); memory (120) for storing a three-dimensional (3D) model of an environment, the environment being located in a geographical area and comprising a plurality of elements located in the geographical area; and a processing unit (110) for: generating a visual representation of the 3D model of the environment (152); displaying the visual representation of the 3D model of the environment (152) on the display (150); generating a visual representation of a flight path (154) of an unmanned aerial vehicle (UAV) on the display (150) through interactions of a user with the user interface (140), the visual representation of the flight path (154) being superimposed on the visual representation of the 3D model of the environment (152); and generating the flight path of the UAV based on the visual representation of the flight path (154), the flight path of the UAV comprising a plurality of way points, each way point having 3D coordinates.
  2. The platform (100) of claim 1, wherein the 3D model of the environment comprises a 3D model of the elements of the environment and the visual representation of the 3D model of the environment (152) comprises a visual representation of the 3D model of the elements of the environment.
  3. The platform (100) of claim 1, wherein a determination of the 3D coordinates of the way points is based on 3D coordinates of the 3D model of the environment, taking into consideration the relative position on the display (150) of the visual representation of the flight path (154) with respect to the visual representation of the 3D model of the environment (152).
  4. The platform (100) of claim 1, wherein the 3D coordinates of each way point consist of geospatial coordinates, the geospatial coordinates comprising absolute Global Positioning System (GPS) coordinates and altitude.
  5. The platform (100) of claim 1, wherein the 3D model of the environment further comprises fly restriction data for enforcing at least one of a no-fly zone and a regulated fly zone, the generation of the flight path of the UAV taking into consideration the fly restriction data.
  6. The platform (100) of claim 5, wherein the fly restriction data comprise a two-dimensional (2D) or a 3D model of the at least one of the no-fly zone and the regulated fly zone.
  7. The platform (100) of claim 1, wherein the processing unit (110) further generates a geofence for at least some of the way points of the flight path, the geofence defining a 3D perimeter within which the UAV is authorized to be located for a given waypoint of the flight path.
  8. An unmanned aerial vehicle (UAV) (200) comprising: memory (220) for storing a flight path comprising a plurality of way points having 3D coordinates; a video camera (270) adapted to perform aerial live video broadcasting; a wireless communication interface (230); flying components (260); and a processing unit (210) for: processing the flight path to generate control commands sent to the flying components (260), the control commands controlling operations of the flying components (260) so that the UAV (200) follows a trajectory according to the flight path; and transmitting a video generated by the video camera (270) via the wireless communication interface (230).
  9. The UAV (200) of claim 8, wherein the processing unit (210) generates a timestamp for each way point of the flight path, each timestamp being generated based on a predetermined start time and a predetermined speed of the UAV (200); and the control commands sent to the flying components (260) comprise an acceleration of the UAV (200) in one or more directions, the acceleration of the UAV (200) in the one or more directions being calculated by the processing unit (210) so that, the UAV (200) having reached a given waypoint of the flight path at a time compliant with the associated timestamp, the UAV (200) reaches the next waypoint of the flight path at a future time compliant with the timestamp associated with the next waypoint.
  10. The UAV (200) of claim 8, wherein a geofence is associated with at least some of the way points, and wherein the processing unit (210) determines a modification of the trajectory of the UAV (200) compliant with the geofence and applies the modification of the trajectory to the flying components (260).
  11. The UAV (200) of claim 10, wherein the processing unit (210) receives one or more commands via the wireless communication interface (230) for modifying the trajectory of the UAV (200), the processing unit (210) determining based on the one or more commands the modification of the trajectory of the UAV (200) compliant with the geofence.
  12. The UAV (200) of claim 8, wherein the processing unit (210) determines a modification of at least one operating parameter of the video camera (270) and applies the modification of the at least one operating parameter to the video camera (270), the modification being determined based on at least one of the following: predetermined information stored in the memory (220), a command received via the wireless communication interface (230), and execution by the processing unit (210) of an algorithm for automatically controlling the video camera (270).
  13. The UAV (200) of claim 8, wherein the processing unit (210) receives one or more commands via the wireless communication interface (230) for modifying operating conditions of the video camera (270), and the processing unit (210) performs at least one of the following: determining a modification of at least one operating parameter of the video camera (270) to modify the operating conditions of the camera (270) according to the one or more commands, and applying the modification of the at least one operating parameter to the video camera (270); and determining a modification of the trajectory of the UAV (200) to modify the operating conditions of the video camera (270) according to the one or more commands, and applying the modification of the trajectory to the flying components (260).
  14. The UAV (200) of claim 8, wherein the processing unit (210) executes a tracking algorithm to follow one or more targets with the video camera (270), the tracking algorithm automatically determining at least one of the following based on an analysis of the video generated by the video camera (270): a modification of at least one operating parameter of the video camera (270), the modification of the at least one operating parameter being applied by the processing unit (210) to the video camera (270); and a modification of the trajectory of the UAV (200), the modification to the trajectory being applied by the processing unit (210) to the flying components (260).
  15. A method (300) for providing aerial live broadcasting from an unmanned aerial vehicle (UAV) (200), the method comprising: storing (305) in a memory (120) of a flight path generation platform (100) a three-dimensional (3D) model of an environment, the environment being located in a geographical area and comprising a plurality of elements located in the geographical area; generating (310) by a processing unit (110) of the platform (100) a visual representation of the 3D model of the environment (152); displaying (315) by the processing unit (110) of the platform (100) the visual representation of the 3D model of the environment (152) on a display (150) of the platform (100); generating (320) by the processing unit (110) of the platform (100) a visual representation of a flight path (154) of the UAV (200) through interactions of a user with a user interface (140) of the platform (100); displaying (325) by the processing unit (110) of the platform (100) the visual representation of the flight path (154) of the UAV (200) on the display (150) of the platform (100), the visual representation of the flight path (154) of the UAV (200) being superimposed on the visual representation of the 3D model of the environment (152); generating (330) by the processing unit (110) of the platform (100) the flight path of the UAV (200) based on the visual representation of the flight path (154) of the UAV (200), the flight path of the UAV (200) comprising a plurality of way points, each way point having 3D coordinates; storing (335) the flight path in a memory (220) of the UAV (200); processing (340) by a processing unit (210) of the UAV (200) the flight path to generate control commands sent to flying components (260) of the UAV (200), the control commands controlling operations of the flying components (260) so that the UAV (200) follows a trajectory according to the flight path; transmitting (345) a video generated by a video camera (270) of the UAV (200) via a wireless communication interface (230) of the UAV (200); and performing (350) at least one of: determining by the processing unit (210) of the UAV (200) a modification of the trajectory of the UAV (200) compliant with a geofence associated with at least some of the way points and applying the modification of the trajectory to the flying components (260) of the UAV (200); and determining by the processing unit (210) of the UAV (200) a modification of at least one operating parameter of the video camera (270) of the UAV (200) and applying the modification of the at least one operating parameter to the video camera (270) of the UAV (200).
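Claim 3 determines each waypoint's 3D coordinates from where its on-screen representation sits relative to the displayed 3D model. The patent does not disclose an algorithm for this; one common way to realize it is to cast a ray from the virtual camera through the clicked pixel and intersect it with the environment model. The sketch below is purely illustrative (all names are hypothetical) and simplifies the environment to a flat ground plane at a known elevation:

```python
# Hypothetical sketch: turn a point on the display into 3D waypoint
# coordinates by intersecting a camera ray with the environment,
# simplified here to the ground plane z = ground_z.

def ray_ground_intersection(ray_origin, ray_dir, ground_z=0.0):
    """Return the (x, y, z) point where the ray meets the plane
    z = ground_z, or None if no forward intersection exists."""
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_dir
    if abs(dz) < 1e-9:
        return None  # ray is parallel to the ground plane
    t = (ground_z - oz) / dz
    if t <= 0:
        return None  # intersection lies behind the camera
    return (ox + t * dx, oy + t * dy, ground_z)

# Virtual camera 100 m above the ground, ray tilted forward.
waypoint = ray_ground_intersection((0.0, 0.0, 100.0), (0.5, 0.0, -1.0))
# → (50.0, 0.0, 0.0)
```

A full implementation would unproject the pixel through the inverse view-projection matrix and intersect the ray with the terrain mesh of the 3D model rather than a single plane.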
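The geofence of claims 7 and 10 defines, per waypoint, a 3D perimeter within which the UAV is authorized to be. One simple way to enforce such a perimeter (the patent does not specify its shape or the enforcement method) is to model it as an axis-aligned box and clamp any requested position back inside before generating control commands. A minimal sketch, with all names hypothetical:

```python
# Hypothetical sketch of per-waypoint geofence enforcement: the fence is
# modelled as an axis-aligned box [fence_min, fence_max] and a commanded
# 3D position is clamped component-wise back into the box.

def clamp_to_geofence(position, fence_min, fence_max):
    """Clamp a 3D position into the box [fence_min, fence_max]."""
    return tuple(
        min(max(p, lo), hi)
        for p, lo, hi in zip(position, fence_min, fence_max)
    )

fence_min = (10.0, 10.0, 30.0)   # metres, local frame (assumed)
fence_max = (50.0, 50.0, 120.0)
# A commanded position outside the fence is pulled back to its boundary.
safe = clamp_to_geofence((70.0, 20.0, 25.0), fence_min, fence_max)
# → (50.0, 20.0, 30.0)
```

Under claim 11, the same clamp would be applied to operator commands received over the wireless interface, so that manual trajectory changes can never carry the UAV outside the authorized perimeter.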
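Claim 9 derives a timestamp for each waypoint from a predetermined start time and a predetermined speed. One consistent reading (not spelled out in the claim) is that each timestamp is the start time plus the cumulative straight-line distance along the path divided by the constant speed; the UAV then accelerates or decelerates so that it reaches each waypoint on schedule. An illustrative sketch:

```python
import math

# Hypothetical sketch of claim 9's timestamping: one timestamp per
# waypoint, computed from a predetermined start time and a constant
# predetermined speed over the straight segments of the flight path.

def waypoint_timestamps(waypoints, start_time, speed):
    """Return one timestamp (in seconds) per waypoint, assuming the UAV
    flies each straight segment between waypoints at constant speed."""
    times = [start_time]
    for a, b in zip(waypoints, waypoints[1:]):
        segment_length = math.dist(a, b)          # metres
        times.append(times[-1] + segment_length / speed)
    return times

# 3-4-5 triangle segment (50 m) followed by a 30 m climb, at 10 m/s.
path = [(0.0, 0.0, 50.0), (30.0, 40.0, 50.0), (30.0, 40.0, 80.0)]
stamps = waypoint_timestamps(path, start_time=0.0, speed=10.0)
# → [0.0, 5.0, 8.0]
```

The control loop of claim 9 would then compare the actual arrival time at each waypoint with its timestamp and compute an acceleration that makes the next waypoint reachable at its own timestamp.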

Description

TECHNICAL FIELD

The present disclosure relates to the field of aerial live video broadcasting. More specifically, the present disclosure relates to a flight path generation platform, unmanned aerial vehicle (UAV) and method adapted for supporting aerial live video broadcasting.

BACKGROUND

A camera dolly is used for performing live video broadcasting of certain types of events, like car racing. Tracks are installed at a location allowing good video coverage of the event. The dolly moves on the tracks and carries a video camera. For example, the tracks are installed along a racing track at a racing circuit. The movement of the dolly on the tracks allows it to follow a target (e.g. a car or a group of cars) for a certain amount of time with the video camera carried by the dolly.

Although the camera dolly provides a significant improvement compared with a fixed video camera, the movement of the video camera remains limited, even for an actuated video camera having a degree of freedom of movement when mounted on the dolly. For example, the dolly (and the video camera) moves along a straight line. For each point of the straight line, the video camera has the capability to be rotated along a vertical axis.

The development of drone technology has been very fast in recent years, making this technology available and affordable for more and more fields of activity. In particular, it is now common to have a video camera embedded in a drone wirelessly transmitting (video) aerial views to a drone control device operated by a user of the drone. The use of a drone having live video broadcasting capabilities appears to be a good solution to overcome the limitations in terms of freedom of movement of a camera mounted on a dolly.

There is therefore a need for a new flight path generation platform, UAV and method adapted for supporting aerial live video broadcasting.

SUMMARY

According to a first aspect, the present disclosure relates to a flight path generation platform.
The platform comprises a display, a user interface, memory and a processing unit. The memory stores a three-dimensional (3D) model of an environment. The environment is located in a geographical area and comprises a plurality of elements located in the geographical area. The processing unit generates a visual representation of the 3D model of the environment. The processing unit displays the visual representation of the 3D model of the environment on the display. The processing unit generates a visual representation of a flight path of an unmanned aerial vehicle (UAV) on the display through interactions of a user with the user interface. The visual representation of the flight path is superimposed on the visual representation of the 3D model of the environment. The processing unit generates the flight path of the UAV based on the visual representation of the flight path, the flight path of the UAV comprising a plurality of way points, each way point having 3D coordinates.

According to a particular aspect, the 3D model of the environment comprises a 3D model of the elements of the environment and the visual representation of the 3D model of the environment comprises a visual representation of the 3D model of the elements of the environment.

According to another particular aspect, the environment is a car racing circuit and the elements of the environment comprise at least one of the following: a racing track, areas where spectators are located, paddocks, other buildings, and grass areas.

According to still another particular aspect, a determination of the 3D coordinates of the way points is based on 3D coordinates of the 3D model of the environment, taking into consideration the relative position on the display of the visual representation of the flight path with respect to the visual representation of the 3D model of the environment.

According to yet another particular aspect, the 3D coordinates of each way point consist of geospatial coordinates.
In a particular embodiment, the geospatial coordinates comprise absolute Global Positioning System (GPS) coordinates and altitude.

According to another particular aspect, the 3D model of the environment further comprises fly restriction data for enforcing at least one of a no-fly zone and a regulated fly zone, the generation of the flight path of the UAV taking into consideration the fly restriction data. In a particular embodiment, the fly restriction data comprise a two-dimensional (2D) or a 3D model of the at least one of the no-fly zone and the regulated fly zone.

According to still another particular aspect, the processing unit further generates a geofence for at least some of the way points of the flight path. The geofence defines a 3D perimeter within which the UAV is authorized to be located for a given waypoint of the flight path.

According to yet another particular aspect, the processing unit uses the flight path and the 3D model of the environment to generate a simulated view of the environment fr