
EP-3695385-B1 - LIGHTING INTEGRATION

EP 3695385 B1

Inventors

  • GEISSLER, MICHAEL PAUL ALEXANDER
  • PARSLEY, Martin Peter

Dates

Publication Date
2026-05-06
Application Date
2018-10-12

Claims (8)

  1. A method for forming augmented image data, the method comprising: forming a primary image feed showing a subject illuminated by a lighting unit; estimating the location of the lighting unit, wherein the lighting unit carries an imaging device and the location of the lighting unit is estimated by receiving a series of images of the environment captured by the imaging device and forming, in dependence on the images, an estimate of the lighting unit's position in the environment; detecting in the images captured by the imaging device a representation of each of a plurality of indicia located in the environment; determining locations of each of the plurality of indicia as represented in the images and forming the said estimate of position by comparing the locations of representations of the indicia in images captured at different times; receiving overlay data defining an overlay of three-dimensional appearance; rendering the overlay data in dependence on the estimated location to form an augmentation image feed; and overlaying the augmentation image feed on the primary image feed to form a secondary image feed.
  2. A method as claimed in claim 1, comprising detecting the representation of each of the indicia in the image as a relatively high brightness region of the image.
  3. A method as claimed in any preceding claim, comprising disposing the plurality of indicia in an irregular pattern in the environment.
  4. A method as claimed in any preceding claim, wherein the indicia are retroreflective.
  5. A method as claimed in claim 3 or 4, wherein the indicia are identical.
  6. A method as claimed in any of claims 3 to 5, wherein the indicia are located on a downwards-facing surface of the environment.
  7. A method as claimed in any preceding claim, wherein the step of rendering the overlay data is performed so that, in the secondary image feed, the angle from which regions derived from the augmentation image appear to be lit matches the angle from which regions derived from the primary image feed are lit by the lighting unit.
  8. Apparatus configured to perform the steps of any preceding claim.
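Claim 2 specifies detecting each indicium as a relatively high brightness region of the image, which suits retroreflective markers (claim 4). As a rough, hypothetical sketch only, and not the patented implementation, bright-region detection over a greyscale frame could look like the following; the threshold value and the use of 4-connected flood fill are illustrative assumptions:

```python
import numpy as np
from collections import deque

def detect_bright_indicia(frame, threshold=200.0):
    """Return (row, col) centroids of connected bright regions in a
    greyscale frame. Retroreflective markers lit from near the camera
    axis appear far brighter than their surroundings, so a simple
    fixed threshold is often enough to isolate them."""
    mask = frame >= threshold
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                # Flood-fill one connected bright region (4-connectivity)
                queue, pixels = deque([(r, c)]), []
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Synthetic 100x100 frame containing two bright square "markers"
frame = np.zeros((100, 100))
frame[10:14, 20:24] = 255.0
frame[60:64, 70:74] = 255.0
print(detect_bright_indicia(frame))  # [(11.5, 21.5), (61.5, 71.5)]
```

Tracking the resulting centroids across frames captured at different times is what the position estimate of claim 1 is compared against.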

Description

This invention relates to harmonising the appearance of lighting in, for example, video production.

It is known to film actors, presenters, animated models and other physical objects against a background of a known colour. Conventionally that background is a green screen. The resulting video feed can then be edited to replace the green background with other artefacts such as scenery, computer-generated animation or statistical charts, creating an augmented video feed. When the overlain artefacts have a three-dimensional appearance, if they are lit differently from the presenters then that can reduce the realism of the final product.

US 2007/248283 describes a system for producing a virtual scene combining live video enhanced by other imagery, including computer-generated imagery. The system includes a scene camera with an attached tracking camera, the tracking camera viewing a tracking marker pattern which has a plurality of tracking markers with identifying indicia. The tracking marker pattern is positioned proximate so that, when viewed by the tracking camera, the coordinate position of the scene camera can be determined in real time.

US 6 556 722 B1 describes that the position of an object, for example a studio camera, is determined by means of a camera which views several markers disposed about a studio ceiling, the markers being patterned as a series of light and dark rings to encode information in binary form, enabling the markers to be identified as the camera moves about the studio. Methods and apparatus of more general applicability are also disclosed.
WO 2014/132090 A2 describes an optical navigation system comprising a camera oriented to face towards a plurality of markers located at spaced-apart locations from the camera, a calculating means adapted to calculate an angle (θ, Φ) subtended between pairs of markers, the subtended angles (θ, Φ) being calculated by monitoring the pixel locations of the markers in a series of images captured by the camera, the optical navigation system additionally comprising means for creating a three-dimensional model whereby the location of the camera relative to the markers is determined by triangulating the subtended angles (θ, Φ) in the three-dimensional model.

It would be desirable to be able to produce realistic augmented video or still photographic products more easily. According to the present invention there is provided a method/apparatus as set out in the accompanying claims.

The present invention will now be described by way of example with reference to the accompanying drawings. In the drawings:

Figure 1 shows a system for implementing a video overlay.
Figure 2 shows examples of indicia.
Figure 3 shows a pattern of indicia in an environment and frames captured by an imaging device such as a camera.
Figure 4 shows a frame of captured video.
Figure 5 shows a frame of augmented video.

The system of figure 1 comprises a studio 1 having a background 2 of a known colour or pattern. In this case the background is a green screen. A subject 3, which in this case is a presenter, is located in front of the green screen. A camera 4 is located so as to photograph the subject 3 against the background 2. In this case the camera is a video camera which generates a video feed of its field of view, but it could be a still camera which generates a photograph of the field of view. The camera is located so that its field of view includes the subject and at least some of the background. The camera may be hand-held, floor-mounted or supported in any other convenient way.
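The subtended-angle calculation quoted above from WO 2014/132090 A2 can be illustrated with a simple pinhole-camera sketch. This is a hypothetical example, not the disclosed implementation; the focal length in pixels and the principal point are assumed to be known from calibration:

```python
import math

def subtended_angle(p1, p2, f_px, cx, cy):
    """Angle (radians) between the rays to two markers seen at pixel
    coordinates p1 and p2, for a pinhole camera with focal length f_px
    (in pixels) and principal point (cx, cy)."""
    def ray(p):
        # Back-project a pixel to a unit direction vector in camera space
        x, y = p[0] - cx, p[1] - cy
        v = (x, y, f_px)
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)
    r1, r2 = ray(p1), ray(p2)
    dot = sum(a * b for a, b in zip(r1, r2))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp for rounding safety

# Two markers symmetric about the principal point, 1000 px focal length:
theta = subtended_angle((400, 300), (800, 300), f_px=1000, cx=600, cy=300)
print(math.degrees(theta))  # ≈ 22.62° (each ray is ≈ 11.31° off-axis)
```

With such angles measured for several marker pairs whose 3D positions are known, the camera location can be recovered by triangulation, which is the step the cited system performs in its three-dimensional model.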
Conveniently, the camera is movable within the studio.

A lighting unit 5 is located in the studio. The lighting unit is positioned to illuminate the subject. The lighting unit could be implemented in any convenient way. It could be floor-mounted, ceiling-mounted or hand-held. Conveniently, the lighting unit is movable within the studio. The camera and/or the lighting unit could move while filming is taking place. There could be multiple cameras, each generating a respective video feed. There could be multiple lighting units illuminating the subject simultaneously.

When filming is taking place, the lighting unit is positioned to illuminate the subject from the desired angle and the camera is positioned to film the subject from the desired angle. Then the subject performs in the appropriate manner while being filmed by the camera. The camera forms a video feed of the performance. In the video feed the subject is seen against the known background 2.

The camera 4 and the lighting unit 5 are provided with mechanisms to allow their positions in the studio to be estimated. Those mechanisms could be radio location systems, e.g. using time of flight between the camera/lighting unit and beacons in known locations to trilaterate the locations of the camera and the lighting unit. More preferably, the camera and the lighting unit are provided with positioning systems as described in EP 2 962 284. Indicia 10 are applied to ob