
EP-4474713-B1 - COOKING APPARATUS AND METHOD FOR CONTROLLING COOKING APPARATUS

EP 4474713 B1

Inventors

  • KA, KEEHWAN
  • HAN, Seongjoo
  • CHOI, YOONHEE

Dates

Publication Date
2026-05-06
Application Date
2023-05-26

Claims (15)

  1. A cooking apparatus comprising: a chamber (50); a camera (60) configured to photograph a cooking object placed in the chamber; and a controller (200) configured to be electrically connected to the camera, wherein the controller comprises a memory (220) that stores a sound matching table (900) including sound information that matches with a property of the cooking object and a change in a state of the cooking object, wherein the controller is configured to: control the camera to acquire a plurality of image frames of the cooking object placed in the chamber while cooking of the cooking object is being performed; identify the cooking object from the acquired plurality of image frames and estimate a property of the cooking object; identify a change in a state of the cooking object in the chamber from the acquired plurality of image frames; extract a portion of image frames among the acquired plurality of image frames based on the change in the state of the cooking object; and add a sound corresponding to the change in the state of the cooking object to the extracted portion of image frames to generate a summary video of a cooking process, and wherein the controller is further configured to use the sound matching table to determine the sound to be added to the extracted portion of image frames.
  2. The cooking apparatus of claim 1, wherein the controller (200) is configured to: generate a raw video based on the acquired plurality of image frames; divide the generated raw video into a plurality of sections (S1, S2, S3, S4, S5) based on the change in the state of the cooking object; and extract at least one image frame from each of the divided plurality of sections to acquire the extracted portion of image frames.
  3. The cooking apparatus of claim 2, wherein the controller (200) is configured to: compare a reference image frame with each of the acquired plurality of image frames to determine an amount of the change in the state of the cooking object in each of the acquired plurality of image frames; detect at least one of a phase-change section in which a slope of the amount of the change in the state of the cooking object is greater than or equal to a predetermined threshold value and a phase-holding section in which a slope of the amount of the change in the state of the cooking object is less than the threshold value; and divide the raw video into the plurality of sections based on the at least one of the phase-change section and the phase-holding section.
  4. The cooking apparatus of claim 2, wherein the controller (200) is configured to, based on the change in the state of the cooking object being different in each of the divided plurality of sections, add a different sound to each of the divided plurality of sections.
  5. The cooking apparatus of claim 4, wherein the controller (200) is configured to use the sound matching table to determine the sound to be added to each of the divided plurality of sections.
  6. The cooking apparatus of claim 1, wherein the controller (200) is configured to: based on a plurality of cooking objects being identified in the chamber, identify properties of the plurality of cooking objects and changes in states of the plurality of cooking objects; synthesize a plurality of sounds respectively matching with the properties of the plurality of cooking objects and the changes in the states of the plurality of cooking objects to generate a harmonious sound; and add the harmonious sound to the extracted portion of image frames.
  7. The cooking apparatus of claim 6, wherein the controller (200) is configured to determine a volume of each of the plurality of sounds to be different based on the property of each of the plurality of cooking objects.
  8. The cooking apparatus of claim 1, wherein the controller (200) is configured to insert metadata including time information, cooking object state information and sound information into each of the image frames in the extracted portion.
  9. The cooking apparatus of claim 1, wherein the controller (200) is configured to detect at least one of: a change in a size of the cooking object; a change in a form of the cooking object; a change in a colour of the cooking object; and a change in a texture of the cooking object to identify the change in the state of the cooking object.
  10. The cooking apparatus of claim 1, further comprising a communication circuit (100) configured to communicate with a server (3), wherein the controller (200) is configured to control the communication circuit to transmit, to the server, display information to display the summary video to a user device (2).
  11. A method of controlling a cooking apparatus according to any of the preceding claims, the method comprising: controlling the camera to acquire a plurality of image frames of a cooking object placed in the chamber while cooking of the cooking object is being performed (1101); identifying the cooking object from the acquired plurality of image frames and estimating a property of the cooking object; identifying a change in a state of the cooking object in the chamber from the acquired plurality of image frames (1102); extracting a portion of image frames among the acquired plurality of image frames based on the change in the state of the cooking object (1103); and generating a summary video of a cooking process (1105) by adding a sound corresponding to the change in the state of the cooking object to the extracted portion of image frames (1104), wherein the sound matching table is used to determine the sound to be added to the extracted portion of image frames.
  12. The method of claim 11, wherein the extracting of the portion of image frames includes: generating a raw video based on the acquired plurality of image frames (1301); dividing the generated raw video into a plurality of sections based on the change in the state of the cooking object; and extracting at least one image frame from each of the divided plurality of sections to acquire the extracted portion of image frames (1305).
  13. The method of claim 12, wherein the dividing into the plurality of sections includes: comparing a reference image frame with each of the acquired plurality of image frames to determine an amount of the change in the state of the cooking object in each of the acquired plurality of image frames (1302); detecting at least one of a phase-change section in which a slope of the amount of the change in the state of the cooking object is greater than or equal to a predetermined threshold value and a phase-holding section in which a slope of the amount of the change in the state of the cooking object remains less than the threshold value (1303); and dividing the raw video into the plurality of sections based on the at least one of the phase-change section and the phase-holding section (1304).
  14. The method of claim 12, wherein the generating of the summary video includes, based on the change in the state of the cooking object being different in each of the divided plurality of sections, adding a different sound to each of the divided plurality of sections (1306).
  15. The method of claim 14, wherein the generating of the summary video includes using the sound matching table to determine the sound to be added to each of the divided plurality of sections.
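The section-division and sound-matching logic of claims 2, 3, and 5 can be illustrated with a minimal sketch: per-frame state-change amounts are segmented into phase-change sections (slope at or above a threshold) and phase-holding sections (slope below it), and a sound is then chosen from a matching table keyed by the object's property and state change. All names, the threshold value, and the table entries below are hypothetical; the patent does not specify an implementation.

```python
THRESHOLD = 5.0  # hypothetical slope threshold (change amount per frame)

def divide_sections(change_amounts, threshold=THRESHOLD):
    """Split per-frame state-change amounts into labelled sections.

    A frame is labelled "phase-change" when the slope (frame-to-frame
    difference) of the change amount is >= threshold, and "phase-holding"
    otherwise; consecutive frames with the same label form one section,
    returned as (label, start_index, end_index) tuples.
    """
    sections = []
    start, prev_label = 0, None
    for i in range(1, len(change_amounts)):
        slope = change_amounts[i] - change_amounts[i - 1]
        label = "phase-change" if abs(slope) >= threshold else "phase-holding"
        if prev_label is None:
            prev_label = label
        elif label != prev_label:
            sections.append((prev_label, start, i - 1))
            start, prev_label = i, label
    if prev_label is not None:
        sections.append((prev_label, start, len(change_amounts) - 1))
    return sections

# Hypothetical sound matching table: (object property, state change) -> sound.
SOUND_TABLE = {
    ("meat", "browning"): "sizzle.wav",
    ("bread", "rising"): "soft_hum.wav",
    ("cheese", "melting"): "bubble.wav",
}

def pick_sound(prop, state_change):
    """Look up the sound matching the object's property and state change."""
    return SOUND_TABLE.get((prop, state_change), "default.wav")
```

For example, the sequence `[0, 0.5, 1, 10, 20, 30, 31, 31.5]` yields a phase-holding section, a phase-change section, and a final phase-holding section; per claim 2, at least one frame would then be extracted from each section, and per claim 4 a different sound could be added to each.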

Description

[Technical Field]
The invention relates to a cooking apparatus and a method of controlling the same.

[Background Art]
A cooking apparatus is an apparatus for heating and cooking a cooking object, such as food, and refers to an apparatus capable of providing various cooking-related functions, such as heating, thawing, drying, and sterilizing a cooking object. Examples of the cooking apparatus include an oven, such as a gas oven or an electric oven, a microwave heating device (hereinafter referred to as a microwave oven), a gas range, an electric range, a gas grill, and an electric grill. In general, an oven cooks food by directly transferring heat to the food or by heating the inside of a cooking chamber using a heat source that generates heat. A microwave oven cooks food using the frictional heat of molecules in the food, which is generated by disturbing the arrangement of the molecules using high frequency as a heating source. Recently, a technology has emerged that installs a camera in a chamber of a cooking apparatus and provides a user with an image acquired by the camera.

WO 2020/014159 A1 discloses a cooking appliance comprising a camera configured to capture an image of a cooking chamber when a heating element is emitting at a stabilized power and/or peak wavelength. US 2021/251263 A1 discloses a computer system for providing interactive cooking experiences. KR 2022 0040228 A discloses a cooking appliance capable of capturing an image of a food ingredient during cooking to obtain a video providing information about the ingredient.

[Disclosure]
[Technical Solution]
The invention provides a cooking apparatus according to claim 1, and a method of controlling the same according to claim 11.

[Description of Drawings]
FIG. 1 illustrates a network system implemented by various electronic devices.
FIG. 2 is a perspective view illustrating a cooking apparatus according to an embodiment.
FIG. 3 is a cross-sectional view illustrating a cooking apparatus according to an embodiment.
FIG. 4 illustrates an example in which a tray is mounted on a first support on a sidewall of a chamber.
FIG. 5 illustrates control components of a cooking apparatus according to an embodiment.
FIG. 6 illustrates the structure of the controller described in FIG. 5.
FIG. 7 is a table for describing a change in a state of a cooking object over time.
FIG. 8 illustrates a graph showing the amount of the change in the state of the cooking object described with reference to FIG. 7 and waveforms of sounds added to a plurality of sections classified according to the change in the state of the cooking object.
FIG. 9 illustrates a sound matching table according to an embodiment.
FIG. 10 illustrates a summary video provided through a user device and a graphic user interface for editing the summary video.
FIG. 11 is a flowchart showing a method of controlling a cooking apparatus according to an embodiment.
FIG. 12 is a flowchart showing a method of controlling a cooking apparatus when a plurality of cooking objects are cooked.
FIG. 13 is a flowchart showing, in more detail, a part of the operations of the method of controlling a cooking apparatus described with reference to FIGS. 11 and 12.
FIG. 14 is a flowchart for describing an example of interaction between a cooking apparatus, a server, and a user device.
FIG. 15 is a flowchart for describing another example of interaction between a cooking apparatus, a server, and a user device.

[Modes of the Disclosure]
The various embodiments of the disclosure and the terminology used herein are not intended to limit the technical features of the disclosure to the specific embodiments. In the description of the drawings, like numbers refer to like elements throughout. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In the disclosure, a phrase such as "A or B," "at least one of A and B," "at least one of A or B," "A, B or C," "at least one of A, B and C," or "at least one of A, B, or C" may include any one of the items listed together in the corresponding phrase, or any possible combination thereof. Terms such as "first," "second," etc. are used to distinguish one element from another and do not modify the elements in other aspects (e.g., importance or sequence). When one (e.g., a first) element is referred to as being "coupled" or "connected" to another (e.g., a second) element, with or without the term "functionally" or "communicatively," it means that the one element is connected to the other element directly, wirelessly, or via a third element. Terms such as "including" or "having" are intended to indicate the existence of the features, numbers, operations, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, operations, components, parts, or combinations thereof may exist or may be added.