
US-12620144-B2 - Mixed reality (MR) based color code correction and creation in an environmental setting

US 12620144 B2

Abstract

A computer-implemented method includes receiving an image of an area, identifying an item in the area, determining an intended color of the item according to a predefined color code, analyzing the image for determining if the intended color of the item is diminished on and/or absent from the item, and outputting a mixed reality (MR) image of the area in which the intended color is overlayed on the item in response to determining that the intended color of the item is diminished and/or absent.

Inventors

  • Shailendra Moyal
  • Sarbajit K. Rakshit

Assignees

  • INTERNATIONAL BUSINESS MACHINES CORPORATION

Dates

Publication Date
2026-05-05
Application Date
2022-09-14

Claims (20)

  1. A computer-implemented method, comprising: receiving an image of an area; identifying an item in the area; determining an intended color of the item according to a predefined color code; analyzing the image for determining if the intended color of the item is diminished on and/or absent from the item, wherein the intended color of the item is diminished by at least one cause selected from the group consisting of: age of the item resulting in a faded color, a temporary object obscuring the item, and an environmental factor obscuring the intended color of the item; and outputting a mixed reality (MR) image of the area in which the intended color is overlayed on the item in response to determining that the intended color of the item is diminished and/or absent.
  2. The computer-implemented method of claim 1, wherein the item is selected from the group consisting of: an object, an area of a floor, and an infrastructure.
  3. The computer-implemented method of claim 1, wherein a camera associated with a user captures the image, wherein the method is implemented on an apparatus worn by the user, wherein the apparatus comprises the camera and a display for outputting the MR image of the area.
  4. The computer-implemented method of claim 1, wherein a camera associated with a user captures the image, wherein the user is selected from the group consisting of: a human and a robot.
  5. The computer-implemented method of claim 1, comprising determining a distance between the item and a user using an augmented reality (AR) system.
  6. The computer-implemented method of claim 1, wherein the intended color of the item is adjusted in the image using a machine learning network.
  7. The computer-implemented method of claim 6, wherein the machine learning network identifies a correction of the intended color on the item for adjustment of the intended color in the image of the area.
  8. The computer-implemented method of claim 1, wherein items present in the area have sensors, wherein the identifying comprises receiving an IoT feed from the sensors for recognizing items in the area.
  9. The computer-implemented method of claim 1, comprising identifying a physical location of the item in the image for overlaying the intended color on the item.
  10. The computer-implemented method of claim 1, comprising enhancing a brightness of the intended color of the item in the image so that the item is recognized in the area.
  11. The computer-implemented method of claim 1, wherein the area includes a plurality of items, and further comprising: identifying a second item of the plurality of items in the area; determining an intended color of the second item according to a predefined color code; analyzing the image for determining if the intended color of the second item is diminished on and/or absent from the second item; and outputting a mixed reality (MR) image of the area in which the intended color is overlayed on the second item in response to determining that the intended color of the second item is diminished and/or absent.
  12. The computer-implemented method of claim 1, wherein the color code is predefined according to a color code rule.
  13. The computer-implemented method of claim 1, wherein the intended color of the item is diminished by the age of the item resulting in a faded color.
  14. The computer-implemented method of claim 1, wherein the intended color of the item is diminished by the temporary object obscuring the item.
  15. The computer-implemented method of claim 1, wherein the intended color of the item is diminished by the environmental factor obscuring the intended color of the item.
  16. A computer-implemented method, comprising: receiving an activity to be performed in an area; analyzing the activity for implementing the method, wherein the analyzing comprises determining whether the activity to be performed includes an area that is not designated for the activity; receiving an image of the area pertinent to the activity; identifying an item pertinent to the activity in the area; determining an intended color of the item according to a predefined color code; analyzing the image for determining if the intended color of the item is diminished on and/or absent from the item; and outputting a mixed reality (MR) image of the area in which the intended color is overlayed on the item in response to determining that the intended color of the item is diminished and/or absent.
  17. The computer-implemented method of claim 16, wherein the item is selected from the group consisting of: an object, an area of a floor, and an infrastructure.
  18. The computer-implemented method of claim 16, wherein a camera associated with a user captures the image, wherein the method is implemented on an apparatus worn by the user, wherein the apparatus comprises the camera and a display for outputting the MR image of the area.
  19. The computer-implemented method of claim 16, wherein a camera associated with a user captures an image, wherein the user is selected from the group consisting of: a human and a robot.
  20. A Mixed Reality (MR)-based system, comprising: a head-mounted MR translucent display; a camera coupled to the head-mounted MR display; a processor; and logic integrated with the processor, executable by the processor, or integrated with and executable by the processor, the logic being configured to: connect to an augmented reality (AR) system for identifying items present in an area as depicted in an image captured by the camera and outputted to the display; connect to a predefined color code rule; and connect to a machine learning network module for: determining an intended color of one of the items according to the predefined color code rule, analyzing the image for determining if the intended color of the item is diminished on and/or absent from the item, wherein the intended color of the item is diminished by at least one cause selected from the group consisting of: age of the item resulting in a faded color, a temporary object obscuring the item, and an environmental factor obscuring the intended color of the item, and modifying the intended color in the image of the item present in the area according to the predefined color code rule, wherein the intended color is overlayed onto the item in the image depicted on the MR display.
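The core steps of claim 1 (receive an image, determine an item's intended color from a predefined color code, test whether that color is diminished or absent, and overlay the intended color only when it is) can be sketched in a few lines. The following is a minimal illustration, not the patented implementation: it assumes items are already segmented (the patent suggests an AR system or an IoT sensor feed for that), and the `COLOR_CODE` table, threshold, and blend factor are all hypothetical values chosen for the sketch.

```python
import numpy as np

# Hypothetical predefined color code: item label -> intended RGB color.
# These labels and colors are illustrative only.
COLOR_CODE = {
    "water_pipe": (0, 128, 0),
    "steam_pipe": (255, 165, 0),
}

def intended_color_diminished(image, item_mask, intended_rgb, threshold=60.0):
    """Return True if the item's mean observed color deviates from its
    intended color by more than `threshold` (Euclidean distance in RGB)."""
    observed = image[item_mask].reshape(-1, 3).mean(axis=0)
    return np.linalg.norm(observed - np.asarray(intended_rgb, float)) > threshold

def overlay_intended_color(image, item_mask, intended_rgb, alpha=0.6):
    """Alpha-blend the intended color onto the item's pixels to produce
    the MR overlay image; pixels outside the item mask are unchanged."""
    out = image.astype(float).copy()
    out[item_mask] = (1 - alpha) * out[item_mask] + alpha * np.asarray(intended_rgb, float)
    return out.astype(np.uint8)

def correct_item(image, item_label, item_mask):
    """Claim-1 pipeline for one item: look up the intended color, test
    whether it is diminished or absent, and overlay it only if so."""
    intended = COLOR_CODE[item_label]
    if intended_color_diminished(image, item_mask, intended):
        return overlay_intended_color(image, item_mask, intended)
    return image
```

In a real MR headset the mask would come from the AR scene understanding and the blend would be rendered on the translucent display; the threshold test here stands in for the machine-learning analysis the claims describe.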

Description

BACKGROUND

The present invention relates to appropriate color codes in an environmental setting, and more specifically, this invention relates to a mixed reality (MR) based method and system to correct fading color codes and create color codes in an MR view of a space.

In any environmental setting, for example an industrial floor, different types of color codes are used, such as a specific color on the floor, pipe fittings, machines, etc., so that both human and robotic workers can recognize the surroundings, machines, etc. based on the color code. For various reasons, a color code may not be visualized properly; for instance, color may fade away over a period of time, and/or color may not be seen through colored fumes, steam, dust, etc. Thus, a method and system are needed by which a worker, robot, etc. may clearly and effectively visualize the color code of the floor, pipe fittings, machines, etc.

SUMMARY

In one embodiment, a computer-implemented method includes receiving an image of an area, identifying an item in the area, determining an intended color of the item according to a predefined color code, analyzing the image for determining if the intended color of the item is diminished on and/or absent from the item, and outputting a mixed reality (MR) image of the area in which the intended color is overlayed on the item in response to determining that the intended color of the item is diminished and/or absent.

In another embodiment, a computer-implemented method includes receiving an activity to be performed in an area, receiving an image of the area pertinent to the activity, identifying an item pertinent to the activity in the area, determining an intended color of the item according to a predefined color code, analyzing the image for determining if the intended color of the item is diminished on and/or absent from the item, and outputting an MR image of the area in which the intended color is overlayed on the item in response to determining that the intended color of the item is diminished and/or absent.

In yet another embodiment, an MR-based system includes a head-mounted MR translucent display, a camera coupled to the head-mounted MR display, a processor, and logic integrated with the processor, executable by the processor, or integrated with and executable by the processor. The logic is configured to connect to an augmented reality (AR) system for identifying items present in an area as depicted in an image captured by the camera and outputted to the display, connect to a predefined color code rule, and connect to a machine learning network module for modifying an intended color in the image of one of the items present in the area according to the predefined color code rule, where the intended color is overlayed onto the item in the image depicted on the MR display.

Other aspects and embodiments of the present invention will become apparent from the following detailed description, which, when taken in conjunction with the drawings, illustrates by way of example the principles of the invention.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram of a computing environment, in accordance with one embodiment of the present invention.

FIG. 2 is a diagram of a storage system, in accordance with one embodiment of the present invention.

FIG. 3 is a diagram of the color code of different types of pipelines.

FIG. 4 is a flow chart of a computer-implemented method for a mixed reality (MR)-based system for color correction and creation, in accordance with one embodiment of the present invention.

FIG. 5 is a flow chart of a computer-implemented method for an MR-based system for color correction and creation according to an activity, in accordance with one embodiment of the present invention.

FIG. 6A shows schematic drawings of color correction of objects in an area in which the objects are obscured by an environmental factor, in accordance with one embodiment of the present invention. Part (a) is a depiction of an image without color correction, and part (b) is a depiction of the MR-based image with color correction.

FIG. 6B shows schematic drawings of an area where a color code is created for an activity to be performed, in accordance with one embodiment of the present invention. Part (a) is a depiction of an image of a floor area without a color code, part (b) is a depiction of the MR-based image with a color code, and part (c) is a depiction of the MR-based image with a color code created for the floor and objects pertaining to the activity to be performed.

FIG. 7 is a schematic drawing of color correction and creation of an entire industrial space, in accordance with one embodiment of the present invention. Part (a) represents an area where an activity is to be performed, and part (b) represents an image of the industrial space where each area for a step of the activity has color correction or color creation according to a color code.

DETAILED DESCRIPTION

The following description is made for the purpose of illustrati
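The activity-driven embodiment (claim 16 and FIG. 6B) starts from a planned activity rather than from the scene: items pertinent to the activity are identified, and any item whose intended color is missing or wrong is flagged so a color code can be created in the MR view. A minimal sketch of that lookup follows; the activity table, item labels, and colors are invented for illustration and do not come from the patent.

```python
# Hypothetical mapping from an activity to the items pertinent to it.
ACTIVITY_ITEMS = {
    "forklift_transport": ["loading_bay_floor", "pedestrian_walkway"],
    "pipe_maintenance": ["steam_pipe", "shutoff_valve"],
}

# Hypothetical predefined color code rule: item label -> intended RGB color.
COLOR_CODE_RULES = {
    "loading_bay_floor": (255, 255, 0),
    "pedestrian_walkway": (0, 128, 0),
    "steam_pipe": (255, 165, 0),
    "shutoff_valve": (255, 0, 0),
}

def items_needing_color(activity, detected_colors):
    """For each item pertinent to the activity, compare the color detected
    in the image (missing or None if the item carries no color code at all)
    against the rule; return {item: intended_color} for every item that
    needs color correction or creation in the MR overlay."""
    needed = {}
    for item in ACTIVITY_ITEMS[activity]:
        intended = COLOR_CODE_RULES[item]
        if detected_colors.get(item) != intended:
            needed[item] = intended
    return needed
```

Items returned by `items_needing_color` would then be passed to the overlay step, so that, as in FIG. 6B part (c), both the floor area and the objects pertaining to the activity receive their color code in the MR image.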