US-12620102-B2 - Measuring a feature near the edge of an object
Abstract
A method for measuring a feature near the edge of an object comprising receiving image data characterizing the object, identifying an edge of the object, identifying a point on the perimeter of the feature at a position opposite the edge of the object, determining a reference plane, determining a three-dimensional reference line on the reference plane associated with the edge of the object, determining a three-dimensional measurement point on the reference plane based on the point on the perimeter of the feature, and determining the perpendicular distance from the measurement point to the reference line.
Inventors
- Clark A. Bendall
Assignees
- BAKER HUGHES HOLDINGS LLC
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2023-08-14
Claims (20)
- 1 . A method comprising: receiving one or more two dimensional images characterizing an object; determining a plurality of three-dimensional surface points on the surface of the object based on the images; identifying an edge of the object; identifying a point on a perimeter of a feature on the object at a position farthest from the edge of the object; determining a reference plane based on the identified edge of the object and the three-dimensional coordinates; determining a three-dimensional reference line on the reference plane associated with the edge of the object; determining a three-dimensional measurement point on the reference plane based on the point on the perimeter of the feature; determining a distance between the measurement point and the reference line; and providing the determined distance.
- 2 . The method of claim 1 , wherein the one or more two dimensional images comprise a stereo image pair or a structured light image.
- 3 . The method of claim 1 , wherein identifying an edge of the object comprises receiving, by a user-input device, a first user interaction designating a first edge point proximate the edge of the object.
- 4 . The method of claim 3 , wherein identifying an edge of the object comprises receiving, by the user input device, a second user interaction designating a second edge point proximate the edge of the object.
- 5 . The method of claim 1 , wherein identifying an edge of the object comprises applying edge detection techniques to map the edge.
- 6 . The method of claim 1 , wherein identifying a point on the perimeter of the feature comprises receiving, by a user input device, a third user interaction designating the position of the point.
- 7 . The method of claim 1 , wherein determining a reference plane comprises identifying three or more of the three-dimensional surface points on the surface of the object based on the position of the identified edge of the object.
- 8 . A measurement device comprising: an image sensor configured to generate two-dimensional image data based on light reflected from a surface of an object; a display configured to display, within a graphical user interface, a visual representation of the object based on the two dimensional image data; a user input device; a processor; and a non-transitory memory coupled to the processor, the non-transitory memory storing instructions to cause the processor to perform operations comprising: receiving the two dimensional image data from the image sensor; determining a plurality of three-dimensional surface points on the surface of the object based at least in part on the two dimensional image data; identifying an edge of the object; identifying a point on a perimeter of a feature at a position farthest from the edge of the object; determining a reference plane based on the identified edge of the object and the three-dimensional coordinates; determining a three-dimensional reference line on the reference plane associated with the edge of the object; determining a three-dimensional measurement point on the reference plane based on the point on the perimeter of the feature; determining a distance between the measurement point and the reference line; and providing the determined distance.
- 9 . The measurement device of claim 8 , wherein the two dimensional image data comprises a stereo image pair or a structured light pattern.
- 10 . The measurement device of claim 8 , wherein identifying an edge of the object comprises receiving via the user input device a first user interaction designating a first edge point proximate the edge of the object.
- 11 . The measurement device of claim 10 , wherein identifying an edge of the object comprises receiving via the user input device a second user interaction designating a second edge point proximate the edge of the object.
- 12 . The measurement device of claim 8 , wherein identifying an edge of the object comprises applying edge detection techniques to map the edge of the object.
- 13 . The measurement device of claim 8 , wherein identifying a point on the perimeter of the feature comprises receiving via the user input device a third user interaction designating the position of the point.
- 14 . The measurement device of claim 8 , wherein determining a reference plane comprises identifying three or more of the three-dimensional surface points on the surface of the object based on the position of the identified edge of the object.
- 15 . A non-transitory computer readable memory storing instructions which, when executed by at least one data processor forming part of at least one computing system, causes the at least one data processor to perform operations comprising: receiving one or more two dimensional images characterizing an object; determining a plurality of three-dimensional surface points on the surface of the object based on the images; identifying an edge of the object; identifying a point on a perimeter of a feature on the object at a position farthest from the edge of the object; determining a reference plane based on the identified edge of the object and the three-dimensional coordinates; determining a three-dimensional reference line on the reference plane associated with the edge of the object; determining a three-dimensional measurement point on the reference plane based on the point on the perimeter of the feature; determining a distance between the measurement point and the reference line; and providing the determined distance.
- 16 . The non-transitory computer readable memory of claim 15 , wherein the one or more two dimensional images comprise a stereo image pair or a structured light image.
- 17 . The non-transitory computer readable memory of claim 15 , wherein identifying an edge of the object comprises receiving, by a user-input device, a first user interaction designating a first edge point proximate the edge of the object.
- 18 . The non-transitory computer readable memory of claim 17 , wherein identifying an edge of the object comprises receiving, by the user input device, a second user interaction designating a second edge point proximate the edge of the object.
- 19 . The non-transitory computer readable memory of claim 15 , wherein identifying an edge of the object comprises applying edge detection techniques to map the edge.
- 20 . The non-transitory computer readable memory of claim 15 , wherein identifying a point on the perimeter of the feature comprises receiving, by a user input device, a third user interaction designating the position of the point.
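The geometric core common to claims 1, 8, and 15 — fitting a reference plane to surface points near the edge, projecting the feature-perimeter point onto that plane, and measuring the perpendicular distance to a reference line on the plane — can be sketched as follows. This is a minimal illustration under assumed conventions (least-squares plane fit, hypothetical sample coordinates), not the patented implementation:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through three or more points: (centroid, unit normal)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # SVD of the centered points: the right singular vector with the
    # smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def project_onto_plane(point, centroid, normal):
    """Orthogonal projection of a 3-D point onto the plane."""
    p = np.asarray(point, dtype=float)
    return p - np.dot(p - centroid, normal) * normal

def distance_to_line(point, line_pt, line_dir):
    """Perpendicular distance from a 3-D point to a 3-D line."""
    d = np.asarray(line_dir, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(point, dtype=float) - np.asarray(line_pt, dtype=float)
    return np.linalg.norm(v - np.dot(v, d) * d)

# Hypothetical surface points near the identified edge, lying close to z = 0
edge_points = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
centroid, normal = fit_plane(edge_points)

# Reference line on the plane along the identified edge (here the x-axis)
line_pt, line_dir = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])

# Feature-perimeter point slightly off the plane; project it, then measure
measurement_pt = project_onto_plane((0.5, 2.0, 0.1), centroid, normal)
print(round(distance_to_line(measurement_pt, line_pt, line_dir), 3))  # → 2.0
```

Projecting the measurement point onto the reference plane before measuring removes the out-of-plane component, so surface texture on the feature's perimeter does not inflate the reported edge distance.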
Description
CROSS-REFERENCE TO RELATED APPLICATION
This application claims the benefit of and priority under 35 U.S.C. § 119 to U.S. Provisional Application No. 63/371,681, filed Aug. 17, 2022, the contents of which are hereby incorporated by reference in their entirety.
TECHNICAL FIELD
This disclosure describes technologies relating to field measurements based on images.
BACKGROUND
Video inspection devices, such as video endoscopes or borescopes, are often used to inspect inaccessible areas of industrial assets such as jet engines or gas turbines. These devices may utilize stereoscopic optics or structured light projections to enable measurement of features or damage on the surface of parts within the asset. Such measurements may be used to determine whether the asset can continue to operate safely or must be taken out of service for repair.
SUMMARY
This disclosure relates to measuring a feature near the edge of an object. An example of the subject matter described within this disclosure is a method with the following features. One or more two dimensional images characterizing an object are received. Multiple three-dimensional surface points on the surface of the object are determined based on the images. An edge of the object is identified. A point on a perimeter of the feature at a position opposite the edge of the object is identified. A reference plane is determined based on the identified edge of the object and the three-dimensional coordinates. A three-dimensional reference line on the reference plane associated with the edge of the object is determined. A three-dimensional measurement point on the reference plane is determined based on the point on the perimeter of the feature. A distance between the measurement point and the reference line is determined. The determined distance is provided. The disclosed method can be implemented in a variety of ways.
For example, the method can be implemented within a system that includes at least one data processor and a non-transitory memory storing instructions for the processor to perform aspects of the method. Alternatively or in addition, the method can be included in a non-transitory computer readable memory storing the method as instructions which, when executed by at least one data processor forming part of at least one computing system, cause the at least one data processor to perform operations of the method. In some implementations, such a system can be a measurement device with the following features. An image sensor is configured to generate two-dimensional image data based on light reflected from a surface of an object. A data processor is configured to determine a plurality of three-dimensional surface points on the surface of the object based on the two-dimensional image data. A display is configured to display, within a graphical user interface, a visual representation of the object based on the two dimensional image data. A user input device can be included. In addition, the aforementioned processor and non-transitory memory can also be included with such a system.
Aspects of the example method, which can be combined with the example method alone or in combination with other aspects, include the following. The one or more two dimensional images include a stereo image pair or a structured light image.
Aspects of the example method, which can be combined with the example method alone or in combination with other aspects, include the following. Identifying an edge of the object includes receiving, by a user-input device, a first user interaction designating a first edge point proximate the edge of the object.
Aspects of the example method, which can be combined with the example method alone or in combination with other aspects, include the following. Identifying an edge of the object comprises receiving, by the user input device, a second user interaction designating a second edge point proximate the edge of the object.
Aspects of the example method, which can be combined with the example method alone or in combination with other aspects, include the following. Identifying an edge of the object includes applying edge detection techniques to map the edge.
Aspects of the example method, which can be combined with the example method alone or in combination with other aspects, include the following. Identifying a point on the perimeter of the feature comprises receiving, by the user input device, a third user interaction designating the position of the point.
Aspects of the example method, which can be combined with the example method alone or in combination with other aspects, include the following. Determining a reference plane includes identifying three or more of the three-dimensional surface points on the surface of the object based on the position of the identified edge of the object.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a flowchart of an example method that can be used with aspects of this disclosure.
FIG. 2 is an example borescope that can be used with aspects of this disclosure.
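The summary notes that the three-dimensional surface points may be derived from a stereo image pair. One common way such points are computed — under a rectified pinhole stereo model, which is an assumption for illustration and not a formulation stated in this patent — is depth from disparity, Z = f·B/d. A minimal sketch with hypothetical camera parameters:

```python
# Depth from stereo disparity under a rectified pinhole model:
# Z = f * B / d, where f is the focal length in pixels, B the stereo
# baseline, and d the disparity between matched pixels in pixels.
def triangulate(x_left, y, disparity, f, baseline, cx, cy):
    """Back-project a matched pixel pair into 3-D camera coordinates."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    z = f * baseline / disparity
    x = (x_left - cx) * z / f
    y3 = (y - cy) * z / f
    return (x, y3, z)

# Hypothetical example: f = 500 px, baseline = 4 mm, principal point (320, 240)
pt = triangulate(x_left=420, y=240, disparity=10, f=500, baseline=4.0,
                 cx=320, cy=240)
print(pt)  # → (40.0, 0.0, 200.0), i.e. 200 mm in front of the camera
```

Each matched pixel pair yields one three-dimensional surface point; applying this over the image produces the plurality of surface points from which the reference plane and measurement point are later determined.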