US-12616448-B2 - Methods and systems for automated analysis of cervix elastography

US 12616448 B2

Abstract

Various methods and systems are provided for cervix elastographic analysis. In one embodiment, the method comprises acquiring an ultrasound scene comprising a plurality of frames, generating a first loss image for the plurality of frames, selecting a background and a foreground of the ultrasound scene based on the first loss image, and generating a softness ratio from the selected background and foreground.

Inventors

  • Mostafa Shahriari Shourabi
  • Christian Perrey

Assignees

  • GE Precision Healthcare LLC

Dates

Publication Date
2026-05-05
Application Date
2023-10-11

Claims (20)

  1. A method comprising: acquiring an ultrasound scene comprising a plurality of frames; generating a first loss image for the plurality of frames; selecting a background and a foreground of the ultrasound scene based on the first loss image; and generating a softness ratio from the selected background and foreground.
  2. The method of claim 1, wherein generating the first loss image comprises: extracting a region from the plurality of frames comprising the ultrasound scene; and for each pixel of the ultrasound scene: defining a set of pixels; determining a mean strain and a strain variance for the set of pixels; and determining a loss value that is a sum of the mean strain and the strain variance for the set of pixels.
  3. The method of claim 2, wherein the set of pixels comprises a plurality of strain values for an individual pixel at each frame of the plurality of frames.
  4. The method of claim 1, further comprising receiving a user input indicating a cervix position on the ultrasound scene, wherein the only user inputs are the ultrasound scene and the cervix position.
  5. The method of claim 4, further comprising assigning a foreground region of interest and a background region of interest based on proximity to the user input indicating the cervix position.
  6. The method of claim 5, wherein the foreground region of interest is closer to the cervix position than the background region of interest.
  7. The method of claim 5, further comprising selecting the foreground based on a largest loss value within the foreground region of interest and selecting the background based on a lowest loss value within the background region of interest.
  8. The method of claim 1, wherein the ultrasound scene comprises a cervical ultrasound scene.
  9. The method of claim 1, wherein the ultrasound scene comprises a sequence of shear wave images.
  10. The method of claim 1, wherein the generating the first loss image, the selecting the background and the foreground, and the generating the softness ratio execute automatically.
  11. An ultrasound system comprising: an ultrasound probe; a display communicatively coupled to the ultrasound probe; a user input device communicatively coupled to the display; and a processor and non-transitory memory communicatively coupled to the display, the non-transitory memory including instructions that when executed cause the processor to: acquire an ultrasound scene comprising a plurality of frames; receive a user input indicating a cervix position on the ultrasound scene; generate a first loss image for the plurality of frames; select a background and a foreground of the ultrasound scene based on the first loss image; and generate a softness ratio from the selected background and foreground.
  12. The ultrasound system of claim 11, wherein the first loss image comprises a plurality of loss values, the plurality of loss values determined based on a sum of a mean strain and a strain variance for each individual pixel for each frame of the ultrasound scene.
  13. The ultrasound system of claim 11, wherein the instructions cause the processor to select the background and the foreground automatically.
  14. The ultrasound system of claim 11, wherein the user input indicating the cervix position and the ultrasound scene are the only user inputs.
  15. The ultrasound system of claim 11, wherein the instructions cause the processor to select the foreground based on a largest loss value within a foreground region of interest and the background based on a lowest loss value within a background region of interest.
  16. A method comprising: acquiring a cervical ultrasound scene comprising a plurality of frames; receiving a user input indicating a cervix position on the cervical ultrasound scene; generating a first loss image for the cervical ultrasound scene; assigning a foreground region of interest and a background region of interest based on the cervix position; selecting a foreground within the foreground region of interest and a background within the background region of interest based on the first loss image; determining a softness ratio for each frame based on average strain values in the foreground and average strain values in the background; and determining a final softness ratio based on an average softness ratio of all frames.
  17. The method of claim 16, wherein generating the first loss image comprises: extracting a region from the plurality of frames comprising the cervical ultrasound scene; and for each pixel of the cervical ultrasound scene: defining a set of pixels; determining a mean strain and a strain variance for the set of pixels; and determining a loss value that is a sum of the mean strain and the strain variance for the set of pixels.
  18. The method of claim 17, wherein the set of pixels comprises a plurality of strain values for an individual pixel at each frame of the plurality of frames.
  19. The method of claim 16, wherein the cervical ultrasound scene and the cervix position are the only user inputs.
  20. The method of claim 16, further comprising excluding outlier frames based on the softness ratio of each frame.
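Taken together, the claims describe a concrete pipeline: build a per-pixel loss image from the strain frames (claims 2 and 3), pick the foreground and background pixels from the two regions of interest (claim 7), then average per-frame softness ratios after excluding outlier frames (claims 16 and 20). The following is a minimal sketch of that pipeline, not the patented implementation: array shapes, the mask-based inputs, and the z-score outlier rule (the patent does not specify the exclusion criterion) are all assumptions.

```python
import numpy as np

def loss_image(strain_frames):
    """Per-pixel loss over a scene of strain frames.

    For each pixel, the "set of pixels" is its strain value in every frame
    (claims 2-3); the loss value is the sum of the mean strain and the
    strain variance over that set.

    strain_frames: float array of shape (n_frames, height, width).
    Returns a (height, width) loss image.
    """
    return strain_frames.mean(axis=0) + strain_frames.var(axis=0)

def select_foreground_background(loss, fg_roi, bg_roi):
    """Select the foreground pixel with the largest loss inside the
    foreground ROI and the background pixel with the lowest loss inside
    the background ROI (claim 7). ROIs are boolean masks over the image."""
    fg = np.unravel_index(np.argmax(np.where(fg_roi, loss, -np.inf)), loss.shape)
    bg = np.unravel_index(np.argmin(np.where(bg_roi, loss, np.inf)), loss.shape)
    return fg, bg

def final_softness_ratio(strain_frames, fg_mask, bg_mask, z_thresh=2.0):
    """Per-frame softness ratio = average foreground strain divided by
    average background strain (claim 16). Outlier frames are excluded
    (claim 20) via an assumed z-score cut before averaging."""
    fg = strain_frames[:, fg_mask].mean(axis=1)  # per-frame foreground mean
    bg = strain_frames[:, bg_mask].mean(axis=1)  # per-frame background mean
    ratios = fg / bg
    z = np.abs(ratios - ratios.mean()) / (ratios.std() + 1e-12)
    return ratios[z < z_thresh].mean()
```

In the patented method the two regions of interest are assigned automatically from the user-indicated cervix position (claims 5 and 6); here the masks are taken as given inputs so the sketch stays self-contained.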

Description

FIELD

Embodiments of the subject matter disclosed herein relate to an automated approach to analyzing cervix elastography, such as for assessing the likelihood of pre-term birth.

BACKGROUND

Clinical ultrasound is an imaging modality that employs ultrasound waves to probe the internal structures of a body of a patient and produce a corresponding image. An ultrasound probe comprising a plurality of transducer elements emits ultrasonic pulses which reflect or echo, refract, or are absorbed by structures in the body. The ultrasound probe then receives reflected echoes, which are processed into an image. For example, a medical imaging device such as an ultrasound imaging device may be used to obtain images of a heart, uterus, liver, lungs, and various other anatomical regions of a patient.

One application of clinical ultrasound is elastography. Elastography seeks to evaluate the mechanical properties of tissues, particularly their stiffness and elasticity, by analyzing how they respond to external forces or physiological processes. One of the key parameters in elastography is strain, which quantifies the percentage of tissue deformation that occurs when static or oscillatory compression is applied. The strain metric measures tissue deformation under pressure, with softer tissues deforming more readily, resulting in larger strain values. Conversely, stiffer tissues exhibit reduced deformation, leading to lower strain values.

Cervix elastography is one approach for predicting pre-term birth. During cervix elastography, an ultrasound technician selects cervical tissue of interest and reference tissue of an ultrasound image, referred to as a foreground and a background, respectively, and calculates strain values to obtain a softness ratio of the cervix tissue. The softness ratio is used to assess cervical changes that may be indicative of pre-term labor. Shear wave imaging analysis is an alternative approach for predicting pre-term birth.
However, the shear wave method utilizes a high-energy push pulse, which may be transmitted to a region close to the head of the fetus.

BRIEF DESCRIPTION

In one embodiment, a method comprises acquiring an ultrasound scene comprising a plurality of frames, generating a first loss image for the plurality of frames, selecting a background and a foreground of the ultrasound scene based on the first loss image, and generating a softness ratio from the selected background and foreground.

It should be understood that the brief description above is provided to introduce in simplified form a selection of concepts that are further described in the detailed description. It is not meant to identify key or essential features of the claimed subject matter, the scope of which is defined uniquely by the claims that follow the detailed description. Furthermore, the claimed subject matter is not limited to implementations that solve any disadvantages noted above or in any part of this disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be better understood from reading the following description of non-limiting embodiments, with reference to the attached drawings, wherein below:

FIG. 1 shows an ultrasound imaging system, according to one or more embodiments of the present disclosure.
FIG. 2 shows an overview of a process for automated analysis of cervix elastography according to an exemplary embodiment.
FIG. 3 shows a flow chart illustrating a first method for automated analysis of cervix elastography according to an exemplary embodiment.
FIG. 4 shows a flow chart illustrating a second method for automated analysis of cervix elastography according to an exemplary embodiment.
FIG. 5 shows a flow chart illustrating a third method for automated analysis of cervix elastography according to an exemplary embodiment.
FIG. 6 shows a display device displaying a frame of a cervical ultrasound scene including user input according to an exemplary embodiment.
FIG. 7 shows a display device displaying a loss image generated for a cervical ultrasound scene including automated foreground region of interest assignment according to an exemplary embodiment.
FIG. 8 shows a display device displaying a loss image generated for a cervical ultrasound scene including automated background region of interest assignment according to an exemplary embodiment.
FIG. 9 shows a display device displaying a loss image generated for a cervical ultrasound scene including automated foreground selection according to an exemplary embodiment.
FIG. 10 shows a display device displaying a loss image generated for a cervical ultrasound scene including automated background selection according to an exemplary embodiment.

DETAILED DESCRIPTION

The following description relates to various embodiments of systems and methods for automated cervix elastography. Predicting pre-term birth is an application of cervix elastography. As clinically practiced, an ultrasound technician performs a cervical ultrasound, selecting from the ultrasound im