
US-12616439-B2 - Lesion locating method and lesion locating system

US 12616439 B2

Abstract

The lesion locating method includes locating a lesion by locating a mark on a body surface; using a camera in locating to assist an ultrasound probe for locating; forming a reference view according to data acquired in real time by the camera, where the reference view has a preset size, and a virtual mark point corresponding to the mark is formed in the reference view; and determining, according to a position of the virtual mark point in the reference view and an actual positional relationship of the camera and the ultrasound probe, an actual locating trajectory that enables a center line of the ultrasound probe to coincide with the mark. The camera is configured to acquire camera data to form the reference view, which then assists the ultrasound probe in locating.

Inventors

  • Minyi Sun
  • Hongbing Hu
  • Ying Zou
  • Bing Fu
  • Xiaobing Wu
  • Liang Hu
  • Cai Zhang
  • Haoran Huang

Assignees

  • CHONGQING HAIFU MEDICAL TECHNOLOGY CO., LTD.

Dates

Publication Date
2026-05-05
Application Date
2021-08-30
Priority Date
2020-12-24

Claims (20)

  1. A lesion locating method, comprising locating a lesion by locating a mark on a body surface; using an image acquisition assembly in locating to locate the mark, wherein the image acquisition assembly comprises an ultrasound probe, and at least one camera distributed on one side or both sides of a sector-scanning plane of the ultrasound probe and fixed in position relative to the ultrasound probe, and a center line of the camera is parallel to a center line of the ultrasound probe, and the lesion locating method comprises: forming a reference view according to data acquired in real time by the camera, wherein the reference view has a preset size, and a virtual mark point corresponding to the mark is formed in the reference view; and according to a position of the virtual mark point in the reference view and an actual positional relationship of the camera and the ultrasound probe, determining an actual locating trajectory that enables the center line of the ultrasound probe to coincide with the mark.
  2. The lesion locating method according to claim 1, wherein forming the reference view according to the data acquired in real time by the camera comprises: inputting a pre-locating instruction, according to which the image acquisition assembly is moved to a position above the mark; and judging, when the image acquisition assembly completes the pre-locating instruction, whether a current view acquired and formed by the camera in real time contains a virtual mark point corresponding to the mark; taking, if the current view contains the virtual mark point corresponding to the mark, the current view as the reference view and a current height distance of the ultrasound probe to the mark as a pre-locating height, and inputting, if the current view does not contain the virtual mark point corresponding to the mark, the pre-locating instruction again, until the reference view is formed.
  3. The lesion locating method according to claim 2, wherein calculating the actual locating trajectory further comprises: limiting, according to the pre-locating height, a motion boundary condition of the image acquisition assembly so that the ultrasound probe is movable within a preset plane, wherein the preset plane is a plane perpendicular to the center line of the ultrasound probe and corresponding to the pre-locating height.
  4. The lesion locating method according to claim 2, wherein forming the reference view comprises: establishing an imaging proportional relation between the current view and an actual acquisition region of the camera, and forming the reference view according to the imaging proportional relation; setting a preset value of the pre-locating height, and inputting a pre-locating instruction so that the pre-locating height is equal to the preset value and the proportional relation is a fixed value; or establishing a calculation model of the imaging proportional relation by taking the preset value of the pre-locating height as a variable, and calculating an actual imaging proportional relation after obtaining a value of the pre-locating height; or setting an installation position of the camera so that part of a side edge contour of the ultrasound probe always exists in the current view acquired by the camera, and calculating, when establishing the imaging proportional relation, the imaging proportional relation from an actual distance from the center line of the camera to the side edge contour and a reference distance in the reference view.
  5. The lesion locating method according to claim 4, wherein when the center line of the camera is located on a midperpendicular plane of the sector-scanning plane of the ultrasound probe and the reference view is formed, the center line of the camera is located at a center of the reference view, the reference view takes a projection of the midperpendicular plane of the ultrasonic sector-scanning plane as a transverse axis and a direction perpendicular to the transverse axis as a longitudinal direction, and calculating the actual locating trajectory comprises: calculating an actual transverse displacement and an actual longitudinal displacement, respectively, wherein calculating the actual longitudinal displacement comprises: determining, according to a position of the virtual mark point in the reference view, a virtual longitudinal displacement required for the virtual mark point to coincide with the transverse axis, and calculating the actual longitudinal displacement according to the virtual longitudinal displacement and the imaging proportional relation.
  6. The lesion locating method according to claim 5, wherein at least one camera is provided, the actual locating trajectory is calculated from a reference view formed by a single camera, and calculating the actual transverse displacement comprises: calculating a virtual transverse displacement of the virtual mark point according to the position of the virtual mark point in the reference view, and calculating the actual transverse displacement according to the virtual transverse displacement and the imaging proportional relation, wherein a calculation formula of the virtual transverse displacement satisfies: L0 = [arctan((h1 + h2)/a) − (π − θ)/2] × L/θ, where L0 is a virtual transverse displacement component, a is a center distance between the ultrasound probe and the camera, h1 is a height distance between the ultrasound probe and the camera, h2 is the pre-locating height, θ is a viewing angle corresponding to an acquisition region of the camera in the transverse direction, and L is a view width corresponding to the reference view in the transverse direction.
  7. The lesion locating method according to claim 5, wherein at least two cameras are provided, comprising a first camera and a second camera, wherein the actual locating trajectory is calculated from corresponding reference views formed by the two cameras, the first camera and the second camera are symmetrically distributed on two sides of the sector-scanning plane of the ultrasound probe, and have a same height difference from the ultrasound probe, the first camera acquires data and forms a first reference view, and the second camera acquires data and forms a second reference view; calculating the actual transverse displacement comprises calculating the actual transverse displacement from a position of the virtual mark point in the first reference view and a position of the virtual mark point in the second reference view, wherein a calculation formula of the actual transverse displacement satisfies: y = [tan(L2·θ/L) − tan(L1·θ/L)] / [tan(L1·θ/L) + tan(L2·θ/L)] × a, where y is an actual transverse displacement component, a is a center distance between the ultrasound probe and each camera, L1 is a transverse distance between the virtual mark point in the first reference view and a view center, and L2 is a transverse distance between the virtual mark point in the second reference view and the view center; images acquired by the first camera and the second camera each have a viewing angle θ in the transverse direction; and the first reference view and the second reference view each have a preset view width L.
  8. The lesion locating method according to claim 4, wherein at least two cameras are provided, comprising a first camera and a second camera, wherein the actual locating trajectory is calculated from corresponding reference views formed by the two cameras, the first camera and the second camera are distributed on two sides of the sector-scanning plane of the ultrasound probe, at least one of the first camera or the second camera has a center line deviating from a midperpendicular plane of the sector-scanning plane of the ultrasound probe, and the first camera and the second camera have a same height difference from the ultrasound probe, the first camera acquires data and forms a first reference view, and the second camera acquires data and forms a second reference view, and calculating the actual locating trajectory comprises: calculating a virtual transverse displacement and a virtual longitudinal displacement according to positions of the virtual mark point in the first reference view and the second reference view; and calculating an actual transverse displacement and an actual longitudinal displacement according to the virtual transverse displacement, the virtual longitudinal displacement and the imaging proportional relation; wherein in calculation of the virtual transverse displacement and the virtual longitudinal displacement, a virtual projection point of the center line of the ultrasound probe is taken as an origin, a virtual sector-scanning projection line of the sector-scanning plane of the ultrasound probe is taken as a Y axis, and a virtual midperpendicular projection line of the midperpendicular plane of the sector-scanning plane of the ultrasound probe is taken as an X axis to establish a coordinate system, and according to the positions of the virtual mark point in the first reference view and the second reference view, a coordinate calculation formula set of the virtual mark point is established: y1 = (tan θ1)x1 + b1 − a1 tan θ1; y1 = (tan θ2)x1 + b2 − a2 tan θ2; where coordinates of the virtual mark point are (x1, y1), θ1 is an angle between the virtual mark point and the sector-scanning plane of the ultrasound probe (corresponding to the X axis) in the first reference view, a coordinate position of the first camera is (a1, b1), a coordinate position of the second camera is (a2, b2), and θ2 is an angle between the virtual mark point and the sector-scanning plane of the ultrasound probe (corresponding to the X axis) in the second reference view.
  9. The lesion locating method according to claim 4, wherein cameras are divided into at least two camera groups, each of which comprises one or two cameras, an actual locating trajectory to be verified is formed according to a reference view acquired and formed by a camera group, and a final actual locating trajectory is obtained according to at least two actual locating trajectories to be verified, and wherein: at least two cameras are provided, comprising a first camera and a second camera, wherein the first camera and the second camera are symmetrically distributed on two sides of the sector-scanning plane of the ultrasound probe, and each have a center line on a midperpendicular plane of the sector-scanning plane of the ultrasound probe and a same height difference from the ultrasound probe, and while locating a lesion, a first actual locating trajectory is calculated from a corresponding reference view formed by the first camera or the second camera, a second actual locating trajectory is calculated from corresponding reference views formed by the first camera and the second camera, and a final actual locating trajectory is determined from the first actual locating trajectory and the second actual locating trajectory; or at least two cameras are provided, comprising a first camera and a second camera, wherein the first camera and the second camera are distributed on two sides of the sector-scanning plane of the ultrasound probe, a center line of the first camera is located on the midperpendicular plane of the sector-scanning plane of the ultrasound probe, while a center line of the second camera deviates from the midperpendicular plane of the sector-scanning plane of the ultrasound probe, the first camera and the second camera have a same height difference from the ultrasound probe, and while locating a lesion, a first actual locating trajectory is calculated from a corresponding reference view formed by the first camera, a second actual locating trajectory is calculated from corresponding reference views formed by the two cameras, and a final actual locating trajectory is determined from the first actual locating trajectory and the second actual locating trajectory; or at least three cameras are provided, comprising a first camera, a second camera and a third camera, wherein the first camera and the third camera are distributed on one side of the sector-scanning plane of the ultrasound probe, the second camera is distributed on the other side of the sector-scanning plane of the ultrasound probe, a center line of the third camera is located on the midperpendicular plane of the sector-scanning plane of the ultrasound probe, while center lines of the first camera and the second camera deviate from the midperpendicular plane of the sector-scanning plane of the ultrasound probe, the first camera, the second camera and the third camera have a same height difference from the ultrasound probe, and while locating a lesion, a first actual locating trajectory is calculated from a corresponding reference view formed by the third camera, a second actual locating trajectory is calculated from corresponding reference views formed by the first camera and the second camera, and a final actual locating trajectory is determined from the first actual locating trajectory and the second actual locating trajectory; or at least four cameras are provided, comprising a first camera, a second camera, a third camera and a fourth camera, wherein the first camera and the second camera are symmetrically distributed on two sides of the sector-scanning plane of the ultrasound probe, and each have a center line on the midperpendicular plane of the sector-scanning plane of the ultrasound probe, while the third camera and the fourth camera are distributed on two sides of the sector-scanning plane of the ultrasound probe, and each have a center line deviating from the midperpendicular plane of the sector-scanning plane of the ultrasound probe, a first actual locating trajectory is calculated from corresponding reference views formed by the first camera and the second camera, a second actual locating trajectory is calculated from corresponding reference views formed by the third camera and the fourth camera, and a final actual locating trajectory is determined from the first actual locating trajectory and the second actual locating trajectory; or at least four cameras are provided, comprising a first camera, a second camera, a third camera and a fourth camera, wherein the first camera and the second camera are symmetrically distributed on two sides of the sector-scanning plane of the ultrasound probe, and each have a center line on the midperpendicular plane of the sector-scanning plane of the ultrasound probe, while the third camera and the fourth camera are distributed on two sides of the sector-scanning plane of the ultrasound probe, and each have a center line deviating from the midperpendicular plane of the sector-scanning plane of the ultrasound probe, a first actual locating trajectory is calculated from corresponding reference views formed by the first camera and the second camera, a second actual locating trajectory is calculated from corresponding reference views formed by the third camera and the fourth camera, a third actual locating trajectory is calculated from a corresponding reference view formed by the first camera or the second camera, and a final actual locating trajectory is determined from the first actual locating trajectory, the second actual locating trajectory, and the third actual locating trajectory.
  10. The lesion locating method according to claim 4, further comprising: mapping, according to the actual positional relationship of the camera and the ultrasound probe, the center line of the ultrasound probe into the reference view as a virtual projection point, and determining the actual locating trajectory, comprising: determining, according to a positional relationship of the virtual mark point and the virtual projection point in the reference view, a movement direction corresponding to coincidence of the virtual projection point and the virtual mark point, and controlling movement of the ultrasound probe according to the movement direction until the virtual projection point and the virtual mark point coincide with each other in the reference view.
  11. The lesion locating method according to claim 1, wherein a reference scale with fixed position and shape is set corresponding to the reference view, and wherein the reference scale has corresponding scale values which are converted into and displayed as size values corresponding to the actual acquisition region of the camera according to an imaging proportion.
  12. A lesion locating system, for locating a lesion by locating a mark on a body surface, said system comprising: an image acquisition assembly having an ultrasound probe, and at least one camera distributed on one side or both sides of a sector-scanning plane of the ultrasound probe and fixed in position relative to the ultrasound probe, wherein a center line of the camera is parallel to a center line of the ultrasound probe; a reference image display device configured to display a reference view, wherein the reference view is formed according to data acquired in real time by the camera, the reference view has a size of a fixed value, and a virtual mark point corresponding to the mark is formed in the reference view; and a processor comprising an actual locating trajectory calculation unit configured to calculate, according to a position of the virtual mark point in the reference view and an actual positional relationship of the camera and the ultrasound probe, an actual locating trajectory that enables the center line of the ultrasound probe to coincide with the mark.
  13. The lesion locating system according to claim 12, further comprising: a pre-locating instruction input unit configured to input a pre-locating instruction, according to which the image acquisition assembly is moved to a position above the mark; and an actuating mechanism configured to drive the image acquisition assembly to move; wherein the processor comprises a pre-locating processing unit configured to control the actuating mechanism to move according to the pre-locating instruction, judge, after an action corresponding to the pre-locating instruction is completed by the actuating mechanism, whether a current view acquired and formed by the camera contains a virtual mark point corresponding to the mark, and take, if the current view contains the virtual mark point corresponding to the mark, the current view as the reference view.
  14. The lesion locating system according to claim 13, wherein the actual locating trajectory calculation unit has a motion boundary condition calculation subunit for calculating a motion boundary condition, and the motion boundary condition calculation subunit is configured to limit, according to the pre-locating height, a motion boundary condition of the image acquisition assembly so that the ultrasound probe is movable within a preset plane, wherein the preset plane is a plane perpendicular to the center line of the ultrasound probe and corresponding to the pre-locating height.
  15. The lesion locating system according to claim 13, wherein the processor further comprises an imaging unit configured to: establish an imaging proportional relation between the current view and an actual acquisition region of the camera, and form the reference view according to the imaging proportional relation; set a preset value of the pre-locating height, and input a pre-locating instruction so that the pre-locating height is equal to the preset value and the proportional relation is a fixed value; or establish a calculation model of the imaging proportional relation by taking the preset value of the pre-locating height as a variable, and calculate an actual imaging proportional relation after obtaining a value of the pre-locating height; or set an installation position of the camera so that part of a side edge contour of the ultrasound probe always exists in the current view acquired by the camera, and calculate, when establishing the imaging proportional relation, the imaging proportional relation from an actual distance from the center line of the camera to the side edge contour and a reference distance in the reference view.
  16. The lesion locating system according to claim 15, wherein the center line of the camera is located on a midperpendicular plane of the sector-scanning plane of the ultrasound probe, so the imaging unit is configured to: form the reference view so that the center line of the camera is located at a center of the reference view, wherein the reference view takes a projection of the midperpendicular plane of the ultrasonic sector-scanning plane as a transverse axis and a direction perpendicular to the transverse axis as a longitudinal direction; and the actual locating trajectory calculation unit comprises an actual transverse displacement calculation subunit and an actual longitudinal displacement calculation subunit, wherein the actual longitudinal displacement calculation subunit is configured to: determine, according to a position of the virtual mark point in the reference view, a virtual longitudinal displacement required for the virtual mark point to coincide with the transverse axis, and calculate the actual longitudinal displacement according to the virtual longitudinal displacement and the imaging proportional relation.
  17. The lesion locating system according to claim 16, wherein one camera is provided, and the actual transverse displacement calculation subunit is configured to: calculate a virtual transverse displacement of the virtual mark point according to the position of the virtual mark point in the reference view, and calculate the actual transverse displacement according to the virtual transverse displacement and the imaging proportional relation, wherein a calculation formula of the virtual transverse displacement satisfies: L0 = [arctan((h1 + h2)/a) − (π − θ)/2] × L/θ, where L0 is a virtual transverse displacement component, a is a center distance between the ultrasound probe and the camera, h1 is a height distance between the ultrasound probe and the camera, h2 is the pre-locating height, θ is a viewing angle corresponding to an acquisition region of the camera in the transverse direction, and L is a view width corresponding to the reference view in the transverse direction.
  18. The lesion locating system according to claim 17, wherein two cameras are provided, comprising a first camera and a second camera, wherein the first camera and the second camera are symmetrically distributed on two sides of the sector-scanning plane of the ultrasound probe, and have a same height difference from the ultrasound probe; the imaging unit is configured to: form a first reference view from data acquired by the first camera, and form a second reference view from data acquired by the second camera; and the actual transverse displacement calculation subunit is configured to: calculate the actual transverse displacement from a position of the virtual mark point in the first reference view and a position of the virtual mark point in the second reference view, wherein a calculation formula of the actual transverse displacement satisfies: y = [tan(L2·θ/L) − tan(L1·θ/L)] / [tan(L1·θ/L) + tan(L2·θ/L)] × a, where y is an actual transverse displacement component, a is a center distance between the ultrasound probe and each camera, L1 is a transverse distance between the virtual mark point in the first reference view and a view center, and L2 is a transverse distance between the virtual mark point in the second reference view and the view center; images acquired by the first camera and the second camera each have a viewing angle θ in the transverse direction; and the first reference view and the second reference view each have a preset view width L.
  19. The lesion locating system according to claim 15, wherein two cameras are provided, comprising a first camera and a second camera, wherein the first camera and the second camera are distributed on two sides of the sector-scanning plane of the ultrasound probe, at least one of the first camera or the second camera has a center line deviating from a midperpendicular plane of the sector-scanning plane of the ultrasound probe, and the first camera and the second camera have a same height difference from the ultrasound probe; the imaging unit is configured to: form a first reference view from data acquired by the first camera, and form a second reference view from data acquired by the second camera; and the actual locating trajectory calculation unit is configured to: calculate a virtual transverse displacement and a virtual longitudinal displacement according to positions of the virtual mark point in the first reference view and the second reference view; and calculate an actual transverse displacement and an actual longitudinal displacement according to the virtual transverse displacement, the virtual longitudinal displacement and the imaging proportional relation; wherein in calculation of the virtual transverse displacement and the virtual longitudinal displacement, a virtual projection point of the center line of the ultrasound probe is taken as an origin, a virtual sector-scanning projection line of the sector-scanning plane of the ultrasound probe is taken as a Y axis, and a virtual midperpendicular projection line of the midperpendicular plane of the sector-scanning plane of the ultrasound probe is taken as an X axis to establish a coordinate system, and according to the positions of the virtual mark point in the first reference view and the second reference view, a coordinate calculation formula set of the virtual mark point is established: y1 = (tan θ1)x1 + b1 − a1 tan θ1; y1 = (tan θ2)x1 + b2 − a2 tan θ2; where coordinates of the virtual mark point are (x1, y1), in the first reference view, θ1 is a viewing angle of an acquisition region of the first camera in a width direction corresponding to the X axis, a coordinate position of the first camera is (a1, b1), a coordinate position of the second camera is (a2, b2), and θ2 is a viewing angle of an acquisition region of the second camera in the width direction corresponding to the X axis.
  20. The lesion locating system according to claim 12, wherein a reference scale is provided in the reference view or on a display device of the reference view, and the reference scale has corresponding scale values which are converted into and displayed as size values corresponding to the actual acquisition region of the camera according to an imaging proportion.
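
The single-camera formula in claims 6 and 17 is easiest to check numerically in executable form. The sketch below is an illustrative Python reading of that formula, not part of the patent text; it reads the displaced equation as L0 = [arctan((h1 + h2)/a) − (π − θ)/2] × L/θ, with variable names taken from the claim's symbols:

```python
import math

def virtual_transverse_displacement(a, h1, h2, theta, L):
    """Virtual transverse displacement L0 per the claim-6 formula, read as
    L0 = [arctan((h1 + h2)/a) - (pi - theta)/2] * L / theta.

    a     -- center distance between the ultrasound probe and the camera
    h1    -- height distance between the probe and the camera
    h2    -- pre-locating height (probe to mark)
    theta -- transverse viewing angle of the camera, in radians
    L     -- transverse view width of the reference view
    """
    elevation = math.atan((h1 + h2) / a)   # angle of the sight line from the horizontal
    lower_edge = (math.pi - theta) / 2     # angle of the near edge of the field of view
    return (elevation - lower_edge) / theta * L
```

As a sanity check on this reading: when arctan((h1 + h2)/a) equals the edge angle (π − θ)/2 the displacement is zero (the point sits at the edge of the view), and as the height grows without bound the displacement approaches L/2 (the view center).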
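
The two-camera formula in claims 7 and 18 can likewise be sketched. This is an illustrative Python reading, not patent text; it takes the garbled fraction as y = [tan(L2·θ/L) − tan(L1·θ/L)] / [tan(L1·θ/L) + tan(L2·θ/L)] × a:

```python
import math

def actual_transverse_displacement(L1, L2, theta, L, a):
    """Actual transverse displacement y per the claim-7 two-camera formula.

    L1, L2 -- transverse distances of the virtual mark point from the view
              center in the first and second reference views
    theta  -- transverse viewing angle of each camera, in radians
    L      -- preset view width of each reference view
    a      -- center distance between the probe and each camera
    """
    t1 = math.tan(L1 / L * theta)
    t2 = math.tan(L2 / L * theta)
    return (t2 - t1) / (t1 + t2) * a
```

Consistent with the symmetric camera layout the claim describes, equal offsets in the two views (L1 = L2) yield y = 0, i.e. the mark already lies on the midperpendicular plane.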
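
The coordinate formula set in claims 8 and 19 is two sight lines through the camera positions, intersected to recover the virtual mark point. The Python sketch below is illustrative only (the function name and the parallel-line guard are my additions, not from the patent):

```python
import math

def locate_mark(cam1, theta1, cam2, theta2):
    """Solve the claim-8 formula set
        y1 = (tan theta1) * x1 + b1 - a1 * tan theta1
        y1 = (tan theta2) * x1 + b2 - a2 * tan theta2
    i.e. intersect the sight lines through camera positions cam1 = (a1, b1)
    and cam2 = (a2, b2) in the probe-centered coordinate system
    (virtual projection point of the probe center line at the origin).
    """
    (a1, b1), (a2, b2) = cam1, cam2
    t1, t2 = math.tan(theta1), math.tan(theta2)
    if math.isclose(t1, t2):
        raise ValueError("parallel sight lines: mark position is undetermined")
    x1 = (b2 - b1 + a1 * t1 - a2 * t2) / (t1 - t2)  # equate the two line equations
    y1 = t1 * (x1 - a1) + b1                        # back-substitute into line 1
    return x1, y1
```

For example, cameras at (1, 1) and (−1, 1) sighting at 45° and 135° intersect at the origin, i.e. the mark lies directly under the probe center line.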

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application is the United States national phase of International Patent Application No. PCT/CN2021/115360 filed Aug. 30, 2021, and claims priority to Chinese Patent Application No. 202011551545.3 filed Dec. 24, 2020, the disclosures of which are hereby incorporated by reference in their entireties.

BACKGROUND OF THE INVENTION

Field of the Invention

The present disclosure belongs to the technical field of ultrasound treatment, and in particular relates to a lesion locating method and a lesion locating system.

Description of Related Art

By focusing ultrasonic waves, high-intensity focused ultrasound treatment technology can form high-intensity, continuous ultrasonic energy at a lesion, thereby generating a transient high-temperature effect, a cavitation effect, a mechanical effect and a sonochemical effect, breaking cell membranes and nuclear membranes, coagulating protein, and selectively causing coagulative necrosis of lesion tissues to disable the proliferation, infiltration and transfer capabilities of the lesion. In the treatment process with an existing ultrasound treatment device, a B-mode ultrasound probe is usually used to guide location of a lesion, and the probe must be moved repeatedly during locating to help a doctor visualize the anatomical structure surrounding the lesion and analyze and find the location of the lesion, which is complicated and time-consuming.

SUMMARY OF THE INVENTION

In view of the above disadvantages of the existing art, an object of the present disclosure is to provide a lesion locating method and a lesion locating system which can solve the problems of a complicated process and long time consumption in locating a lesion in the existing art.
To achieve the above and other related objects, the present disclosure provides a lesion locating method involving locating a lesion by locating a mark on a body surface; using an image acquisition assembly in locating to locate the mark, wherein the image acquisition assembly includes an ultrasound probe, and at least one camera distributed on one side or both sides of a sector-scanning plane of the ultrasound probe and fixed in position relative to the ultrasound probe, and a center line of the camera is parallel to a center line of the ultrasound probe, and the lesion locating method includes: forming a reference view according to data acquired in real time by the camera, wherein the reference view has a preset size, and a virtual mark point corresponding to the mark is formed in the reference view; and determining, according to a position of the virtual mark point in the reference view and an actual positional relationship of the camera and the ultrasound probe, an actual locating trajectory that enables the center line of the ultrasound probe to coincide with the mark.

Optionally, forming the reference view according to the data acquired in real time by the camera includes: inputting a pre-locating instruction, according to which the image acquisition assembly is moved to a position above the mark; judging, when the image acquisition assembly completes the pre-locating instruction, whether a current view acquired and formed by the camera in real time contains a virtual mark point corresponding to the mark; taking, if the current view contains the virtual mark point corresponding to the mark, the current view as the reference view and a current height distance of the ultrasound probe to the mark as a pre-locating height, and inputting, if the current view does not contain the virtual mark point corresponding to the mark, the pre-locating instruction again, until the reference view is formed.

Optionally, calculating the actual locating trajectory further includes: limiting, according to the pre-locating height, a motion boundary condition of the image acquisition assembly so that the ultrasound probe is movable within a preset plane, wherein the preset plane is a plane perpendicular to the center line of the ultrasound probe and corresponding to the pre-locating height.

Optionally, forming the reference view includes: establishing an imaging proportional relation between the current view and an actual acquisition region of the camera, and forming the reference view according to the imaging proportional relation; setting a preset value of the pre-locating height, and inputting a pre-locating instruction so that the pre-locating height is equal to the preset value and the proportional relation is a fixed value; or establishing a calculation model of the imaging proportional relation by taking the preset value of the pre-locating height as a variable, and calculating an actual imaging proportional relation after obtaining a value of the pre-locating height; or setting an installation position of the camera so that part of a side edge contour of the ultrasound probe always exists in the current view acquired by the camera, and calculating, when establishing the imaging proportional relation, the imaging proportional relation from an actual distance from the center line of the camera to the side edge contour and a reference distance in the reference view.