
CN-122023285-A - Automated human breast examination system based on a large language model combined with a six-axis robot

CN 122023285 A

Abstract

The invention discloses an automated human breast examination system based on a large language model combined with a six-axis robot. The system accurately understands a clinician's natural-language instructions, automatically plans and executes scanning paths, acquires two-dimensional ultrasound image sequences, reconstructs a three-dimensional breast model, and automatically generates a structured clinical diagnosis report through an intelligent analysis model. It thereby achieves full-chain automation of breast ultrasound scanning, from intention understanding to report generation, markedly improving the standardization, efficiency, and diagnostic consistency of breast ultrasound examination. The system provides a reliable automated solution for early screening and accurate diagnosis of breast diseases, and overcomes the technical difficulties caused by manual operation in the prior art, such as low efficiency, poor repeatability, and dependence on operator experience.

Inventors

  • YING TAO
  • HUA CHEN
  • YAN YULIN
  • HUANG NINGNING
  • WANG XIA
  • WU XING

Assignees

  • Shanghai Sixth People's Hospital (上海市第六人民医院)

Dates

Publication Date
2026-05-12
Application Date
2026-01-12

Claims (7)

  1. An automated human breast examination system based on a large language model combined with a six-axis robot, characterized by comprising: a central control unit that receives an ultrasound scanning request sent by a general-surgeon workstation through a switch and the hospital local area network, invokes a large language model to perform intelligent semantic analysis and intention recognition on the natural-language scanning requirement in the request, and generates a structured task description; judges whether the structured task description falls within a preset breast scanning range and, if so, generates a key-point description containing anatomical positions and scanning key points, a patient positioning prompt instruction, a liftable-bed trigger instruction, and a 3D camera control instruction based on the key-point description; acquires a 3D point cloud of the breast area, performs coarse registration with a standard human breast model, applies a vision-based robot path planning algorithm together with the structured task description and the key-point description to mark key scanning areas on the point cloud, and generates the trajectory of the six-axis robot end effector as a high-precision scanning path instruction and a scanning trigger signal; receives a spatio-temporally synchronized 2D scan image sequence from a clinical ultrasound scanner and the robot pose from the six-axis robot, maps each pixel to the world coordinate system through coordinate transformation, and reconstructs a three-dimensional breast model using a voxel fusion algorithm; segments the three-dimensional breast model with a post-processing algorithm combining morphological analysis and a machine-learning classifier to distinguish glandular, fat, and suspected lesion areas; extracts the suspected lesion areas and computes key morphological features, including the ratio of nodule volume to glandular volume and the spatial distribution; extracts texture features and depth features of the suspected nodule areas as multidimensional feature vectors, inputs them into a deep-learning classifier or a support vector machine, and computes elasticity scores, benign/malignant probability, and BI-RADS classification as key clinical indices; and combines the key clinical indices with a medical knowledge base to produce a detailed ultrasound diagnosis report conforming to clinical specifications, sends the report through the system local area network to a clinical diagnosis report display and printing system for auditing by a sonographer, and, after auditing is completed, feeds the report back to the general surgeon through the hospital local area network.
  2. The automated human breast examination system based on a large language model combined with a six-axis robot of claim 1, wherein after acquiring the ultrasound scanning request, the central control unit registers the patient information and the doctor's ultrasound scanning request in a MySQL database.
  3. The automated human breast examination system based on a large language model combined with a six-axis robot of claim 1, wherein if the structured task description is not within the preset breast scanning range, a rejection reason and suggestions are generated and fed back to the requesting doctor along the original path.
  4. The automated human breast examination system based on a large language model combined with a six-axis robot of claim 1, wherein the trajectory of the six-axis robot end effector contains position and orientation information, and trajectory fitting is performed by spline interpolation using quintic polynomial interpolation, so as to generate a continuous and smooth robot motion trajectory.
  5. The automated human breast examination system based on a large language model combined with a six-axis robot of claim 1, wherein while the trajectory of the six-axis robot end effector is executed, the feedback contact pressure from the robot force sensor is processed and the motion is dynamically adjusted through impedance control, ensuring that the contact force remains stable at a preset safety threshold.
  6. The automated human breast examination system of claim 1, wherein the vision-based robot path planning algorithm targets coverage of all critical areas while minimizing path length and movement time and avoiding obstacles.
  7. The automated human breast examination system based on a large language model combined with a six-axis robot of claim 1, wherein the detailed ultrasound diagnosis report includes examination findings, measurement data, diagnostic comments, and advice.
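The acceptance check in claim 1 (parse the request into a structured task, then accept it only if it falls within the preset breast scanning range, otherwise reject with a reason as in claim 3) can be sketched as follows. The patent does not specify the task schema or region names; the `regions` field and the region vocabulary here are illustrative assumptions.

```python
# Minimal sketch of the range check from claims 1 and 3: after the language
# model turns a free-text request into a structured task description, accept it
# only if every requested region lies in the preset breast scanning range.
# The schema and region names are illustrative assumptions, not from the patent.

PRESET_BREAST_REGIONS = {
    "left_upper_outer", "left_upper_inner", "left_lower_outer", "left_lower_inner",
    "right_upper_outer", "right_upper_inner", "right_lower_outer", "right_lower_inner",
}

def validate_task(task: dict) -> tuple[bool, str]:
    """Return (accepted, reason); a rejection reason is fed back to the doctor."""
    regions = task.get("regions", [])
    if not regions:
        return False, "no scan region specified"
    out_of_range = [r for r in regions if r not in PRESET_BREAST_REGIONS]
    if out_of_range:
        return False, f"regions outside preset breast scanning range: {out_of_range}"
    return True, "ok"
```

A rejected task would carry its reason string back to the requesting workstation along the original path.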
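Claim 4 fits the end-effector trajectory with quintic polynomial interpolation. A standard way to realize this between two waypoints is a fifth-order time-scaling polynomial with zero velocity and acceleration at both ends, so consecutive segments join smoothly; the sketch below shows one position segment (orientation would be interpolated analogously, and the duration and sample count are illustrative).

```python
import numpy as np

def quintic_segment(p0, p1, T, n=50):
    """Quintic (5th-order) polynomial segment between waypoints p0 and p1 over
    duration T, with zero velocity and acceleration at both ends, so consecutive
    segments of the end-effector track join into a continuous, smooth motion.
    Returns an (n, dim) array of sampled positions."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    tau = np.linspace(0.0, T, n) / T
    # minimum-jerk time scaling: s(0)=0, s(1)=1, s'(0)=s'(1)=s''(0)=s''(1)=0
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return p0 + np.outer(s, p1 - p0)
```

Chaining such segments over the key scanning points marked on the point cloud yields the high-precision scanning path instruction of claim 1.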
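Claim 5 regulates probe contact force through impedance control. A minimal discrete sketch, assuming the tissue behaves as a linear spring along the probe's surface normal and using illustrative gains and a hypothetical 5 N target (the patent names no numbers), looks like this:

```python
def regulate_contact(f_target=5.0, stiffness=500.0, gain=0.4, steps=200):
    """Discrete admittance-style loop illustrating claim 5: each control cycle
    reads the contact force (here simulated as a spring, force = stiffness *
    penetration), and nudges the probe's commanded position along the surface
    normal in proportion to the force error, so the contact force converges to
    the preset safe value. Gains, stiffness, and target are assumptions; a real
    controller would add damping and sensor filtering."""
    z = 0.0  # commanded penetration depth along the surface normal (m)
    f = 0.0
    for _ in range(steps):
        f = stiffness * max(z, 0.0)          # simulated force-sensor reading
        z += gain * (f_target - f) / stiffness
    return f
```

With these values the force error shrinks geometrically (ratio 1 - gain per cycle), settling at the preset threshold.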

Description

Automated human breast examination system based on a large language model combined with a six-axis robot

Technical Field

The invention particularly relates to an automated human breast examination system based on a large language model combined with a six-axis robot.

Background

Traditional breast ultrasound scanning relies on the experience and skill of the operator, resulting in inconsistent results and low efficiency. The prior art lacks a fully automatic system that integrates intelligent demand analysis, automatic path planning, accurate motion execution, and intelligent report generation, and therefore struggles to achieve standardized, highly repeatable breast ultrasound examination.

Disclosure of the Invention

To solve the above technical problems, the invention provides an automated human breast examination system based on a large language model combined with a six-axis robot, comprising a central control unit that receives an ultrasound scanning request sent by a general-surgeon workstation through a switch and the hospital local area network, invokes a large language model to perform intelligent semantic analysis and intention recognition on the natural-language scanning requirement in the request, and generates a structured task description. The unit judges whether the structured task description falls within a preset breast scanning range and, if so, generates a key-point description containing anatomical positions and scanning key points, a patient positioning prompt instruction, a liftable-bed trigger instruction, and a 3D camera control instruction based on the key-point description.

The system acquires a 3D point cloud of the breast area, performs coarse registration with a standard human breast model, applies a vision-based robot path planning algorithm together with the structured task description and the key-point description to mark key scanning areas on the point cloud, and generates the trajectory of the six-axis robot end effector as a high-precision scanning path instruction and a scanning trigger signal.

It then receives a spatio-temporally synchronized 2D scan image sequence from a clinical ultrasound scanner and the robot pose from the six-axis robot, maps each pixel to the world coordinate system through coordinate transformation, and reconstructs a three-dimensional breast model using a voxel fusion algorithm. The three-dimensional breast model is segmented with a post-processing algorithm combining morphological analysis and a machine-learning classifier to distinguish glandular, fat, and suspected lesion areas; the suspected lesion areas are extracted and key morphological features are computed, including the ratio of nodule volume to glandular volume and the spatial distribution. Texture features and depth features of the suspected nodule areas are extracted as multidimensional feature vectors and input into a deep-learning classifier or a support vector machine, which computes elasticity scores, benign/malignant probability, and BI-RADS classification as key clinical indices.

The key clinical indices are combined with a medical knowledge base and converted into a detailed ultrasound diagnosis report conforming to clinical specifications. The report is sent through the system local area network to a clinical diagnosis report display and printing system for auditing by a sonographer, and, after auditing is completed, is fed back to the general surgeon through the hospital local area network.
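The reconstruction step above (map each pixel of a tracked 2-D frame into the world coordinate system via the synchronized robot pose, then fuse intensities into a voxel grid) can be sketched as follows. Frame geometry, voxel size, and grid extents are illustrative assumptions; the patent does not specify them.

```python
import numpy as np

def fuse_frames(frames, poses, grid_shape=(64, 64, 64), voxel=0.003):
    """Sketch of voxel fusion: each 2-D ultrasound frame is a grid of pixels in
    the probe's image plane; the synchronized robot pose (a 4x4 homogeneous
    transform) maps every pixel into world coordinates, and intensities landing
    in the same voxel are averaged. Returns the fused intensity volume."""
    acc = np.zeros(grid_shape)
    cnt = np.zeros(grid_shape)
    h, w = frames[0].shape
    # pixel coordinates in the probe plane (metres), z = 0, homogeneous 4xN
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel() * voxel, ys.ravel() * voxel,
                    np.zeros(h * w), np.ones(h * w)])
    for img, T in zip(frames, poses):
        world = (T @ pix)[:3]                      # map pixels to world frame
        idx = np.round(world / voxel).astype(int)  # nearest voxel index
        ok = np.all((idx >= 0) & (idx < np.array(grid_shape)[:, None]), axis=0)
        i, j, k = idx[:, ok]
        np.add.at(acc, (i, j, k), img.ravel()[ok])  # unbuffered accumulation
        np.add.at(cnt, (i, j, k), 1)
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```

Averaging per voxel is one simple fusion rule; weighted or maximum-intensity fusion would fit the same structure.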
Preferably, after acquiring the ultrasound scanning request, the central control unit registers the patient information and the doctor's ultrasound scanning request in a MySQL database. Preferably, if the structured task description is not within the preset breast scanning range, a rejection reason and suggestions are generated and fed back to the requesting doctor along the original path. Preferably, the trajectory of the six-axis robot end effector contains position and orientation information, and trajectory fitting is performed by spline interpolation using quintic polynomial interpolation, generating a continuous and smooth robot motion trajectory. Preferably, while the trajectory of the six-axis robot end effector is executed, the feedback contact pressure from the robot force sensor is processed and the motion is dynamically adjusted through impedance control, ensuring that the contact force remains stable at a preset safety threshold. Preferably, the vision-based robot path planning algorithm targets coverage of all critical areas while minimizing path length and movement time and avoiding obstacles. Preferably, the detailed ultrasound diagnosis report includes examination findings, measurement data, diagnostic comments, and advice.
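The final step converts the key clinical indices into a report draft for the sonographer to audit. A minimal sketch, assuming a fixed template and illustrative field names (the patent does not define the report schema, and a real system would draw its phrasing from the medical knowledge base):

```python
def draft_report(indices: dict) -> str:
    """Slot key clinical indices (nodule-to-gland volume ratio, elasticity
    score, BI-RADS category, estimated malignancy probability) into a fixed
    clinical template. Field names and wording are illustrative assumptions."""
    return (
        "BREAST ULTRASOUND REPORT (DRAFT - pending sonographer audit)\n"
        f"Findings: nodule volume ratio {indices['volume_ratio']:.1%} of gland; "
        f"elasticity score {indices['elasticity_score']}.\n"
        f"Assessment: BI-RADS {indices['birads']}, "
        f"estimated malignancy probability {indices['malignancy_prob']:.0%}.\n"
        "Advice: clinical correlation recommended."
    )
```

The draft would be sent to the report display and printing system and released to the general surgeon only after the sonographer's audit.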