US-12622645-B2 - Signal-emitting and receiving medical devices which provide data for real-time multi-dimensional anatomic visualization maps
Abstract
The present invention relates to the process of using signal-emitting and/or receiving objects or smart medical devices for image acquisition, which can utilize a variety of external energy sources directly applied to and/or incorporated into the host subject to produce a continuous and dynamic visual representation of the host subject on a computer display, which representation hereafter will be referred to as a visualization map. The derived images can be targeted to small (i.e., focal) areas of clinical interest, to organ systems, or to the entire body. The present invention provides a scalable method for continuous and dynamic imaging over prolonged periods of time, as dictated by the clinical context.
Inventors
- Bruce Reiner
Assignees
- Bruce Reiner
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2022-06-09
Claims (20)
- 1 . A system to create anatomic visualization maps of a body of a patient, comprising: a medical device, including: at least one of a signal emitter which emits energy in a form of a transmitted signal, or a signal receiver which receives transmitted energy as a received signal, the signal receiver including at least one sensor or an antenna; a plurality of sensors and/or detectors which provide real-time anatomic and physiologic data to the signal emitter; a passive or active propulsion mechanism; and an energy source; and an external signal receiver and/or transmitter which receives the transmitted signal; a controller which receives the transmitted signal from the external signal receiver and/or transmitter and converts the transmitted signal into a standardized form of data; and an external processor which receives the data and records the data in a database; wherein the external processor performs computational analysis on the data to produce a 4-D anatomic visualization map of the body showing real-time anatomic and physiologic change that is displayed on a display; and wherein imagery from the 4-D anatomic visualization map is projected or superimposed directly onto the patient in real-time medical intervention.
- 2 . The system of claim 1 , wherein the signal emitter emits energy in a form including at least one of chemical, electrical, radiant, sound, light, magnetic/magneto-inductive, mechanical, thermal, nuclear, motion, or elastic; and wherein transfer of the data is conducted by methods including at least one of near field communication (NFC), Bluetooth, infrared, microwave, Zigbee, satellite, light, or radio frequency (RF) transmission.
- 3 . The system of claim 2 , wherein the external signal receiver and/or transmitter is embedded in an article of clothing or linens proximate to the body of the patient and the medical device and is accessed by the medical device via at least one of the data transfer methods.
- 4 . The system of claim 1 , wherein the medical device is at least one of embedded in a patient or circulated within the patient in a localized anatomic region or systemically, throughout a body of the patient; and wherein the medical device is introduced into the body from one of a urinary bladder, lungs, bloodstream, skin, lymphatic system, or gastrointestinal tract.
- 5 . The system of claim 1 , wherein the medical device is one of a microbot, nanobot, miniaturized smart medical device, or other standard medical device including at least one of prosthesis, surgical hardware, or implant; and wherein on condition that the medical device is disposed in the body, the medical device is one of internally located and fixed or temporarily placed and/or transportable.
- 6 . The system of claim 5 , wherein the standard medical device is one of temporary or permanent in the body, the temporary medical device including at least one of vascular or bladder catheters, intravascular balloon pumps, drainage tubes, or short-term surgical hardware; and wherein the permanent medical device is at least one of vascular stents, pacemakers, infusion pumps, arthroplasties, prosthetic valves, or permanent surgical hardware.
- 7 . The system of claim 5 , wherein the microbots and nanobots are at least one of physically attached to or coalesced with specific cell types in the body, or tagged to targets in the body including at least one of antibodies, circulating cells including at least one of macrophages, red blood cells, platelets, or lymphocytes, genetic material, bacteria, or tumor cells.
- 8 . The system of claim 1 , wherein a plurality of medical devices is internally mapped by being positioned in proximity to one another internally in the body of the patient in a predetermined configuration.
- 9 . The system of claim 1 , wherein the medical device includes only signal emitters or signal receivers, or both signal emitters and signal receivers.
- 10 . The system of claim 1 , wherein the signal receiver of the medical device receives signals transmitted from at least one of the signal transmitters of other medical devices or from the external signal receiver and/or transmitter.
- 11 . The system of claim 1 , wherein transmitted signals received from the signal receivers are converted by the controller and/or the external processor into the 4D anatomic visualization map by at least one of spectroscopy, thermography, radiography and computed tomography, scintillators, magnetic resonance imaging (MRI), or ultrasound, and by at least one of iterative reconstruction, filtered back projection, convolutional neural networks, or Fourier transformation; and wherein noise and measurement errors in the data are removed by the external processor using filtering techniques including at least Kalman filters.
- 12 . The system of claim 11 , wherein the 4D anatomic visualization map is automatically created based on automated signal activations initiated by the external processor, at predetermined intervals or under predetermined conditions including movement of the medical device from a predetermined location; and wherein the data is plotted over time by the external processor to create a dynamic 4D visualization map.
- 13 . The system of claim 1 , wherein the plurality of sensors and/or detectors includes at least one of biosensors, flow sensors, energy receptors, or distance sensors; wherein the distance sensors include at least one of ultrasonic, infrared, laser distance, or time-of-flight light emitting diode (LED) sensors; wherein the distance sensors derive distance by measuring at least one of a time between signal transmission and receipt by the signal receiver, an intensity of the signal transmission, or a pulse change; and wherein the medical device navigates in the body based on a continuous feedback of transmitted signals to the signal receiver from other medical devices or the external transmitter/receiver, or from transmitted signals from within a target location.
- 14 . The system of claim 1 , wherein the active propulsion mechanism includes a propulsion device activated by a propulsion activation mechanism to position the medical device, the propulsion device including at least one of chemically powered motors, enzymatically powered motors, external field driven motors, internally mounted miniaturized electrodes, miniaturized electromagnetic pumps, or appendages.
- 15 . The system of claim 1 , wherein the transmitted signal is unique to each medical device and signal differentiation between a plurality of medical devices is accomplished by analysis of alteration in signal type, strength, direction, transmission time, frequency, or pattern.
- 16 . The system of claim 1 , wherein the 4D anatomic visualization map is created in combination with other data sources to produce a hybrid visual display, the other data sources including at least one of MRI spectroscopy, positron emission tomography-computed tomography (PET-CT), or multispectral optoacoustic tomography.
- 17 . The system of claim 1 , wherein the medical device further comprises: a reservoir and/or a tool disposed in a recess; and a deployment mechanism to deploy the tool from the recess; wherein the tool performs a plurality of actions including at least one of localized drug delivery, biopsy, microsurgery, thermal ablation, cryotherapy, embolization, or cauterization.
- 18 . The system of claim 1 , wherein expanded temporal analysis is performed by the controller to render at least one 4-D anatomic visualization map which shows maximum change by temporal subtraction, visualizing changes in anatomy and pathology over a predetermined period of signal analysis.
- 19 . A system to create anatomic visualization maps of a body of a patient comprising: a medical device, including: at least one of a signal emitter which emits energy in a form of a transmitted signal, or a signal receiver which receives transmitted energy as a received signal, the signal receiver including at least one sensor or an antenna; a plurality of sensors and/or detectors; a passive or active propulsion mechanism; and an energy source; and an external signal receiver and/or transmitter which receives the transmitted signal; a controller which receives the transmitted signal from the external signal receiver and/or transmitter and converts the transmitted signal into a standardized form of data; and an external processor which receives the data and records the data in a database; wherein the external processor performs computational analysis on the data to produce a 4-dimensional (4D) anatomic visualization map of the body that is displayed on a display; wherein a plurality of medical devices is internally mapped by being positioned in proximity to one another internally in the body of the patient in a predetermined configuration; wherein the medical device includes an internal processor and at least one of the internal processor and/or the external processor determines a relative positioning of each medical device in relation to each other by analyzing metrics including at least one of distance, speed, or direction of travel of the transmitted signal, and thereby continuously updating the location of each medical device; and wherein the computational analysis of the external processor includes a location of the medical device in 3-dimensional (3D) space which is achieved by one of triangulation or predetermined frequency of the transmitted signal.
- 20 . A system to create anatomic visualization maps of a body of a patient, comprising: a medical device, including: at least one of a signal emitter which emits energy in a form of a transmitted signal, or a signal receiver which receives transmitted energy as a received signal, the signal receiver including at least one sensor or an antenna; a plurality of sensors and/or detectors; a passive or active propulsion mechanism; and an energy source; and an external signal receiver and/or transmitter which receives the transmitted signal; a controller which receives the transmitted signal from the external signal receiver and/or transmitter and converts the transmitted signal into a standardized form of data; and an external processor which receives the data and records the data in a database; wherein the external processor performs computational analysis on the data to produce a 4-dimensional (4D) anatomic visualization map of the body that is displayed on a display; wherein the signal receiver of the medical device receives signals transmitted from at least one of the signal transmitters of other medical devices or from the external signal receiver and/or transmitter; and wherein one of a plurality of the medical devices or the external signal receiver and/or transmitter form a network, the plurality of medical devices forming a mesh network wherein each signal emitter of each medical device communicates only with other signal receivers of other medical devices and only one or more of the plurality of medical devices in the mesh network communicate with the controller.
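Claim 11 names Kalman filtering as one technique by which the external processor removes noise and measurement error from device data. The patent prescribes no implementation; the following is a minimal, hypothetical scalar sketch (a constant-position state model with illustrative, made-up noise variances `q` and `r`) showing how such a filter smooths a noisy sensor track:

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.05**2):
    """Scalar Kalman filter smoothing a noisy 1D position track.

    q: process noise variance, r: measurement noise variance
    (illustrative values, not taken from the patent).
    """
    x, p = measurements[0], 1.0   # initial state estimate and variance
    out = []
    for z in measurements:
        p = p + q                 # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the measurement residual
        p = (1.0 - k) * p         # updated estimate variance
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
true_track = np.linspace(0.0, 1.0, 200)          # slowly drifting position
noisy = true_track + rng.normal(0, 0.05, 200)    # noisy sensor readings
smoothed = kalman_1d(noisy)
print(np.mean((noisy - true_track) ** 2), np.mean((smoothed - true_track) ** 2))
```

A production tracker for a moving device would typically use a multi-dimensional state (position and velocity) rather than this scalar form, but the predict/update structure is the same.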
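Claim 19 locates each device in 3D space by triangulation of the transmitted signal, and claim 13 derives ranges from, e.g., signal transit time (distance = propagation speed × time of flight). The claims prescribe no algorithm; as an illustration only, trilateration from range estimates to external receivers at known positions can be posed as a linear least-squares problem by subtracting the first sphere equation from the others:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 3D position from ranges to known anchor points.

    Linearizes the sphere equations |x - p_i|^2 = d_i^2 against the
    first anchor and solves the resulting system by least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    p0, d0 = anchors[0], d[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (d0**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: four hypothetical external receivers at known positions,
# with exact (noise-free) ranges to the device
anchors = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
true_pos = np.array([0.2, 0.3, 0.4])
dists = [np.linalg.norm(true_pos - np.asarray(a)) for a in anchors]
print(trilaterate(anchors, dists))  # ≈ [0.2, 0.3, 0.4]
```

With noisy ranges from many receivers the same least-squares formulation still applies; at least four non-coplanar anchors are needed for an unambiguous 3D fix.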
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
The present invention is a continuation-in-part (CIP) of U.S. patent application Ser. No. 17/712,693, filed Apr. 4, 2022, which is a continuation application of U.S. patent application Ser. No. 16/503,920 (the '920 Application), filed Jul. 5, 2019, now U.S. Pat. No. 11,324,451, which claims benefit of priority from U.S. Provisional Patent Application No. 62/694,248, filed Jul. 5, 2018, where the '920 Application is a CIP of U.S. patent application Ser. No. 15/632,817, filed Jun. 26, 2017, now abandoned, which claims benefit of priority of U.S. Provisional Patent Application No. 62/355,031, filed Jun. 27, 2016, the contents of all of which are herein incorporated by reference in their entirety. The present invention is also a CIP of U.S. patent application Ser. No. 17/575,048, filed Jan. 13, 2022, which is a Continuation of U.S. patent application Ser. No. 15/434,783, filed Feb. 16, 2017, now U.S. Pat. No. 11,224,382, which claims priority from U.S. Provisional Patent Application No. 62/295,787, filed Feb. 16, 2016, the contents of all of which are herein incorporated by reference in their entirety.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to the process of using signal-emitting and/or receiving objects or smart medical devices for image acquisition, which utilize a variety of external energy sources directly applied to and/or incorporated into the host subject to produce a continuous and dynamic visual representation of the host subject, hereafter referred to as a visualization map. The derived images can be targeted to small (i.e., focal) areas of clinical interest, to organ systems, or to the entire body. The present invention provides a scalable method for continuous and dynamic imaging over prolonged periods of time, as dictated by the clinical context.
2. Description of the Related Art
Conventional medical imaging involves a wide array of technologies which utilize different forms of energy for the creation of medical images. These include (but are not limited to) radiography, computed tomography (CT), nuclear medicine (e.g., positron emission tomography (PET)), ultrasound, and magnetic resonance imaging (MRI). Once these various forms of energy are applied to the host subject, the steps of image acquisition, image reconstruction and processing, and image computing are performed, resulting in two- or three-dimensional medical imaging datasets. In the step of image acquisition, the energy applied may take a variety of forms, including (but not limited to) photons (radiography and CT), radioactive materials (nuclear medicine), radiofrequency signals from a magnetic field (MRI), or acoustic echoes (ultrasound). Regardless of the type of imaging modality, the data acquisition process includes conversion of the absorbed/modified energy into an electrical signal, preconditioning of the signal, and its digitization. In the subsequent step of image reconstruction, mathematical algorithms are utilized to convert the acquired raw energy data into the form of an image. There are two primary classes of algorithms used for image reconstruction: analytical and iterative. Examples include filtered back projection in CT, Fourier transformation in MRI, and delay-and-sum beamforming in ultrasound. In the step of image computing, computational and mathematical methods are applied to the reconstructed imaging data to extract clinically relevant information. These methods include enhancement, analysis, and visualization. Regardless of the specific imaging modality and energy source utilized, all existing medical imaging technologies lead to the creation of an imaging dataset which is static in nature, representing a single snapshot of the anatomy and pathology of the host subject at a specific point in time.
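Of the reconstruction algorithms listed above, the MRI case is the simplest to sketch: in the idealized, fully sampled setting, the scanner measures the image's spatial frequencies (k-space), and the image is recovered by an inverse Fourier transform. A minimal numpy illustration with a synthetic phantom (no noise, coil sensitivities, or undersampling):

```python
import numpy as np

# Synthetic "phantom": a bright square on a dark background
image = np.zeros((64, 64))
image[24:40, 24:40] = 1.0

# MRI-style acquisition: the scanner samples the image's spatial
# frequencies (k-space), modeled here as a 2D Fourier transform
k_space = np.fft.fft2(image)

# Reconstruction: an inverse Fourier transform recovers the image
recon = np.fft.ifft2(k_space).real

print(np.allclose(recon, image))  # True
```

Real reconstructions depart from this ideal (partial sampling, noise, motion), which is where the iterative class of algorithms mentioned above comes in.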
As a result, if one wishes to evaluate changes in anatomy, physiology, and/or pathology over a given time period, repeated imaging acquisitions are required. The requirement to repeatedly acquire these images over a given time period results in additional cost, radiation exposure, time delays in diagnosis and treatment, and potential for iatrogenic complications. In current practice, determination of anatomy (i.e., anatomic mapping) and medical device localization is performed using conventional medical imaging technologies including (but not limited to) radiography, computed tomography (CT), ultrasound, nuclear medicine, and magnetic resonance imaging (MRI). While these technologies continue to undergo incremental innovation, they do possess a number of practical limitations, which in large part are reflective of their static nature. In all of these technologies, medical images are created which capture a two (2)- or three (3)-dimensional imaging dataset which is essentially fixed (or static) at the time of image capture. As underlying conditions within the host patient inevitably change, rep