US-12621617-B2 - Musical perception of a recipient of an auditory device

US 12621617 B2

Abstract

Treatment actions can be taken based on an analysis of the ability of a recipient of an auditory prosthesis to perceive musical activities. The occurrence of musical activities and the recipient's response thereto can be determined based on transducers. The recipient's musical perception can be determined by analyzing the signals from the transducers as stand-alone measures or by comparing the signals against known music. The treatment actions can relate to the treatment of a hearing-impairment of the recipient of the auditory prosthesis, such as by modifying the settings of the auditory prosthesis to affect the ongoing operation of the auditory prosthesis.

Inventors

  • Ivana Popovac
  • Alexander von Brasch
  • Justin James Gilmour
  • Matthew Zygorodimos
  • Rishi Wadhwani
  • James Davies

Assignees

  • COCHLEAR LIMITED

Dates

Publication Date
2026-05-05
Application Date
2023-10-30

Claims (20)

  1. A method comprising: determining an occurrence of an audible activity proximate a recipient of an auditory device using first transducer signals from one or more first transducers; determining an indication of an ability of the recipient to perceive the audible activity using second transducer signals from one or more second transducers; and generating, via the auditory device, an output reflecting the ability of the recipient to perceive the audible activity.
  2. The method of claim 1, further comprising determining an overall quality of life of the recipient based on a determined reduced ability of the recipient to perceive the audible activity.
  3. The method of claim 2, wherein determining the overall quality of life of the recipient comprises determining a plurality of performance metrics related to the recipient.
  4. The method of claim 3, wherein at least one of the plurality of performance metrics is unrelated to music.
  5. The method of claim 1, wherein the output changes one or more settings of the auditory device.
  6. The method of claim 1, further comprising: determining the occurrence of the audible activity proximate the recipient has ceased; and suspending determination of the indication of the ability of the recipient to perceive the audible activity in response to determining the occurrence of the audible activity has ceased.
  7. The method of claim 6, further comprising: identifying a visual cue; and determining the indication of the ability of the recipient to perceive the audible activity based on the visual cue.
  8. The method of claim 7, further comprising: determining a response of the recipient is caused by the visual cue rather than by the audible activity, wherein determining the indication of the ability of the recipient to perceive the audible activity based on the visual cue comprises determining a reduced ability of the recipient to perceive the audible activity in response to determining the response of the recipient is caused by the visual cue rather than by the audible activity.
  9. A method comprising: determining an occurrence of a sensory input to a recipient of a sensory device using first transducer signals from one or more first transducers; determining an indication of an ability of the recipient to perceive the sensory input using second transducer signals from one or more second transducers; and generating, with the sensory device, an output reflecting the ability of the recipient to perceive the sensory input.
  10. The method of claim 9, wherein the output comprises recommendations of one or more pieces of music relative to the ability of the recipient to perceive the sensory input.
  11. The method of claim 9, wherein determining the indication of the ability of the recipient to perceive the sensory input comprises comparing a repetitive behavior of the recipient to a threshold range.
  12. A wearable device comprising: one or more first transducers configured to detect an input signal indicative of music; and one or more second transducers configured to determine movement of a recipient of the wearable device in presence of music indicated by the input signal.
  13. The wearable device of claim 12, wherein the input signal comprises a stream of musical data.
  14. The wearable device of claim 12, wherein the input signal comprises an audio transducer signal of a sonic environment proximate the recipient.
  15. The wearable device of claim 14, further comprising a computing device configured to compare a property of the audio transducer signal to a threshold to determine the presence of music proximate the recipient.
  16. The wearable device of claim 15, wherein the property comprises a frequency, an amplitude modulation, a spectral spread, or any combination thereof.
  17. The wearable device of claim 12, wherein operation of the one or more second transducers is suspended based on an absence of the input signal indicative of music detected by the one or more first transducers.
  18. The wearable device of claim 12, wherein the movement of the recipient determined by the one or more second transducers comprises a respiratory or blood related biometric activity of the recipient.
  19. The wearable device of claim 12, wherein the movement of the recipient determined by the one or more second transducers comprises a neural activation of the recipient.
  20. A non-transitory, computer-readable medium comprising instructions that, when executed by one or more processors, are configured to cause the one or more processors to perform operations comprising: collecting transducer signals from one or more transducers associated with a recipient of an auditory device; generating an indication of a musical perception of the recipient using the transducer signals; generating an analysis of the musical perception of the recipient using the indication; and recommending one or more pieces of music to the recipient based on the musical perception of the recipient using the analysis.
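Claims 15–16 describe comparing properties of an audio transducer signal — frequency content, amplitude modulation, spectral spread — against thresholds to decide whether music is present. As a rough illustration of that idea only (not the patented implementation; the feature definitions and threshold values below are arbitrary assumptions for the sketch), such a comparison might look like:

```python
import numpy as np

def spectral_spread(signal, sample_rate):
    """Bandwidth (Hz) of the magnitude spectrum around its centroid."""
    mags = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    total = mags.sum()
    if total == 0:
        return 0.0
    centroid = (freqs * mags).sum() / total
    return float(np.sqrt((((freqs - centroid) ** 2) * mags).sum() / total))

def amplitude_modulation_depth(signal, frame=256):
    """Variation of the short-time amplitude envelope relative to its mean."""
    n = len(signal) // frame
    env = np.abs(signal[: n * frame]).reshape(n, frame).mean(axis=1)
    mean = env.mean()
    return float(env.std() / mean) if mean > 0 else 0.0

def looks_like_music(signal, sample_rate, spread_max=2000.0, am_min=0.05):
    """Illustrative rule: music tends to be tonal (energy concentrated in
    frequency, so modest spectral spread) and amplitude-modulated, unlike
    broadband noise. Thresholds here are placeholders, not from the patent."""
    return (spectral_spread(signal, sample_rate) < spread_max
            and amplitude_modulation_depth(signal) > am_min)
```

For example, a one-second amplitude-modulated 440 Hz tone passes both tests, while white noise fails the spectral-spread test (its energy is spread uniformly across the band). A deployed detector would use many more features and learned thresholds; this only shows the threshold-comparison structure the claims recite.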

Description

This is a continuation of U.S. patent application Ser. No. 17/294,529, filed May 17, 2021, which is a National Stage Entry of PCT International Patent Application No. PCT/IB2020/000481, filed Jun. 15, 2020, which claims the benefit of priority to U.S. Provisional Patent Application No. 62/862,181, filed Jun. 17, 2019, the entire disclosure of which is incorporated by reference in its entirety.

BACKGROUND

Medical devices having one or more implantable components, generally referred to herein as implantable medical devices, have provided a wide range of therapeutic benefits to recipients over recent decades. In particular, partially or fully implantable medical devices such as hearing prostheses (e.g., bone conduction devices, mechanical stimulators, cochlear implants, etc.), implantable pacemakers, defibrillators, functional electrical stimulation devices, and other implantable medical devices have been successful in performing lifesaving and/or lifestyle-enhancement functions and/or recipient monitoring for a number of years.

The types of implantable medical devices and the ranges of functions performed thereby have increased over the years. For example, many implantable medical devices now often include one or more instruments, apparatus, sensors, processors, controllers, or other functional mechanical or electrical components that are permanently or temporarily implanted in a recipient. These functional devices are typically used to diagnose, prevent, monitor, treat, or manage a disease/injury or symptom thereof, or to investigate, replace, or modify the anatomy or a physiological process. Many of these functional devices utilize power and/or data received from external devices that are part of, or operate in conjunction with, the implantable medical device.
SUMMARY

In an example, there is a computer-implemented process comprising: determining an occurrence of a musical activity proximate a recipient of an auditory prosthesis using first transducer signals from one or more first transducers; determining an indication of the recipient's ability to perceive the musical activity using second transducer signals from one or more second transducers; generating an analysis of the recipient's musical perception using the indication; and taking a treatment action relating to the recipient based on the analysis.

In another example, there is a process comprising: maintaining, at a server, a template defining an association between transducer signals and musical perception; receiving, at the server, recipient transducer signals associated with a recipient of an auditory prosthesis; using the template and the recipient transducer signals to generate an analysis of the recipient's musical perception; and taking a treatment action relating to the auditory prosthesis based on the analysis.

In another example, there is a computer-implemented process comprising: collecting transducer signals from one or more transducers associated with a recipient of an auditory prosthesis; generating indications of the recipient's musical perception using the transducer signals; generating an analysis of the recipient's musical perception using the indications; and taking a treatment action relating to the recipient using the analysis.
In another example, there is a system comprising a server comprising: a processing unit; a memory; a template stored in the memory that defines an association between transducer signals and musical perception; and instructions stored in the memory that, when executed by the processing unit, cause the processing unit to: receive, from an auditory prosthesis application, recipient transducer signals associated with a recipient of an auditory prosthesis; generate an analysis of the recipient's musical perception using the template and the recipient transducer signals; and take a treatment action relating to the auditory prosthesis based on the analysis.

BRIEF DESCRIPTION OF THE DRAWINGS

The same number represents the same element or same type of element in all drawings.

FIG. 1 illustrates an example system that includes an auditory prosthesis of a recipient and a computing device connected to a server over a network.

FIG. 2 illustrates a first example process for taking a treatment action with respect to the musical perception of a recipient of an auditory prosthesis.

FIG. 3 illustrates a second example process for taking a treatment action with respect to the musical perception of a recipient of an auditory prosthesis.

FIG. 4 illustrates a third example process for taking a treatment action with respect to the musical perception of a recipient of an auditory prosthesis.

FIG. 5 illustrates an example cochlear implant system that can benefit from the use of technologies described herein.

FIG. 6 is a view of an example percutaneous bone conduction device that can benefit from use of the technologies disclosed herein.

FIG. 7 illustrates an example transcutaneous bone conduction device having a passive implantable component that can be