US-12625555-B1 - Systems for determining user breath movements and synchronizing output based on breath movements

US 12625555 B1

Abstract

While a user performs an activity, a photoplethysmograph (PPG) sensor or other type of sensor is secured to the user's torso. Movement of the user's torso due to breathing changes the distance between the torso and the sensor, affecting an amplitude of a signal determined by the sensor. The amplitude is used to determine the position and rate of movement of the user's torso, which may indicate breath movements of the user. A camera acquires video data representing the user performing the activity. The position of the user's body and the portion of the activity being performed are determined based on the video data. An expected breath movement for the portion of the activity is determined. Corrective instructions or confirmations are output to the user based on the expected breath movement and the user's breath movement that was determined using the sensor.
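The abstract's core idea, that deviations in signal amplitude track the torso's position and direction of movement, can be illustrated with a toy classifier. This is a minimal sketch, not the patent's implementation; the assumption that amplitude rises as the torso moves toward the sensor (and the `breath_phases` function itself) is hypothetical and depends on how the sensor is mounted.

```python
def breath_phases(amplitudes, baseline):
    """Label each amplitude sample as inhale, exhale, or hold.

    Toy model: assumes reflected amplitude increases as the torso moves
    toward the sensor, so a rising amplitude is labeled an inhale and a
    falling amplitude an exhale. Real systems would filter noise first.
    """
    phases = []
    prev = baseline
    for amplitude in amplitudes:
        if amplitude > prev:
            phases.append("inhale")
        elif amplitude < prev:
            phases.append("exhale")
        else:
            phases.append("hold")
        prev = amplitude
    return phases
```

A smoothing or peak-detection stage would precede this in practice, since raw PPG amplitudes also carry cardiac pulsation and motion artifacts.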

Inventors

  • Ilia Vitsnudel

Assignees

  • AMAZON TECHNOLOGIES, INC.

Dates

Publication Date
2026-05-12
Application Date
2021-09-17

Claims (14)

  1. A system comprising: a camera; a photoplethysmograph (PPG) sensor comprising a light source and a detector; one or more memories storing computer-executable instructions; and one or more hardware processors to execute the computer-executable instructions to: acquire video data using the camera, wherein the video data represents a user performing an activity; determine a pose of the user based on the video data; determine a portion of the activity that corresponds to the pose; determine, based on activity data that associates expected breath movements with portions of the activity, an expected breath movement that corresponds to the portion of the activity; determine a signal using the PPG sensor, wherein the signal represents light from the light source reflected by a torso of a body of the user; determine, based on an amplitude of the signal, a breath movement of the user; determine correspondence between the breath movement of the user and the expected breath movement that corresponds to the portion of the activity; and present an instruction indicative of the correspondence between the breath movement of the user and the expected breath movement.
  2. The system of claim 1, the one or more hardware processors to further execute the computer-executable instructions to: determine a portion of the pose that indicates a position of the torso of the body of the user, wherein the breath movement is further determined based at least in part on the position of the torso of the body of the user.
  3. The system of claim 1, the one or more hardware processors to further execute the computer-executable instructions to: before the signal is determined using the PPG sensor, cause the light source to emit a first quantity of light; determine a difference between a second quantity of light reflected by the torso of the body of the user and a threshold quantity of light; cause the light source to emit a third quantity of light based on the difference; and determine that a fourth quantity of light reflected by the torso of the body of the user is within a threshold value of the threshold quantity of light.
  4. A method comprising: acquiring video data using a camera, wherein the video data represents a user performing an activity; determining, based on the video data, a portion of the activity being performed by the user; determining, based on first activity data that associates expected breath movements with portions of the activity, an expected breath movement that corresponds to the portion of the activity being performed by the user; determining a first signal using a first sensor, wherein the first signal is emitted from a first source secured relative to a body of the user and is at least partially reflected by a first portion of the body of the user; determining, based on a first amplitude of the first signal, a first breath movement of the user during performance of the activity; determining a first correspondence between the first breath movement of the user and the expected breath movement that corresponds to the portion of the activity being performed by the user; and presenting an output indicative of the first correspondence between the first breath movement of the user and the expected breath movement.
  5. The method of claim 4, further comprising: determining a pose of the user based on the video data, wherein the pose represents a position of the body of the user; wherein the expected breath movement is determined based in part on the pose of the user.
  6. The method of claim 4, further comprising: determining a pose of the user based on the video data, wherein the pose represents a position of the body of the user; wherein the portion of the activity being performed by the user corresponds to the pose.
  7. The method of claim 4, further comprising: determining one or more characteristics of the first signal that is at least partially reflected by the first portion of the body of the user; and determining one or more physiological values associated with the user based on the one or more characteristics of the first signal; wherein the output is determined based in part on the one or more physiological values.
  8. The method of claim 4, wherein the first signal represents first light from a first light source that is secured relative to the body of the user and is at least partially reflected by the first portion of the body of the user, the method further comprising: before determining the first signal, causing the first light source to emit a first quantity of light; determining a first value associated with at least a portion of the first quantity of light reflected by the body of the user; determining a difference between the first value and a threshold value; and based on the difference, causing the first light source to emit a second quantity of light that differs from the first quantity of light, wherein the first signal is acquired based on the second quantity of light.
  9. The method of claim 4, further comprising: determining a second signal using a second sensor associated with the user; determining a characteristic of the activity; and determining, based on second activity data that associates accuracy of signals with characteristics of activities, that the characteristic corresponds to the first signal; wherein the first breath movement is determined based on the first signal in response to a second correspondence between the second activity data and the characteristic.
  10. A system comprising: a camera; a first sensor; one or more memories storing computer-executable instructions; and one or more hardware processors to execute the computer-executable instructions to: acquire video data using the camera, wherein the video data represents a user performing an activity; determine, based on the video data, a portion of the activity being performed by the user; determine an expected breath movement that corresponds to the portion of the activity being performed by the user; determine a first signal using the first sensor, wherein the first signal is one or more of emitted from or reflected by a first portion of a body of the user; determine, based on a first characteristic of the first signal, a first breath movement of the user during performance of the activity; determine a correspondence between the first breath movement of the user and the expected breath movement that corresponds to the portion of the activity being performed by the user; and present an output indicative of the correspondence between the first breath movement of the user and the expected breath movement.
  11. The system of claim 10, wherein the first sensor comprises a photoplethysmograph (PPG) sensor that emits first light, and the one or more hardware processors to further execute the computer-executable instructions to: determine one or more characteristics of the first light that is at least partially reflected by the body of the user; and determine one or more physiological values associated with the user based on the one or more characteristics.
  12. The system of claim 10, wherein the first characteristic of the first signal includes an amplitude.
  13. The system of claim 10, the one or more hardware processors to further execute the computer-executable instructions to: based on the video data, determine a position of one or more of the first portion or a second portion of the body of the user; wherein the expected breath movement is determined based on the position of the one or more of the first portion or the second portion of the body of the user determined using the video data.
  14. The system of claim 10, the one or more hardware processors to further execute the computer-executable instructions to: based on the video data, determine a position of the body of the user; determine the portion of the activity that corresponds to one or more of the position of the body of the user or second output indicative of the activity; and determine the expected breath movement that corresponds to the portion of the activity being performed by the user.
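Claims 3 and 8 recite a calibration step in which the light source's emission is adjusted until the reflected quantity falls within a threshold of a target value. The following is a minimal sketch of one such feedback loop, under stated assumptions: the `sensor` interface (`emit`, `read_reflected`), the proportional adjustment rule, and all parameter names are hypothetical illustrations, not the claimed implementation.

```python
def calibrate_light_source(sensor, target, tolerance, max_steps=20):
    """Adjust emitted light until the reflected quantity is near a target.

    `sensor` is a hypothetical interface exposing emit(quantity) and
    read_reflected(); real PPG hardware APIs will differ. Returns the
    emission quantity at which the reflected reading was within
    `tolerance` of `target`.
    """
    quantity = 1.0  # initial emission, arbitrary units (claim 3's "first quantity")
    for _ in range(max_steps):
        sensor.emit(quantity)
        reflected = sensor.read_reflected()
        if abs(reflected - target) <= tolerance:
            return quantity  # reflected light is within the threshold value
        # Scale emission proportionally to close the gap, analogous to
        # emitting a "third quantity of light based on the difference".
        quantity *= target / max(reflected, 1e-9)
    raise RuntimeError("calibration did not converge")
```

With a linear reflectance model this converges in one correction step; real tissue reflectance is nonlinear and noisy, so a production loop would damp the adjustment and average several readings.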

Description

BACKGROUND

Video and audio instruction may be used to assist a user when performing a variety of activities, such as fitness exercises. In some cases, input from cameras and other sensors may be used to determine characteristics of the user's performance, which may affect the specific instruction that is presented. Determining the breath movements of the user during performance of the activity may provide information regarding the user's performance and useful output that may be presented. However, user motion during various activities may cause many methods for determining the user's breath movements to be inaccurate, or in some cases impossible.

BRIEF DESCRIPTION OF FIGURES

The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.

FIG. 1 depicts an implementation of a system for determining breath movements based on signals from a sensor and determining outputs based on expected breath movements.

FIG. 2 is a block diagram depicting an implementation of a system for determining breath movements based on sensor data from one or more sensors and determining outputs associated with the breath movements.

FIG. 3 is a diagram depicting an implementation of a process for calibrating a sensor based on a threshold amplitude for signals and using deviations from the threshold amplitude to determine breath movements of a user.

FIG. 4 is a graph depicting example signals determined using multiple sensors over a period of time.

FIG. 5 is a diagram depicting an implementation of a process for determining breath movements of a user based on sensor data from one or more sensors.

FIG. 6 is a block diagram depicting an implementation of a computing device within the present disclosure.
While implementations are described in this disclosure by way of example, those skilled in the art will recognize that the implementations are not limited to the examples or figures described. It should be understood that the figures and detailed description thereto are not intended to limit implementations to the particular form disclosed but, on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope as defined by the appended claims. The headings used in this disclosure are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word “may” is used in a permissive sense (i.e., meaning having the potential to) rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean “including, but not limited to”.

DETAILED DESCRIPTION

Video and audio instruction may be used to assist users when performing various activities. For example, a video may depict an instructor performing a fitness exercise while providing verbal instruction. A user may attempt to perform the exercise by duplicating the visible movements of the instructor and following audible instructions. In some cases, data regarding performance of an activity by a user may be acquired and used to provide feedback or additional instruction. For example, a user may perform a fitness exercise within a field of view of a camera, and video data acquired by the camera may be used to determine the positions and movements of the user's body. The determined positions and movements may be compared to known positions and movements that represent correct performance of the activity, and output may be presented that indicates correct or incorrect performance, such as encouragement, corrective instruction, a score or other type of rating, and so forth.
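The comparison of determined positions to known positions that represent correct performance can be sketched as a distance check over body keypoints. This is an illustrative simplification only: the keypoint dictionary format, the mean-distance metric, the `threshold` value, and the feedback strings are all assumptions, not details taken from the disclosure.

```python
import math

def pose_error(detected, reference):
    """Mean Euclidean distance between corresponding (x, y) keypoints.

    `detected` and `reference` are hypothetical dicts mapping a joint
    name (e.g. "shoulder") to normalized (x, y) coordinates.
    """
    distances = [math.dist(detected[name], reference[name]) for name in reference]
    return sum(distances) / len(distances)

def performance_feedback(detected, reference, threshold=0.1):
    """Return encouraging or corrective output based on pose error."""
    if pose_error(detected, reference) <= threshold:
        return "correct performance"
    return "adjust your position"
```

A production system would compare sequences of poses over time rather than a single frame, and would scale the threshold to the user's body size.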
In some cases, proper performance of an activity may include use of certain breath movements (e.g., specific times for a user to inhale, exhale, or hold their breath, rates of inhalation and exhalation (e.g., velocity of airflow), breathing rates, volume of air to inhale and exhale, and so forth). In other cases, certain breath movements may facilitate performance of an activity, such as timing inhalation and exhalation to coincide with certain portions of a weight training exercise or maintaining a target breath rate while performing aerobic exercise. However, determination of breath movements during performance of certain activities may be subject to inaccuracy. For example, video data acquired using a camera may be used to determine movements of a user's chest and abdomen that may indicate breathing. However, if the activity performed by the user involves significant motion, determination of the smaller movements of the chest and abdomen that correspond to breathing may be inaccurate, or determination of such movements may not be possible. Similarly, an accelerometer or other mo