
US-12619645-B2 - Pupil dynamics entropy and task context for automatic prediction of confidence in data

US 12619645 B2

Abstract

A pilot monitoring system receives data of a pilot's pose, such as arm/hand positions and eyes, to detect the pilot's gaze and pupil dynamics. Coupled with knowledge about the current task, the system detects what the pilot is paying attention to and temporally predicts what the pilot may do next. The system may use interactions between the pilot and the instrumentation to estimate a probability distribution of the pilot's next intention. That probability distribution may subsequently be used to evaluate the performance, or the training effectiveness and readiness, of the pilot. The system determines data that will be necessary for a later pilot action based on the probability distribution, and compiles that data from avionics systems for later display.
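The title's "pupil dynamics entropy" can be illustrated with a toy measure: estimate the Shannon entropy of a pupil-diameter time series from a simple histogram, then flag departures from a task baseline. The histogram binning, the `bins` count, and the `tolerance` threshold below are illustrative assumptions, not parameters from the disclosure:

```python
import math
from collections import Counter

def pupil_entropy(diameters, bins=10):
    """Shannon entropy (in bits) of a pupil-diameter time series,
    estimated from a histogram of binned samples."""
    lo, hi = min(diameters), max(diameters)
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant input
    counts = Counter(min(int((d - lo) / width), bins - 1) for d in diameters)
    n = len(diameters)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def is_anomalous(sample, baseline_entropy, tolerance=0.5):
    """Flag pupil behavior whose entropy deviates from the task
    baseline by more than `tolerance` bits (hypothetical threshold)."""
    return abs(pupil_entropy(sample) - baseline_entropy) > tolerance
```

A steady pupil trace yields zero entropy, so a highly variable trace measured against a calm baseline would be flagged; a deployed system would presumably learn such thresholds per pilot and task during the claimed initial calibration.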

Inventors

  • Peggy Wu

Assignees

  • ROCKWELL COLLINS, INC.

Dates

Publication Date
2026-05-05
Application Date
2024-12-17

Claims (9)

  1. A computer apparatus comprising: at least one camera; and at least one processor in data communication with a memory storing processor executable code; wherein the processor executable code configures the at least one processor to instantiate a trained neural network to: perform an initial calibration to determine a baseline behavior of a pilot for a task; receive an image stream from the at least one camera; receive data being displayed to the pilot; process the image stream to identify eye tracking data including pupil dynamics and eyelid position; determine a pilot pose estimate based on the eye tracking data including pupil dynamics identified in the image stream; correlate the pilot pose estimate to the data being displayed, the data being displayed comprising instrument readings and alerts associated with the task; compare the baseline behavior to subsequent pupil behavior to detect anomalous behavior representative of pilot skepticism with respect to the displayed data; determine a pilot confidence level based on the pilot pose estimate and the anomalous behavior; and when the pilot confidence level is below a predetermined threshold, retrieve supplemental data, comprising a source for the displayed data and metadata of the corresponding source, and display the supplemental data to the pilot.
  2. The computer apparatus of claim 1, wherein the pose estimate comprises at least a pilot eye movement, a pilot gaze, a pilot eye lid position, a pilot hand and arm position, and a pilot hand and arm movement.
  3. The computer apparatus of claim 1, wherein behavior representative of pilot skepticism comprises dwell time and characteristic eye lid position.
  4. A method for monitoring pilot behavior via a trained neural network, the method comprising: performing an initial calibration to determine a baseline behavior of a pilot for a task; receiving an image stream from at least one camera; receiving data being displayed to the pilot; processing the image stream to identify eye tracking data including pupil dynamics and eyelid position; determining a pilot pose estimate based on the eye tracking data including pupil dynamics identified in the image stream; correlating the pilot pose estimate to the data being displayed, the data being displayed comprising instrument readings and alerts associated with the task; comparing the baseline behavior to subsequent pupil behavior to detect anomalous behavior representative of pilot skepticism with respect to the displayed data; determining a pilot confidence level based on the pilot pose estimate and the anomalous behavior; and when the pilot confidence level is below a predetermined threshold, retrieving supplemental data, comprising a source for the displayed data and metadata of the corresponding source, and displaying the supplemental data to the pilot.
  5. The method of claim 4, wherein the pose estimate comprises at least a pilot eye movement, a pilot gaze, a pilot eye lid position, a pilot hand and arm position, and a pilot hand and arm movement.
  6. The method of claim 4, wherein behavior representative of pilot skepticism comprises dwell time and characteristic eye lid position.
  7. A pilot monitoring system comprising: at least one camera; and at least one processor in data communication with a memory storing processor executable code; wherein the processor executable code configures the at least one processor to instantiate a trained neural network to: perform an initial calibration to determine a baseline behavior of a pilot for a task; receive an image stream from the at least one camera; receive data being displayed to the pilot; process the image stream to identify eye tracking data including pupil dynamics and eyelid position; determine a pilot pose estimate based on the eye tracking data including pupil dynamics identified in the image stream; correlate the pilot pose estimate to the data being displayed, the data being displayed comprising instrument readings and alerts associated with the task; compare the baseline behavior to subsequent pupil behavior to detect anomalous behavior representative of pilot skepticism with respect to the displayed data; determine a pilot confidence level based on the pilot pose estimate and the anomalous behavior; and when the pilot confidence level is below a predetermined threshold, retrieve supplemental data, comprising a source for the displayed data and metadata of the corresponding source, and display the supplemental data to the pilot.
  8. The pilot monitoring system of claim 7, wherein the pose estimate comprises at least a pilot eye movement, a pilot gaze, a pilot eye lid position, a pilot hand and arm position, and a pilot hand and arm movement.
  9. The pilot monitoring system of claim 7, wherein behavior representative of pilot skepticism comprises dwell time and characteristic eye lid position.
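The final limitation of claims 1, 4, and 7 is a threshold gate: when the inferred pilot confidence falls below a predetermined threshold, the source of the displayed data and that source's metadata are retrieved for display. A minimal sketch of that gate follows; the `DisplayedDatum` fields, the field names, and the 0.6 threshold are hypothetical stand-ins for the avionics data and predetermined threshold the claims recite:

```python
from dataclasses import dataclass, field

@dataclass
class DisplayedDatum:
    """One displayed instrument reading (illustrative structure)."""
    value: float
    source: str                    # e.g. originating sensor or avionics bus
    metadata: dict = field(default_factory=dict)  # e.g. sample time, units

def supplemental_display(confidence, datum, threshold=0.6):
    """When pilot confidence is below the threshold, return the
    supplemental data (source and its metadata) to be displayed;
    otherwise return None and leave the display unchanged."""
    if confidence < threshold:
        return {"source": datum.source, "metadata": datum.metadata}
    return None
```

The point of the gate is that supplemental provenance is shown only when skepticism is detected, so a confident pilot is not distracted by source detail they did not ask for.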

Description

The present application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional App. No. 63/612,156 (filed Dec. 19, 2023), which is incorporated herein by reference.

BACKGROUND

Operators utilizing augmented data visualizations may not have the time or cognitive capacity to review the raw data that data aggregations and visualizations are designed to represent. In highly complex tasks, this may lead to the operator lacking confidence in the visualizations. This can result in time and resources wasted by the operator to verify data sources and calculations that have already been performed, taking time away from more mission-relevant tasks. Consequently, it would be advantageous if an apparatus existed that is suitable for monitoring a pilot's situational awareness and determining a pilot's future intentions.

SUMMARY

In one aspect, embodiments of the inventive concepts disclosed herein are directed to a pilot monitoring system that receives data of a pilot's pose, such as arm/hand positions and eyes, to detect the pilot's gaze and pupil dynamics, coupled with knowledge about the current task to detect what the pilot is paying attention to and temporally predict what the pilot may do next. In a further aspect, the system may use interactions between the pilot and the instrumentation to estimate a probability distribution of the pilot's next intention. That probability distribution may subsequently be used to evaluate the performance or training effectiveness and readiness of the pilot. In a further aspect, the system determines data that will be necessary for a later pilot action based on the probability distribution, and compiles that data from avionics systems for later display. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and should not restrict the scope of the claims.
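The summary's probability distribution over the pilot's next intention could, under a deliberately simplified assumption, be estimated from the history of pilot-instrument interactions. The toy bigram model below and the `min_p` prefetch cutoff are illustrative stand-ins for the trained neural network the disclosure contemplates; the action names are invented for the example:

```python
from collections import Counter

def next_intention_distribution(history):
    """Estimate P(next action | last action) from an observed sequence
    of pilot-instrument interactions (a toy bigram frequency model)."""
    transitions = Counter(zip(history, history[1:]))  # (prev, next) pairs
    totals = Counter(history[:-1])                    # times each action preceded another
    last = history[-1]
    return {nxt: n / totals[prev]
            for (prev, nxt), n in transitions.items() if prev == last}

def prefetch_targets(dist, min_p=0.25):
    """Actions probable enough that their supporting avionics data
    should be compiled now for later display."""
    return sorted(action for action, p in dist.items() if p >= min_p)
```

In this framing, "compiling data from avionics systems for later display" reduces to prefetching the data associated with every sufficiently probable next action, so it is already on hand when the predicted interaction occurs.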
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments of the inventive concepts disclosed herein and, together with the general description, serve to explain the principles.

BRIEF DESCRIPTION OF THE DRAWINGS

The numerous advantages of the embodiments of the inventive concepts disclosed herein may be better understood by those skilled in the art by reference to the accompanying figures, in which: FIG. 1 shows a block diagram of a system suitable for implementing embodiments of the inventive concepts disclosed herein; FIG. 2 shows a flowchart of an exemplary embodiment of the inventive concepts disclosed herein; and FIG. 3 shows a block diagram of a neural network according to an exemplary embodiment of the inventive concepts disclosed herein.

DETAILED DESCRIPTION

Before explaining various embodiments of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
As used herein, a letter following a reference numeral is intended to reference an embodiment of a feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b). Such shorthand notations are used for purposes of convenience only and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary. Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present). In addition, use of "a" or "an" is employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts, and "a" and "an" are intended to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.