US-20260127971-A1 - PERFORMANCE ANALYTICS ENGINE FOR GROUP RESPONSES

Abstract

A system including a computer server implementing a learning resource configured to monitor user interactions with the learning resource and encode, based on the user interactions, a user event. The system includes a computer server implementing an event processor. The event processor is configured to receive the user event from the computer server; parse the user event to determine the identifications of the user generating the user event, the assessment item, and the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer; and store, in an analytics storage database, a data record including the identification of the user generating the user event, the assessment item, the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer.
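The data record described in the abstract can be sketched as a small parsing step. This is an illustrative sketch only; the field names (`user_id`, `assessment_item_id`, and so on) are hypothetical and not taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class UserEvent:
    """One parsed user event, as stored in the analytics storage database.
    Field names are illustrative, not prescribed by the specification."""
    user_id: str             # identification of the user generating the event
    assessment_item_id: str  # identification of the assessment item
    resource_id: str         # identification of the learning resource
    correct: bool            # whether the event is a correct or incorrect answer

def parse_event(raw: dict) -> UserEvent:
    # Parse a raw event payload into the data record to be stored.
    return UserEvent(
        user_id=raw["user_id"],
        assessment_item_id=raw["assessment_item_id"],
        resource_id=raw["resource_id"],
        correct=bool(raw["correct"]),
    )
```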

Inventors

  • Stephen Carroll
  • Brian Dailey
  • Emilia Pankowska
  • Jennifer Arlene Coleman
  • Zachary Elewitz

Assignees

  • PEARSON EDUCATION, INC.

Dates

Publication Date
2026-05-07
Application Date
2025-10-13

Claims (20)

  1-18. (canceled)
  19. An electronic device comprising a software application running on a processor, the processor to: receive, with a queue intake, user events from a plurality of resources, each of the plurality of resources implemented by a computer server; route the user events from the queue intake to a sorting entity, each user event including data values; parse, with a challenging content module, the data values in each user event, the challenging content module communicatively coupled to the sorting entity and an event storage database to store the parsed data values; receive a request for challenging content from a first resource of the plurality of resources, and in response to receiving the request, the processor is to: determine, from the request, a first assessment item and retrieve from the event storage database a first set of data values associated with the first assessment item; determine a percentage of the first set of data values associated with a correct answer; determine a threshold percentage, and identify the first assessment item as challenging content when the percentage is below the threshold percentage; and generate outputs of the challenging content, each of the outputs uniquely generated to be usable by each of the plurality of resources.
  20. The electronic device of claim 1, wherein a report engine generates the output and a report storage database stores the output, the report storage database being different from the event storage database.
  21. The electronic device of claim 1, wherein the queue intake receives unstructured data from a first of the plurality of resources and receives structured data from a second of the plurality of resources.
  22. The electronic device of claim 3, wherein the structured and unstructured data are simultaneously received in real-time.
  23. The electronic device of claim 1, wherein output generated for the first resource is unstructured and output generated for a second resource of the plurality of resources is structured.
  24. The electronic device of claim 1, wherein the data values comprise an item identification, an assessment identification, a class identification, a user identification, a date and time, an answer identification, a correct status, and a resource identification associated with the user event.
  25. The electronic device of claim 1, wherein the processor includes an analytics report engine to: retrieve data used to generate the outputs; compile the outputs; and transmit the outputs to the plurality of resources.
  26. The electronic device of claim 1, wherein a duplicate of each output is stored in a report storage database, and subsequent outputs are compared to the outputs stored in the report storage database.
  27. A method comprising: receiving, with a queue intake, user events that include data values, the user events being received from two or more resources implemented by one or more computer servers; routing the user events from the queue intake to a sorting entity; parsing, with a challenging content module, the data values in each user event; storing, with an event storage database, parsed data values from the challenging content module; receiving a request for challenging content from a first resource of the two or more resources, and in response to receiving the request: determining from the request a first assessment item and retrieving, from the event storage database, a first set of data values associated with the first assessment item; determining a percentage of the first set of data values associated with a correct answer and a threshold percentage; identifying the first assessment item as challenging content when the percentage is below the threshold percentage; and generating outputs of the challenging content, each of the outputs uniquely generated to be usable by a corresponding resource of the two or more resources.
  28. The method of claim 9, further comprising ranking each data value in a user event.
  29. The method of claim 10, further comprising filtering data values based on specific content ranking requirements.
  30. The method of claim 11, wherein the specific content ranking requirements include one or more of an aggregation level, individual learner's aggregation context, challenging items, and threshold setting.
  31. The method of claim 11, further comprising returning only data values above a given rank score.
  32. The method of claim 9, wherein receiving the user events comprises receiving user events in two or more different formats.
  33. The method of claim 14, wherein receiving the user events comprises receiving a first user event with structured data and a second user event with unstructured data.
  34. A non-transitory tangible computer-readable medium comprising instructions that, when executed, cause a processor of an electronic device to: receive a first user event from a first resource implemented by a first computer server and a second user event from a second resource implemented by a second computer server, wherein the first and second user events are received in different formats; parse the data values in each user event, and store the parsed data values in an event storage database; receive a request for challenging content from the first resource, and in response to receiving the request, the processor is to: retrieve a first set of data values corresponding to the first resource; determine a percentage of the first set of data values associated with a correct answer and determine a threshold percentage; and identify the first assessment item as challenging content when the percentage is below the threshold percentage; and generate a first report of challenging content uniquely generated to be usable by the first resource and a second report of challenging content uniquely generated to be usable by the second resource.
  35. The non-transitory tangible computer-readable medium of claim 16, wherein the first report is to generate a dashboard in the first resource, the dashboard comprising: a list of assignments; an indicator indicating one or more challenging assessment items are identified in an assignment; and a pop-up, responsive to selecting the assignment with one or more challenging assessment items, to identify content identified as challenging.
  36. The non-transitory tangible computer-readable medium of claim 16, wherein a duplicate of each report is stored in a report storage database, and subsequent reports include a comparison between each subsequent report and the reports stored in the report storage database.
  37. The non-transitory tangible computer-readable medium of claim 16, wherein the first report is to automatically generate a dashboard to identify challenging content in the first resource, and the second report is to automatically generate a dashboard to identify challenging content in the second resource.
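The challenging-content test recited in claims 19, 27, and 34 (flag an assessment item when the percentage of responses with a correct answer falls below a threshold percentage) can be sketched as follows. This is an illustrative sketch only; the function name, the record shape, and the default threshold are hypothetical, and the claims do not prescribe any particular implementation.

```python
def is_challenging(events: list[dict], threshold_pct: float = 60.0) -> bool:
    """Return True when the percentage of events with a correct answer
    falls below the threshold percentage.

    events: parsed data values for one assessment item, each carrying a
    boolean 'correct' status (the claims' correct/incorrect indication).
    threshold_pct: the threshold percentage (hypothetical default).
    """
    if not events:
        return False  # no responses recorded for this item
    pct_correct = 100.0 * sum(e["correct"] for e in events) / len(events)
    return pct_correct < threshold_pct
```

With one correct answer out of three responses (about 33%), the item falls below a 60% threshold and is flagged as challenging content.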

Description

RELATED APPLICATIONS

This application is a continuation application of U.S. application Ser. No. 17/130,924, filed Dec. 22, 2020, the entire content of which is incorporated herein by reference.

FIELD OF THE INVENTION

This disclosure relates to the field of systems and methods configured to process user interaction events across a platform of systems and learning resources to generate performance metrics for item responses generated by groups of users.

SUMMARY OF THE INVENTION

The present invention provides systems and methods comprising one or more server hardware computing devices or client hardware computing devices, communicatively coupled to a network, and each comprising at least one processor executing specific computer-executable instructions within a memory.

An embodiment of the present invention includes a system including an analytics storage database and a plurality of computer servers. Each computer server of the plurality of computer servers implements a learning resource. Each learning resource is configured to monitor user interactions with the learning resource and encode, based on the user interactions, user events, each user event including identifications of the user generating the user event, an assessment item, and the learning resource, and including an indication of whether the user event is associated with a correct answer or an incorrect answer. The system includes a computer server implementing an event processor. The event processor is configured to receive, from the plurality of computer servers, a plurality of user events and, for each user event, parse the received user event to determine the identifications of the user generating the user event, the assessment item, and the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer.

The event processor is configured to store, in the analytics storage database, a data record including the identification of the user generating the user event, the assessment item, the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer; receive, from a first learning resource, a request to generate an analytics report; determine, from the request, a first assessment item; retrieve, from the analytics storage database, a first set of data records associated with the first assessment item; determine a percentage of data records in the first set of data records associated with a correct answer; determine that the percentage of data records falls below a threshold percentage; and transmit to the first learning resource a report indicating that the first assessment item is associated with challenging content.

Another embodiment includes a system including a computer server implementing a learning resource configured to monitor user interactions with the learning resource and encode, based on the user interactions, a user event including identifications of the user generating the user event, an assessment item, and the learning resource, and including an indication of whether the user event is associated with a correct answer or an incorrect answer. The system includes a computer server implementing an event processor. The event processor is configured to receive the user event from the computer server; parse the user event to determine the identifications of the user generating the user event, the assessment item, and the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer; and store, in an analytics storage database, a data record including the identification of the user generating the user event, the assessment item, the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer.

An embodiment includes a method including receiving, from a learning resource, a user event; parsing the user event to determine identifications of the user generating the user event, an assessment item, and a learning resource, and an indication of whether the user event is associated with a correct answer or an incorrect answer; and storing, in an analytics storage database, a data record including the identification of the user generating the user event, the assessment item, the learning resource, and the indication of whether the user event is associated with a correct answer or an incorrect answer.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a system level block diagram for a non-limiting example of a distributed computing environment that may be used in practicing the invention. FIG. 2 illustrates a system level block diagram for an illustrative computer system that may be used in practicing the invention. FIG. 3 illustrates a block diagram depicting functional components of the present system. FIG. 4 is a flowchart depicting a method for receiving and processing user event reports f
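The end-to-end flow described in the summary (receive events from learning resources, parse and store data records, then answer a per-item report request against a threshold) can be sketched as a single class. This is a minimal sketch under assumed names; the queue, the in-memory list standing in for the analytics storage database, the record fields, and the 60% default threshold are all hypothetical.

```python
from collections import deque

class EventProcessor:
    """Minimal sketch of the summary's event processor (names hypothetical)."""

    def __init__(self, threshold_pct: float = 60.0):
        self.queue = deque()              # queue intake for incoming user events
        self.store = []                   # stands in for the analytics storage database
        self.threshold_pct = threshold_pct

    def receive(self, event: dict) -> None:
        # Receive a user event from a learning resource.
        self.queue.append(event)

    def drain(self) -> None:
        # Parse each queued event and store the resulting data record.
        while self.queue:
            raw = self.queue.popleft()
            self.store.append({
                "user_id": raw["user_id"],
                "item_id": raw["item_id"],
                "resource_id": raw["resource_id"],
                "correct": bool(raw["correct"]),
            })

    def report(self, item_id: str) -> dict:
        # Answer an analytics-report request for one assessment item:
        # retrieve its records, compute the percentage correct, and flag
        # the item as challenging when it falls below the threshold.
        records = [r for r in self.store if r["item_id"] == item_id]
        pct = (100.0 * sum(r["correct"] for r in records) / len(records)
               if records else 0.0)
        return {
            "item_id": item_id,
            "pct_correct": pct,
            "challenging": bool(records) and pct < self.threshold_pct,
        }
```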