
US-12626049-B2 - System and method for automatic summarization in interlocutor turn-based electronic conversational flow

US12626049B2

Abstract

A computer-implemented method and computer system improvements for automatically generating a summary note related to a digitally-recorded interlocutor conversation, such as a chat transcript, including accessing a data corpus having at least one digitally-recorded conversation of text-based interlocutory conversation(s) including at least one labeled value, extracting at least one Conversation Feature from the labeled value(s), creating at least one Summary Feature from the extracted Conversation Feature(s), generating at least one Summary Note by combining one or more Narrative Structures with the Summary Feature(s), and digitally outputting the at least one Summary Note.
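The abstract's pipeline (access a labeled corpus, extract Conversation Features, create Summary Features, combine them with a Narrative Structure, output a Summary Note) can be sketched as follows. This is a minimal, hypothetical illustration: the function names, the `NarrativeStructure` dataclass, and the toy labeled conversation are assumptions made for demonstration, not the patent's actual implementation.

```python
# Hypothetical sketch of the claimed summarization pipeline; all names
# below are illustrative, not taken from the patent's implementation.
from dataclasses import dataclass


@dataclass
class NarrativeStructure:
    # Per claims 11-12, a Narrative Structure may contribute a preamble
    # and/or a postlude around the Summary Features.
    preamble: str
    postlude: str


def extract_conversation_features(conversation):
    # Pull the labeled values (e.g. Client Intent, Topic, Outcome)
    # from each labeled turn of the digitally-recorded conversation.
    return [turn["label"] for turn in conversation if "label" in turn]


def create_summary_features(conversation_features):
    # Trivial pass-through here; a real system would merge, deduplicate,
    # and rank the disjointed phrases into coherent Summary Features.
    return list(conversation_features)


def generate_summary_note(summary_features, structure):
    # Combine the Summary Features with the Narrative Structure's
    # preamble and postlude to form the Summary Note.
    body = "; ".join(summary_features)
    return f"{structure.preamble} {body}. {structure.postlude}"


conversation = [
    {"speaker": "client", "text": "My card was declined", "label": "report declined card"},
    {"speaker": "agent", "text": "Let me check that", "label": "verify account"},
    {"speaker": "agent", "text": "All set", "label": "issue resolved"},
]

features = extract_conversation_features(conversation)
summary_features = create_summary_features(features)
note = generate_summary_note(
    summary_features,
    NarrativeStructure(preamble="Customer contacted support:", postlude="Logged to CRM."),
)
print(note)
```

A system per the claims would additionally pass the assembled note through a generative language model to resolve pronouns, tense, and aspect before output.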

Inventors

  • David John Attwater
  • Pedro Vale Lima
  • Jonathan E. Eisenzopf

Assignees

  • discourse.ai, Inc.

Dates

Publication Date
2026-05-12
Application Date
2022-01-29

Claims (17)

  1. A method implemented on a computing device for automatically generating a computer-based conversation summary note related to a digitally-recorded interlocutor conversation, the method comprising: accessing, by a computer processor, a data corpus having a plurality of digitally-recorded conversations of text-based interlocutory conversations; extracting, by a computer system, one or more Conversation Features from one or more regions of the digitally-recorded interlocutory conversations; creating, by a computer system, one or more Summary Features from the one or more extracted Conversation Features, wherein the one or more Conversation Features comprise disjointed phrases; generating and transforming, by a computer system, at least one Summary Note by combining one or more Narrative Structures with the one or more Summary Features, wherein the transforming employs a generative language model which preserves meaning while resolving one or more narrative elements selected from the group consisting of pronouns, tense, and aspect; and digitally outputting, by the computer system, the at least one Summary Note.
  2. The method of claim 1 wherein the digitally-recorded conversations of text-based interlocutory conversations comprise at least one labeled value.
  3. The method of claim 2 wherein the extracting of one or more Conversation Features further comprises extracting from at least one labeled value.
  4. The method of claim 2 wherein the at least one labeled value comprises one or more labels selected from the group consisting of a Client Intent, an Agent Intent, a Topic, a Turn Group Purpose, and an Outcome.
  5. The method of claim 2 wherein the at least one labeled value comprises one or more value labels selected from the group consisting of a Goal, an Action, an Object, and a Property.
  6. The method of claim 1 further comprising transforming, by the computer system, the Summary Note to improve readability, agreement, and preferred format.
  7. The method of claim 1 wherein the creating of the one or more Summary Features comprises, at least in part, automatically creating at least one Summary Feature.
  8. The method of claim 1 wherein the creating of the one or more Summary Features comprises, at least in part, automatically creating at least one Summary Feature from a Summary Feature received from a user console.
  9. The method of claim 1 further comprising receiving one or more modification commands from a user console to change a Summary Feature.
  10. The method of claim 1 wherein the generating of at least one Summary Note further comprises combining with at least one Narrative Structure.
  11. The method of claim 10 wherein the combining with at least one Narrative Structure comprises appending a preamble.
  12. The method of claim 10 wherein the combining with at least one Narrative Structure comprises combining with a postlude.
  13. The method of claim 1 wherein the digitally outputting of the at least one Summary Note comprises storing the Summary Note to a Customer Relationship Management Database.
  14. The method of claim 1 wherein the digitally outputting of the at least one Summary Note comprises printing the Summary Note to a printer.
  15. The method of claim 1 wherein the digitally outputting of the at least one Summary Note comprises displaying the Summary Note on a user console.
  16. A computer program product for automatically generating a computer-based conversation summary note related to a digitally-recorded interlocutor conversation, comprising: one or more tangible, computer-readable computer memory devices which are not a propagating signal per se; and computer instructions encoded by the one or more tangible, computer-readable computer memory devices configured to cause one or more processors to, when executed: access a data corpus having a plurality of digitally-recorded conversations of text-based interlocutory conversations; extract one or more Conversation Features from one or more regions of the digitally-recorded interlocutory conversations; create one or more Summary Features from the one or more extracted Conversation Features, wherein the one or more Conversation Features comprise disjointed phrases; generate and transform at least one Summary Note by combining one or more Narrative Structures with the one or more Summary Features, wherein the transforming employs a generative language model which preserves meaning while resolving one or more narrative elements selected from the group consisting of pronouns, tense, and aspect; and digitally output the at least one Summary Note.
  17. A system for automatically generating a computer-based conversation summary note related to a digitally-recorded interlocutor conversation, comprising: one or more computer processors; one or more tangible, computer-readable computer memory devices which are not a propagating signal per se; and computer instructions encoded by the one or more tangible, computer-readable computer memory devices configured to cause the one or more processors to, when executed: access a data corpus having a plurality of digitally-recorded conversations of text-based interlocutory conversations; extract one or more Conversation Features from one or more regions of the digitally-recorded interlocutory conversations; create one or more Summary Features from the one or more extracted Conversation Features, wherein the one or more Conversation Features comprise disjointed phrases; generate and transform at least one Summary Note by combining one or more Narrative Structures with the one or more Summary Features, wherein the transforming employs a generative language model which preserves meaning while resolving one or more narrative elements selected from the group consisting of pronouns, tense, and aspect; and digitally output the at least one Summary Note.
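Claims 1, 16, and 17 each recite a transforming step in which a generative language model resolves pronouns, tense, and aspect while preserving meaning. The sketch below stands in for that model call with a deterministic replacement table, purely to illustrate the kind of rewriting intended; the function name and the pronoun map are hypothetical, and a real embodiment would delegate this step to a trained generative model.

```python
# Hypothetical, rule-based stand-in for the claimed generative transform.
# A real embodiment would prompt a generative language model; this toy
# table only illustrates pronoun resolution on a first-person note.

def resolve_narrative_elements(note, speaker_names):
    # Replace first- and second-person pronouns with third-person
    # references so the note reads as a report rather than a
    # transcript fragment. Applied left-to-right in table order.
    replacements = {
        "I ": f"{speaker_names['client']} ",
        "my ": f"{speaker_names['client']}'s ",
        "you ": f"{speaker_names['agent']} ",
    }
    for src, dst in replacements.items():
        note = note.replace(src, dst)
    return note


raw = "I reported that my card was declined and you reissued it."
resolved = resolve_narrative_elements(
    raw, {"client": "the customer", "agent": "the agent"}
)
print(resolved)
```

Tense and aspect resolution (e.g. normalizing a live-chat present tense into a past-tense report) is harder to capture with rules, which is one motivation for the claims' use of a generative model here.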

Description

FIELD OF THE INVENTION

This patent application relates to automated systems and methods for summarizing an electronic conversation between two or more parties, and especially to systems and methods which apply artificial intelligence and machine learning to digital conversation data in a turn-based model.

BACKGROUND OF INVENTION

The following patents and patent applications are incorporated by reference in their entireties: (a) U.S. pending patent application Ser. No. 17/124,005, filed on Dec. 16, 2020, by Pedro Vale Lima, et al.; (b) U.S. patent application Ser. No. 16/786,923, filed on Feb. 10, 2020, by Jonathan Eisenzopf, now U.S. Pat. No. 10,896,670; (c) U.S. patent application Ser. No. 16/734,973, which was filed on Jan. 6, 2020, by Jonathan Eisenzopf, now U.S. Pat. No. 11,004,013; (d) U.S. patent application Ser. No. 16/201,188, which was filed on Nov. 27, 2018, by Jonathan Eisenzopf, now U.S. Pat. No. 10,929,611; and (e) U.S. patent application Ser. No. 16/210,081, which was filed on Dec. 5, 2018, by Jonathan Eisenzopf.

Online conversational text-based communication and interaction systems are growing in popularity, as clients of business entities expect to be able to “chat” with business representatives via websites and smartphone application programs at any time of day, any day of the week, any time of year. Consulting firm Deloitte estimated in 2017 that 76% of customer interactions occur through conversations, but that 50% of those conversations fail to meet customer expectations; according to the eleventh annual Accenture Global Consumer Pulse Survey in 2016, the poor customer experience from these failed conversations results in an estimated $1.6 trillion in global revenue lost annually. It is expected by some industry analysts that Artificial Intelligence (AI) can be leveraged to automate a large portion of these conversations, especially through chatbot platforms.
The McKinsey Global Institute predicted in 2018 that AI-based conversation platforms that utilize manually supervised deep-learning technology, with training from at least 10 million labeled conversation examples, would match or exceed the success rate of human-to-human conversations.

SUMMARY OF THE EXEMPLARY EMBODIMENTS OF THE INVENTION

Disclosed herein are one or more example embodiments of a computer-implemented method and one or more computer system improvements for automatically generating a summary note related to a digitally-recorded interlocutor conversation, such as a chat transcript, including accessing a data corpus having at least one digitally-recorded conversation of text-based interlocutory conversation(s) including at least one labeled value, extracting at least one Conversation Feature from the labeled value(s), creating at least one Summary Feature from the extracted Conversation Feature(s), generating at least one Summary Note by combining one or more Narrative Structures with the Summary Feature(s), and digitally outputting the at least one Summary Note.

BRIEF DESCRIPTION OF THE DRAWINGS

The figures presented herein, when considered in light of this description, form a complete disclosure of one or more embodiments of the present invention, wherein like reference numbers in the figures represent similar or same elements or steps.

FIG. 1 depicts an improved data processing system and its related components according to at least one embodiment of the invention disclosed in the related and incorporated U.S. patent application Ser. No. 16/201,188.

FIG. 2 depicts one or more methods according to the invention disclosed in the related and incorporated U.S. patent application Ser. No. 16/201,188 performed by the improved data processing system to classify a plurality of conversation transcriptions between two or more interlocutors.

FIG. 3 illustrates an exemplary conversation classification method including splitting a plurality of transcribed conversations between multiple interlocutors into a plurality of conversation segments.

FIG. 4 shows an exemplary embodiment of a method for dominant weighting for a dominant path modeler.

FIG. 5 illustrates an exemplary topic classification method used by a topic classifier to identify the correct topic of conversation.

FIG. 6 depicts an exemplary weighted conversation model.

FIG. 7 sets forth an exemplary conversation ontology used for rule-based decision making to split transcribed conversations into segments for classification by the improved data processing system as disclosed in the related and incorporated U.S. patent application Ser. No. 16/201,188.

FIG. 8 illustrates an exemplary arrangement of computers, devices, and networks according to at least one embodiment of the invention disclosed in the related and incorporated U.S. patent application Ser. No. 16/201,188.

FIG. 9 illustrates an exemplary arrangement, according to the invention disclosed in the related and incorporated U.S. patent application Ser. No. 16/210,081, o