
CN-122024772-A - Man-machine interaction method and device, electronic equipment, medium and product

CN122024772A

Abstract

The application discloses a human-computer interaction method, a human-computer interaction device, an electronic device, a medium, and a program product. The method comprises: receiving interaction information input by a target user; performing emotion recognition processing on the interaction information to obtain structured emotion information; determining a target large language model from a pre-configured large language model instance pool according to the emotion information and the emotion context information of the target user; generating an emotion response result through the target large language model, the interaction information, and the emotion context information; and outputting the emotion response result. The method can solve the problem that the interaction strategy cannot be dynamically adjusted according to the user's real-time emotional state, which makes it difficult to balance the professionalism, safety, and generative freedom of responses.

Inventors

  • ZHU SHANSHAN
  • LI XIAODAN
  • XIANG HONGBO

Assignees

  • 太乙智能潮玩科技(东莞)有限公司

Dates

Publication Date
20260512
Application Date
20260319
Priority Date
20251224

Claims (11)

  1. A human-computer interaction method, comprising: receiving interaction information input by a target user; performing emotion recognition processing on the interaction information to obtain structured emotion information; determining a target large language model from a pre-configured large language model instance pool according to the emotion information and emotion context information of the target user; generating an emotion response result through the target large language model, the interaction information, and the emotion context information; and outputting the emotion response result.
  2. The human-computer interaction method according to claim 1, wherein performing emotion recognition processing on the interaction information to obtain structured emotion information includes: preprocessing the interaction information to obtain preprocessed information; and performing emotion analysis on the preprocessed information through a deep learning model trained on psychological features to obtain the structured emotion information, wherein the emotion information comprises at least a dominant emotion category, a confidence level, and an emotion intensity value.
  3. The human-computer interaction method according to claim 1, wherein determining a target large language model from a pre-configured large language model instance pool according to the emotion information and the emotion context information of the target user comprises: acquiring historical emotion trends, monitoring states, and user configuration information of the target user; acquiring the emotion context information according to the historical emotion trends, the monitoring states, and the user configuration information; making a decision according to the emotion information, the emotion context information, and preset routing rules to obtain a routing decision; and determining the target large language model from the pre-configured large language model instance pool according to the routing decision.
  4. The human-computer interaction method according to claim 3, wherein the preset routing rules comprise at least an intensity threshold rule, a compound emotion rule, and a history-dependent rule.
  5. The human-computer interaction method according to claim 1, wherein generating an emotion response result through the target large language model, the interaction information, and the emotion context information comprises: calling a model interface of the target large language model from the large language model instance pool; generating an interaction request according to the interaction information and the emotion context information, and sending the interaction request to the target large language model through the model interface; and receiving an emotion response result generated by the target large language model according to the interaction request.
  6. The human-computer interaction method according to claim 1, wherein after outputting the emotion response result, the method further comprises: obtaining the model type of the target large language model and obtaining key summary information of the emotion response result; and storing the emotion information, the model type, and the key summary information in a shared memory module.
  7. The human-computer interaction method according to claim 1, wherein the large language model instance pool comprises at least a professional psychological support model, a boredom-relief model, a crisis intervention model, and a creative generation model.
  8. A human-computer interaction device, characterized in that the device comprises: a receiving unit, configured to receive interaction information input by a target user; a recognition unit, configured to perform emotion recognition processing on the interaction information to obtain structured emotion information; a determining unit, configured to determine a target large language model from a pre-configured large language model instance pool according to the emotion information and emotion context information of the target user; a generating unit, configured to generate an emotion response result through the target large language model, the interaction information, and the emotion context information; and an output unit, configured to output the emotion response result.
  9. An electronic device, comprising a memory for storing a computer program and a processor that runs the computer program to cause the electronic device to perform the human-computer interaction method of any one of claims 1 to 7.
  10. A readable storage medium, characterized in that a computer program is stored in the readable storage medium, and the computer program, when executed by a processor, performs the human-computer interaction method of any one of claims 1 to 7.
  11. A computer program product, characterized in that the computer program product comprises a computer program which, when executed by a processor, performs the human-computer interaction method of any one of claims 1 to 7.
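The five steps of claim 1 (receive, recognize, route, generate, output) can be sketched as a toy pipeline. The keyword-based recognizer, the pool keys, and the routing threshold below are illustrative assumptions, not the patent's implementation:

```python
# Toy sketch of the claimed pipeline; recognizer, pool keys, and the
# routing threshold are illustrative assumptions, not the patent's method.

def recognize_emotion(text: str):
    """Stand-in emotion recognizer: returns (dominant category, intensity)."""
    return ("sadness", 0.8) if "sad" in text.lower() else ("neutral", 0.2)

# Pre-configured "instance pool": model name -> callable(interaction, context).
MODEL_POOL = {
    "psych_support": lambda text, ctx: f"[support/{ctx}] I hear you: {text}",
    "casual_chat":   lambda text, ctx: f"[chat/{ctx}] Tell me more: {text}",
}

def route(category: str, intensity: float) -> str:
    """Intensity-threshold routing: strong negative emotion -> support model."""
    if category == "sadness" and intensity >= 0.6:
        return "psych_support"
    return "casual_chat"

def interact(text: str, emotion_context: str = "stable") -> str:
    category, intensity = recognize_emotion(text)      # step 2: recognition
    target = route(category, intensity)                # step 3: model selection
    return MODEL_POOL[target](text, emotion_context)   # steps 4-5: generate, output

reply = interact("I am so sad today")
```

A real system would replace the recognizer with the trained model of claim 2 and the pool callables with model-interface calls as in claim 5; the control flow, however, follows the claimed order.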

Description

Man-machine Interaction Method and Device, Electronic Equipment, Medium and Product

Technical Field

The application relates to the technical field of artificial intelligence, and in particular to a human-computer interaction method, a human-computer interaction device, an electronic device, a readable storage medium, and a computer program product.

Background

With the wide application of large language models in intelligent dialogue systems, users' requirements for the interactive experience are increasingly diverse. In emotional-interaction scenarios such as smart trendy toys and psychological companionship, there is an urgent need to simultaneously guarantee the professionalism, generative freedom, and safety of responses. The prior art includes fixed-flow invocation schemes that invoke a corresponding specialized model based on simple keywords or intent classification. However, such methods lack dynamic perception of the user's emotional state and cannot recognize subtle emotional changes; for example, when a user mentions negative keywords in a light-hearted tone, keyword matching may still trigger a crisis-intervention model, resulting in over-reaction and a disrupted interactive experience. A static routing mechanism without emotion perception can hardly achieve natural, smooth, personalized interaction while remaining safe, and cannot meet the higher demands on emotional interaction in smart trendy-toy scenarios.

Disclosure of Invention

In view of the above, the present application provides a human-computer interaction method, apparatus, electronic device, readable storage medium, and computer program product, which can solve the problem that the interaction strategy cannot be dynamically adjusted according to the user's real-time emotional state, making it difficult to balance the professionalism, safety, and generative freedom of responses.
In a first aspect, the present application provides a human-computer interaction method, including: receiving interaction information input by a target user; performing emotion recognition processing on the interaction information to obtain structured emotion information; determining a target large language model from a pre-configured large language model instance pool according to the emotion information and the emotion context information of the target user; generating an emotion response result through the target large language model, the interaction information, and the emotion context information; and outputting the emotion response result.

According to this technical scheme, the method performs emotion recognition on the acquired interaction information to accurately capture the user's emotion; determines a target large language model from a large language model instance pool by combining the emotion information and emotion context information of the target user, achieving dynamic adaptation of the model; generates an emotion response result through the target large language model, the interaction information, and the emotion context information, ensuring the professionalism, safety, and freedom of the response; and finally outputs the emotion response result to complete the emotional interaction.

In some embodiments, performing emotion recognition processing on the interaction information to obtain structured emotion information includes: preprocessing the interaction information to obtain preprocessed information; and performing emotion analysis on the preprocessed information through a deep learning model trained on psychological features to obtain the structured emotion information, wherein the emotion information includes at least a dominant emotion category, a confidence level, and an emotion intensity value.
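The structured emotion information described above can be sketched as a small record type. The field names and the keyword-count heuristic below are assumptions standing in for the psychologically trained deep learning model:

```python
# Illustrative sketch of "structured emotion information"; field names and
# the keyword heuristic are assumptions, not the patent's trained model.
from dataclasses import dataclass

@dataclass
class EmotionInfo:
    dominant_category: str  # e.g. "sadness", "joy", "neutral"
    confidence: float       # classifier confidence in [0, 1]
    intensity: float        # emotion intensity value in [0, 1]

def analyze(text: str) -> EmotionInfo:
    """Keyword-count stand-in for the trained emotion-analysis model."""
    cleaned = text.lower().strip()  # preprocessing step
    hits = sum(word in cleaned for word in ("sad", "lonely", "hopeless"))
    if hits:
        return EmotionInfo("sadness", confidence=0.9,
                           intensity=min(1.0, 0.4 + 0.3 * hits))
    return EmotionInfo("neutral", confidence=0.6, intensity=0.1)

info = analyze("I feel sad and lonely today")
```

The structured record, rather than a raw label, is what makes the downstream threshold-based routing decisions possible.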
According to this technical scheme, preprocessing the interaction information provides a high-quality data basis for subsequent emotion analysis, and a deep learning model trained on psychological features accurately extracts structured emotion information comprising the dominant emotion category, confidence level, and emotion intensity value, thereby achieving objective quantification and accurate recognition of the user's emotional state.

In some embodiments, determining a target large language model from a pre-configured large language model instance pool according to the emotion information and the emotion context information of the target user comprises: acquiring historical emotion trends, monitoring states, and user configuration information of the target user; acquiring the emotion context information according to the historical emotion trends, the monitoring states, and the user configuration information; making a decision according to the emotion information, the emotion context information, and preset routing rules to obtain a routing decision; and determining the target large language model from the pre-configured large language model instance pool according to the routing decision.
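The three kinds of preset routing rules (intensity threshold, compound emotion, history-dependent) could be sketched as one decision function. The thresholds, model names, and valence encoding below are assumptions for illustration:

```python
# Sketch of the three preset routing rules; thresholds, model names, and
# the -1/+1 valence encoding of history are illustrative assumptions.
NEGATIVE = {"sadness", "fear", "anger"}

def decide_route(emotions, history):
    """emotions: list of (category, intensity); history: recent valence scores."""
    # Intensity-threshold rule: a very strong negative emotion
    # routes to the crisis-intervention model.
    for category, intensity in emotions:
        if category in NEGATIVE and intensity >= 0.9:
            return "crisis_intervention"
    # Compound-emotion rule: several simultaneous emotions route to
    # the professional psychological-support model.
    if len(emotions) > 1:
        return "psych_support"
    # History-dependent rule: a sustained negative trend routes to
    # support even when the current message itself is mild.
    if len(history) >= 3 and all(v < 0 for v in history[-3:]):
        return "psych_support"
    return "casual_chat"
```

Ordering the rules from most to least urgent means the crisis check can never be shadowed by the milder rules, which mirrors the safety emphasis of the disclosure.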