
CN-121996119-A - Page interaction processing method and device


Abstract

Embodiments of this specification provide a page interaction processing method and device. The page interaction processing method comprises: determining the interaction dimension of the intelligent interaction currently being performed, based on a trigger instruction of an intelligent interaction control submitted after a user performs a page switching operation in an application program; determining multi-modal interaction parameters for an intelligent interaction response to the user according to interaction coordination data corresponding to the interaction dimension; and performing the intelligent interaction response to the user according to the multi-modal interaction parameters, so that the user receives an intelligent interaction response across multiple modalities within the application program.

Inventors

  • CHEN YUETIAN

Assignees

  • Alipay (Hangzhou) Digital Service Technology Co., Ltd. (支付宝(杭州)数字服务技术有限公司)

Dates

Publication Date
2026-05-08
Application Date
2026-01-16

Claims (18)

  1. A page interaction processing method, comprising: acquiring a trigger instruction of an intelligent interaction control submitted after a user performs a page switching operation in an application program; determining the interaction dimension of the current intelligent interaction based on the trigger instruction, and acquiring interaction coordination data corresponding to the interaction dimension; determining multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and generating execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and performing the intelligent interaction response to the user based on the multi-modal execution data.
  2. The page interaction processing method according to claim 1, wherein acquiring the interaction coordination data corresponding to the interaction dimension comprises: acquiring an interaction state corresponding to a user category dimension, the intelligent interaction response being performed through a digital avatar; and wherein determining the multi-modal interaction parameters for the intelligent interaction response to the user according to the interaction coordination data comprises: if the interaction state is an interactable state, determining digital avatar data for performing the intelligent interaction response to the user; and determining a haptic interaction parameter for the intelligent interaction response according to environment data and device configuration data, and taking the digital avatar data and the haptic interaction parameter as the multi-modal interaction parameters.
  3. The page interaction processing method according to claim 2, wherein determining the haptic interaction parameter for performing the intelligent interaction response according to the environment data and the device configuration data comprises: performing adaptation detection according to the environment data and the device configuration data to obtain a data adaptation type; if the data adaptation type is an adaptation type, calculating a haptic interaction influence value according to a reference haptic weight and the interaction parameter influence values respectively corresponding to the environment data and the device configuration data; and constructing interaction parameters based on the haptic interaction influence value to obtain the haptic interaction parameter.
  4. The page interaction processing method according to claim 3, further comprising, after performing the adaptation detection according to the environment data and the device configuration data to obtain the data adaptation type: if the data adaptation type is a non-adaptation type, setting the vibration intensity for the vibration interaction response to a preset vibration intensity, and increasing the waveform pulses of the vibration waveform used for the vibration interaction response to obtain an increased vibration waveform; and extending the rising edge of the increased vibration waveform to obtain an extended vibration waveform, and taking the vibration duration of the extended vibration waveform and the preset vibration intensity as the haptic interaction parameter.
  5. The page interaction processing method according to claim 1, wherein acquiring the interaction coordination data corresponding to the interaction dimension comprises: acquiring a user data tag corresponding to a user data dimension; and wherein determining the multi-modal interaction parameters for the intelligent interaction response to the user according to the interaction coordination data comprises: inputting the user data tag into an interaction response model for interaction parameter generation to obtain the multi-modal interaction parameters.
  6. The page interaction processing method according to claim 5, wherein the interaction parameter generation is implemented as follows: performing user modality identification according to the user data tag to obtain the user's preference values for a plurality of modalities, and constructing a preference vector based on the preference values and a parameter sensitivity threshold to obtain the user's modal preference vector; and determining initial multi-modal interaction parameters based on the modal preference vector, and calculating the multi-modal interaction parameters according to environment data and the initial multi-modal interaction parameters.
  7. The page interaction processing method according to claim 1, wherein determining the interaction dimension of the current intelligent interaction based on the trigger instruction comprises: detecting whether an external device is connected to the user device from which the user submitted the trigger instruction; and if so, determining the interaction dimension to be a cross-terminal interaction dimension; and wherein acquiring the interaction coordination data corresponding to the interaction dimension comprises: acquiring device acquisition data of the external device corresponding to the cross-terminal interaction dimension based on the device category of the external device.
  8. The page interaction processing method according to claim 7, wherein determining, according to the interaction coordination data, the multi-modal interaction parameters for the intelligent interaction response to the user comprises: if the device category is a first device category, performing motion category identification on the user according to device movement data of the external device to obtain a motion category, and constructing vibration waveform parameters based on the motion data under the motion category; and determining the visual element flow speed of the visual interaction modality according to environment data, and taking the visual element flow speed and the vibration waveform parameters as the multi-modal interaction parameters.
  9. The page interaction processing method according to claim 7, wherein determining, according to the interaction coordination data, the multi-modal interaction parameters for the intelligent interaction response to the user comprises: if the device category is a second device category, calculating the wearing fit degree of the external device based on device pressure data of the external device, and determining the user's haptic parameter threshold according to the wearing fit degree and physiological data; and constructing haptic parameters according to the haptic parameter threshold to obtain the haptic interaction parameters.
  10. The page interaction processing method according to claim 7, wherein determining, according to the interaction coordination data, the multi-modal interaction parameters for the intelligent interaction response to the user comprises: if the device category is a third device category, determining gaze focus data of the user based on user image data; and determining, based on the gaze focus data, visual interaction parameters for a visual interaction response to the user.
  11. The page interaction processing method according to claim 1, wherein determining, according to the interaction coordination data, the multi-modal interaction parameters for the intelligent interaction response to the user comprises: determining auditory interaction content for an auditory interaction response to the user based on the user's routing record; and calculating voice parameters of the auditory interaction content according to device configuration data of the user, and taking the auditory interaction content and the voice parameters as the multi-modal interaction parameters.
  12. The page interaction processing method according to claim 11, further comprising: extracting text keywords from the user text of the user's voice data; and switching from the application program to the employment service corresponding to the text keywords to perform employment service processing.
  13. The page interaction processing method according to claim 12, wherein the employment service processing is implemented as follows: performing employment status identification according to the user's service operation record in the employment service to obtain the user's employment status in the employment service; and pushing employment information to the user in the employment service according to the employment status.
  14. The page interaction processing method according to claim 1, wherein the intelligent interaction control is displayed on an intelligent interaction page in the application program; the intelligent interaction page further displays dynamic interaction controls, and the dynamic interaction controls are obtained based on mini-programs in the application program and/or functional modules in the application program.
  15. The page interaction processing method according to claim 14, wherein the dynamic interaction control, after being triggered, jumps from the application program to the mini-program and/or from the intelligent interaction page to an access page of the functional module.
  16. A page interaction processing apparatus, comprising: an instruction acquisition module configured to acquire a trigger instruction of an intelligent interaction control submitted after a user performs a page switching operation in an application program; a data acquisition module configured to determine the interaction dimension of the current intelligent interaction based on the trigger instruction and acquire interaction coordination data corresponding to the interaction dimension; a parameter determination module configured to determine multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and an interaction response module configured to generate execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and perform the intelligent interaction response to the user based on the multi-modal execution data.
  17. A page interaction processing device, comprising: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to: acquire a trigger instruction of an intelligent interaction control submitted after a user performs a page switching operation in an application program; determine the interaction dimension of the current intelligent interaction based on the trigger instruction, and acquire interaction coordination data corresponding to the interaction dimension; determine multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and generate execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and perform the intelligent interaction response to the user based on the multi-modal execution data.
  18. A computer-readable storage medium storing computer-executable instructions which, when executed, implement the steps of the method of claim 1.
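Claims 3 and 4 above describe a two-branch construction of haptic parameters: a weighted influence value when the environment and device data adapt, and a fallback of preset intensity, extra waveform pulses, and an extended rising edge when they do not. The sketch below illustrates that branching logic only; the field names, weights, and formulas are invented placeholders, not values from the patent.

```python
# Illustrative sketch of the claim-3/claim-4 haptic-parameter branches.
# All thresholds, keys, and arithmetic are assumptions for demonstration.

def build_haptic_params(env_data, device_config, reference_weight=0.5,
                        preset_intensity=0.8):
    """Return haptic interaction parameters per the adaptation branch."""
    # Claim 3: adaptation detection on environment and device data
    # (here: a quiet environment and a device with a vibration motor).
    adapted = (env_data.get("noise_level", 0.0) < 0.7
               and device_config.get("has_motor", False))

    if adapted:
        # Claim 3: combine the reference haptic weight with the influence
        # values contributed by the environment and device data.
        influence = (reference_weight
                     + env_data.get("influence", 0.0)
                     + device_config.get("influence", 0.0))
        return {"type": "adapted", "intensity": influence}

    # Claim 4: non-adaptation fallback -- preset intensity, denser pulses,
    # and an extended rising edge on the vibration waveform.
    pulses = device_config.get("base_pulses", 2) + 1    # waveform pulse increase
    rise_ms = device_config.get("rise_ms", 10) * 2      # rising-edge extension
    duration_ms = pulses * (rise_ms + 20)               # resulting vibration duration
    return {"type": "fallback", "intensity": preset_intensity,
            "duration_ms": duration_ms}
```

The branch structure, rather than the specific numbers, is the point: the adapted path computes a data-driven intensity, while the fallback path fixes the intensity and reshapes the waveform instead.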

Description

Page interaction processing method and device

Technical Field

The present document relates to the field of data processing technologies, and in particular to a method and an apparatus for processing page interaction.

Background

With the continued spread of internet technology and artificial intelligence, online services provided over the internet are developing ever faster. Users can conveniently handle their business through page interaction on the service pages of online services, satisfying a variety of service needs, and as usage frequency grows, online services have gradually become high-frequency tools for users. However, the growing number of providers of the same type of online service, together with users' ever-increasing service requirements, brings considerable pressure and challenges to the providers of online services.

Disclosure of Invention

One or more embodiments of the present disclosure provide a page interaction processing method, comprising: acquiring a trigger instruction of an intelligent interaction control submitted by a user after performing a page switching operation in an application program; determining the interaction dimension of the current intelligent interaction based on the trigger instruction, and acquiring interaction coordination data corresponding to the interaction dimension; determining multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and generating execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and performing the intelligent interaction response to the user based on the multi-modal execution data.
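The four steps of the disclosed method can be sketched as a single handler. This is a minimal illustration of the control flow only; the dimension names, the coordination-data lookup, and the parameter values are assumptions, not definitions from the patent.

```python
# Hypothetical sketch of the four-step flow: trigger instruction ->
# interaction dimension -> coordination data -> multi-modal parameters ->
# per-modality execution data. Every concrete value here is a placeholder.

def handle_page_interaction(trigger):
    # Steps 1-2: the trigger arrives after a page-switch operation;
    # determine the interaction dimension and fetch the coordination
    # data registered for it (a stand-in lookup table here).
    dimension = trigger.get("dimension", "user_data")
    coordination = {"user_data": {"tag": "new_user"},
                    "cross_terminal": {"device": "watch"}}.get(dimension, {})

    # Step 3: derive multi-modal interaction parameters from that data.
    params = {"visual": {"flow_speed": 1.0},
              "haptic": {"intensity": 0.6 if coordination else 0.3}}

    # Step 4: generate per-modality execution data and respond with it.
    return {mode: {"params": p, "ready": True} for mode, p in params.items()}
```

One design point the flow makes explicit: parameter determination (step 3) is separated from execution-data generation (step 4), so each modality can be rendered independently once its parameters are fixed.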
One or more embodiments of the present disclosure provide a page interaction processing apparatus, comprising: an instruction acquisition module configured to acquire a trigger instruction of an intelligent interaction control submitted by a user after performing a page switching operation in an application program; a data acquisition module configured to determine the interaction dimension of the current intelligent interaction based on the trigger instruction and acquire interaction coordination data corresponding to the interaction dimension; a parameter determination module configured to determine multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and an interaction response module configured to generate execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and perform the intelligent interaction response to the user based on the multi-modal execution data.

One or more embodiments of the present specification provide a page interaction processing device comprising a processor and a memory configured to store computer-executable instructions that, when executed, cause the processor to: acquire a trigger instruction of an intelligent interaction control submitted by a user after performing a page switching operation in an application program; determine the interaction dimension of the current intelligent interaction based on the trigger instruction, and acquire interaction coordination data corresponding to the interaction dimension; determine multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and generate execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and perform the intelligent interaction response to the user based on the multi-modal execution data.
One or more embodiments of the present specification provide a computer-readable storage medium storing computer-executable instructions that, when executed, perform the steps of: acquiring a trigger instruction of an intelligent interaction control submitted by a user after performing a page switching operation in an application program; determining the interaction dimension of the current intelligent interaction based on the trigger instruction, and acquiring interaction coordination data corresponding to the interaction dimension; determining multi-modal interaction parameters for an intelligent interaction response to the user according to the interaction coordination data; and generating execution data according to the multi-modal interaction parameters to obtain multi-modal execution data, and performing the intelligent interaction response to the user based on the multi-modal execution data.

Drawings

For a clearer description of one or more embodiments of the present description or of the solutions of the prior art, the drawings needed in the description of the embodiments or of the prior art will be briefly described below.
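Claim 6 above describes building a modal preference vector from user data tags and a parameter sensitivity threshold. The sketch below shows one plausible reading of that step; the tag-to-preference mapping, the preference values, and the thresholding behavior are all invented placeholders, not taken from the patent.

```python
# Hedged sketch of the claim-6 modal-preference-vector construction.
# Tags, boost values, and the threshold semantics are assumptions.

def build_preference_vector(user_tags, sensitivity_threshold=0.3):
    """Map user data tags to per-modality preference values, then zero
    out modalities whose preference falls below the sensitivity threshold."""
    base = {"visual": 0.0, "auditory": 0.0, "haptic": 0.0}
    boosts = {"likes_video": ("visual", 0.6),
              "uses_voice": ("auditory", 0.7),
              "wearable_user": ("haptic", 0.5)}
    for tag in user_tags:
        if tag in boosts:
            mode, value = boosts[tag]
            base[mode] = max(base[mode], value)
    # The threshold acts as a floor: sub-threshold preferences are dropped
    # so that weakly preferred modalities do not drive parameter selection.
    return {m: (v if v >= sensitivity_threshold else 0.0)
            for m, v in base.items()}
```

Under claim 6, a vector like this would seed the initial multi-modal interaction parameters, which are then adjusted against environment data.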