US-20260128961-A1 - COMMUNICATION METHOD AND COMMUNICATION APPARATUS

US 20260128961 A1

Abstract

Embodiments of the present application provide a communication method and a communication apparatus. The method includes: sending first information related to mutual information of a first artificial intelligence (AI) model and a second AI model, the first AI model and the second AI model constituting a two-sided model; and receiving a first message indicating an AI model related to the first AI model. The AI models at the user equipment (UE) side and the base station (BS) side constitute a two-sided model, and the UE or the BS can send information related to the mutual information of the AI models to realize interoperability.

Inventors

  • Hao Tang
  • Yiqun Ge
  • Jianglei Ma

Assignees

  • HUAWEI TECHNOLOGIES CO., LTD.

Dates

Publication Date
2026-05-07
Application Date
2025-12-03

Claims (20)

  1. A method, comprising: sending first information related to mutual information of a first artificial intelligence (AI) model and a second AI model, the first AI model and the second AI model constituting a two-sided model; and receiving a first message indicating an AI model related to the first AI model.
  2. The method according to claim 1, wherein the first information comprises at least one of first mutual information, second mutual information, a first ratio, a first neuron node size, or a first ratio range.
  3. The method according to claim 1, wherein the first information comprises an index corresponding to first mutual information, an index corresponding to second mutual information, an index corresponding to a first ratio, an index corresponding to a first neuron node size, or an index corresponding to a first ratio range.
  4. The method according to claim 3, wherein at least one piece of third mutual information, at least one piece of fourth mutual information, at least one second ratio, at least one second neuron node size, or at least one second ratio range is predetermined or configured by a network device, wherein each of the third mutual information, the fourth mutual information, the at least one second ratio, the at least one second neuron node size, or the at least one second ratio range corresponds to a respective index, and wherein the first mutual information is one of the at least one piece of the third mutual information, the second mutual information is one of the at least one piece of the fourth mutual information, the first ratio is one of the at least one second ratio, the first neuron node size is one of the at least one second neuron node size, or the first ratio range is one of the at least one second ratio range.
  5. The method according to claim 2, wherein the first mutual information is an amount of information about an input included in an output of the first AI model, the second mutual information is an amount of information about an output included in an input of the second AI model, the first ratio is a ratio of the second mutual information to the first mutual information, the first neuron node size indicates an output format of the first AI model or an input format of the second AI model, or the first ratio range is a range of multiple first ratios.
  6. The method according to claim 1, further comprising: receiving a second message indicating a calculation method for calculating the first information.
  7. A communication apparatus, comprising: at least one processor coupled with a memory storing instructions that, when executed by the at least one processor, cause the communication apparatus to: send first information related to mutual information of a first artificial intelligence (AI) model and a second AI model, the first AI model and the second AI model constituting a two-sided model; and receive a first message indicating an AI model related to the first AI model.
  8. The communication apparatus according to claim 7, wherein the first information comprises at least one of first mutual information, second mutual information, a first ratio, a first neuron node size, or a first ratio range.
  9. The communication apparatus according to claim 7, wherein the first information comprises an index corresponding to first mutual information, an index corresponding to second mutual information, an index corresponding to a first ratio, an index corresponding to a first neuron node size, or an index corresponding to a first ratio range.
  10. The communication apparatus according to claim 9, wherein at least one piece of third mutual information, at least one piece of fourth mutual information, at least one second ratio, at least one second neuron node size, or at least one second ratio range is predetermined or configured by a network device, wherein each of the third mutual information, the fourth mutual information, the at least one second ratio, the at least one second neuron node size, or the at least one second ratio range corresponds to a respective index, and wherein the first mutual information is one of the at least one piece of the third mutual information, the second mutual information is one of the at least one piece of the fourth mutual information, the first ratio is one of the at least one second ratio, the first neuron node size is one of the at least one second neuron node size, or the first ratio range is one of the at least one second ratio range.
  11. The communication apparatus according to claim 8, wherein the first mutual information is an amount of information about an input included in an output of the first AI model, the second mutual information is an amount of information about an output included in an input of the second AI model, the first ratio is a ratio of the second mutual information to the first mutual information, the first neuron node size indicates an output format of the first AI model or an input format of the second AI model, or the first ratio range is a range of multiple first ratios.
  12. The communication apparatus according to claim 7, wherein the instructions further cause the communication apparatus to receive a second message indicating a calculation method for calculating the first information.
  13. The communication apparatus according to claim 12, wherein the calculation method for calculating the first information is a Hilbert-Schmidt independence criterion (HSIC) or a predefined mutual information approximation method.
  14. A communication apparatus, comprising: at least one processor coupled with a memory storing instructions that, when executed by the at least one processor, cause the communication apparatus to: receive first information related to mutual information of a first artificial intelligence (AI) model and a second AI model, the first AI model and the second AI model constituting a two-sided model; and send a first message indicating an AI model related to the first AI model.
  15. The communication apparatus according to claim 14, wherein the first information comprises at least one of first mutual information, second mutual information, a first ratio, a first neuron node size, or a first ratio range.
  16. The communication apparatus according to claim 14, wherein the first information comprises an index corresponding to first mutual information, an index corresponding to second mutual information, an index corresponding to a first ratio, an index corresponding to a first neuron node size, or an index corresponding to a first ratio range.
  17. The communication apparatus according to claim 16, wherein at least one piece of third mutual information, at least one piece of fourth mutual information, at least one second ratio, at least one second neuron node size, or at least one second ratio range is predetermined or configured by a network device, wherein each of the third mutual information, the fourth mutual information, the at least one second ratio, the at least one second neuron node size, or the at least one second ratio range corresponds to a respective index, and wherein the first mutual information is one of the at least one piece of the third mutual information, the second mutual information is one of the at least one piece of the fourth mutual information, the first ratio is one of the at least one second ratio, the first neuron node size is one of the at least one second neuron node size, or the first ratio range is one of the at least one second ratio range.
  18. The communication apparatus according to claim 15, wherein the first mutual information is an amount of information about an input included in an output of the first AI model, the second mutual information is an amount of information about an output included in an input of the second AI model, the first ratio is a ratio of the second mutual information to the first mutual information, the first neuron node size indicates an output format of the first AI model or an input format of the second AI model, or the first ratio range is a range of multiple first ratios.
  19. The communication apparatus according to claim 14, wherein the instructions further cause the communication apparatus to send a second message indicating a calculation method for calculating the first information.
  20. The communication apparatus according to claim 19, wherein the calculation method for calculating the first information is a Hilbert-Schmidt independence criterion (HSIC) or a predefined mutual information approximation method.
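Claims 13 and 20 name the Hilbert-Schmidt independence criterion (HSIC) as one calculation method for the first information. The application does not give an implementation; as a rough, illustrative sketch only (not the applicant's method), a biased empirical HSIC estimate with Gaussian kernels could look like the following, where the function names and the fixed kernel bandwidth are our assumptions:

```python
import numpy as np

def gaussian_kernel(Z, sigma=1.0):
    # Pairwise squared Euclidean distances, then an RBF (Gaussian) kernel matrix.
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC between paired samples X and Y (each n x d).

    HSIC is near zero for independent X and Y and grows with statistical
    dependence, which is why it can serve as a mutual-information proxy
    between a model's input and its output activations.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

Evaluating `hsic` on batches of an encoder's inputs and outputs yields a dependence score that could play the role of the "first mutual information"; this mapping is our reading of the claims, not a statement from the application.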

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of International Application No. PCT/CN2023/124990, filed on Oct. 17, 2023, which claims priority to U.S. Provisional Patent Application No. 63/507,786, filed on Jun. 13, 2023. The disclosures of the aforementioned applications are hereby incorporated by reference in their entirety.

TECHNICAL FIELD

Embodiments of the present application relate to the field of communication, and more specifically, to a communication method and a communication apparatus.

BACKGROUND

Artificial intelligence (AI)-based algorithms have been introduced into modern wireless communications to solve wireless problems such as channel estimation, scheduling, channel state information (CSI) compression (from a user equipment to a base station), multiple-input multiple-output (MIMO) beamforming, positioning, and so on. As data-driven methods, AI-based algorithms inevitably suffer from low generalization: the performance of an AI model is only as good as the data it is trained on. Even if an AI model is trained on a large number of data sets, it may still lack the knowledge needed to perform effectively in other environments, especially in wireless communication, where channel conditions change rapidly.

For example, in an auto-encoder model, an encoder is deployed on a user equipment (UE) side and a decoder is deployed on a base station (BS) side. The BS and the UE train their models independently and need to align the encoder and the decoder. In addition, the generalization problem at the UE or the BS needs to be considered during inference. For example, the UE's encoder model may generalize worse than the BS's decoder model, yet it is hard to know during inference whether the current encoder model is outdated. Therefore, how to realize interoperability between the BS's model and the UE's model is an urgent technical problem to be solved.
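To make the background concrete, the two-sided split can be sketched numerically: a UE-side "encoder" compresses a CSI vector X into a low-dimensional message T sent over the air interface, and a BS-side "decoder" reconstructs an estimate Y. The dimensions, names, and linear architecture below are illustrative assumptions, not the application's design; real deployments would use trained neural models.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_latent = 32, 8   # illustrative CSI size vs. air-interface payload size

# Orthonormal encoder columns so that decoder = encoder^T exactly inverts the
# compression on the retained subspace (i.e., the two sides are "aligned").
W_enc, _ = np.linalg.qr(rng.normal(size=(d_in, d_latent)))
W_dec = W_enc.T

def ue_encode(x):
    """UE side (first AI model): compress CSI vector x into message T."""
    return x @ W_enc

def bs_decode(t):
    """BS side (second AI model): reconstruct CSI estimate Y from message T."""
    return t @ W_dec
```

If the BS and the UE train their halves independently, nothing guarantees `W_dec` matches `W_enc`; the mutual-information-related signaling described in this application is aimed at detecting and resolving exactly that kind of mismatch.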
SUMMARY

Embodiments of the present application provide a communication method and a communication apparatus. In the technical solutions of the present application, AI models at the UE side and the BS side constitute a two-sided model, and the UE or the BS can send information related to the mutual information of the AI models to realize interoperability.

According to a first aspect, an embodiment of the present application provides a communication method including: sending first information related to mutual information of a first AI model and a second AI model, the first AI model and the second AI model constituting a two-sided model; and receiving a first message indicating an AI model related to the first AI model. In the communication method provided by the present application, the AI models at the UE and BS sides constitute a two-sided model, and the UE or the BS can send information related to the mutual information of the AI models to realize interoperability.

A first AI model is an encoder and a second AI model is a decoder. Alternatively, a first AI model is a decoder and a second AI model is an encoder. The first AI model and the second AI model constitute a two-sided model. In one possible scenario, the first AI model is at the UE side and the second AI model is at the BS side. In another possible scenario, a first AI model 1 and a second AI model 1 are at the UE side, and a first AI model 2 and a second AI model 2 are at the BS side. For example, the UE trains its own encoder 1 and decoder 1, the BS trains its own encoder 2 and decoder 2, and finally the UE's encoder 1 is aligned with the BS's decoder 2. In this case, the first information sent by the UE is information related to the mutual information of encoder 1 and decoder 1.

In a possible implementation, the first information includes at least one of first mutual information, second mutual information, a first ratio, a first neuron node size, and a first ratio range.
The first mutual information is an amount of information about an input included in an output of the first AI model. The second mutual information is an amount of information about an output included in an input of the second AI model. The first ratio is a ratio of the second mutual information to the first mutual information. The first neuron node size indicates an output format of the first AI model or an input format of the second AI model. The first ratio range is a range of multiple first ratios; the ratio range can also be a set of discrete values.

In a two-sided model, if the encoder is deployed at the UE side and the decoder is deployed at the BS side, X is the input of the encoder, Y is the output of the decoder, and T is the output of the encoder as well as the input of the decoder. T is exchanged over the air interface between the BS and the UE. In a two-sided model, if encoder 1 and decoder 1 are deployed at the UE side, X is the input of encoder 1, Y is the output of decoder 1, and T