
US-20260127414-A1 - METHOD OF OPTIMIZING NETWORK BY USING FEATURE EXTRACTED FROM NETWORK AND ELECTRONIC DEVICE FOR PERFORMING THE METHOD

US20260127414A1

Abstract

A method includes: obtaining network entity data associated with each network entity, from each of one or more network entities; generating, using an encoder model, network embeddings for the one or more network entities, based on the network entity data; converting, using a transformation model, the network embeddings into a predefined number of parameters; inputting the predefined number of parameters to an inference model; obtaining, from the inference model, an output regarding the predefined number of parameters; and determining, based on the output of the inference model, one or more parameters associated with control of a network.
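The abstract describes a three-stage pipeline: an encoder model produces one embedding per network entity, a transformation model collapses the variable-sized set of embeddings into a predefined number of parameters, and an inference model consumes those parameters to drive network control. The following is a minimal NumPy sketch of that data flow; all dimensions, the linear models, and the mean-pooling stand-in for the transformation model are illustrative assumptions, not taken from the specification.

```python
# Hypothetical sketch of the claimed pipeline. The toy linear "models"
# and mean pooling (standing in for the self-attention transformation
# model of claim 4) are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

EMBED_DIM = 8    # size of each network embedding (assumed)
NUM_PARAMS = 4   # the "predefined number of parameters" (assumed)

W_enc = rng.normal(size=(16, EMBED_DIM))            # toy encoder weights
W_trans = rng.normal(size=(EMBED_DIM, NUM_PARAMS))  # toy transformation
W_inf = rng.normal(size=(NUM_PARAMS, 1))            # toy inference model

def encode(entity_data):
    """Encoder model: one embedding per network entity."""
    return entity_data @ W_enc

def transform(embeddings):
    """Transformation model: collapse any number of embeddings into
    exactly NUM_PARAMS values, regardless of how many entities report."""
    pooled = embeddings.mean(axis=0)
    return pooled @ W_trans

def infer(params):
    """Inference model: output used to determine control parameters."""
    return params @ W_inf

# Three entities, each reporting a 16-dimensional data vector.
entity_data = rng.normal(size=(3, 16))
embeddings = encode(entity_data)   # shape (3, EMBED_DIM)
params = transform(embeddings)     # shape (NUM_PARAMS,)
output = infer(params)             # shape (1,)
print(params.shape, output.shape)  # (4,) (1,)
```

Note that the fixed-size bottleneck is what lets the inference model stay unchanged as entities join or leave the network, which appears to be the point of converting embeddings into a predefined number of parameters.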

Inventors

  • Juhwan SONG
  • Hoesang CHOI
  • Minsuk Choi
  • Seowoo JANG
  • Ahyun LEE
  • Sanghyun Lee
  • Yujin NAM
  • Hyunjee KIM
  • Daejin Kim

Assignees

  • SAMSUNG ELECTRONICS CO., LTD.

Dates

Publication Date
20260507
Application Date
20251105
Priority Date
20241106

Claims (20)

  1. A method comprising: obtaining network entity data associated with each network entity, from each of one or more network entities; generating, using an encoder model, network embeddings for the one or more network entities, based on the network entity data; converting, using a transformation model, the network embeddings into a predefined number of parameters; inputting the predefined number of parameters to an inference model; obtaining, from the inference model, an output regarding the predefined number of parameters; and determining, based on the output of the inference model, one or more parameters associated with control of a network.
  2. The method of claim 1, wherein the generating, using the encoder model, of the network embeddings comprises: inputting, to the encoder model, network entity data obtained from a first network entity among the one or more network entities; obtaining, from the encoder model, a probability distribution for the first network entity; and generating a network embedding for the first network entity by performing sampling based on the probability distribution for the first network entity.
  3. The method of claim 2, wherein the generating of the network embedding for the first network entity comprises: extracting a sample from the probability distribution for the first network entity; and generating the network embedding for the first network entity based on the sample and Gaussian noise.
  4. The method of claim 1, wherein the transformation model is based on self-attention, and wherein the converting of the network embeddings into the predefined number of parameters comprises: combining the network embeddings; inputting the combined network embeddings to the transformation model; and obtaining, from the transformation model, the predefined number of parameters.
  5. The method of claim 1, wherein the obtaining of the network entity data comprises obtaining the network entity data periodically, and wherein the generating of the network embeddings for the one or more network entities comprises generating the network embedding for each network entity for each period in which the network entity data is obtained.
  6. The method of claim 5, wherein the converting of the network embeddings into the predefined number of parameters comprises: combining network embeddings generated for each period in which the network entity data is obtained; inputting, to the transformation model, the network embeddings combined for each period in which the network entity data is obtained; and obtaining, from the transformation model, the predefined number of parameters for each period in which the network entity data is obtained.
  7. The method of claim 1, wherein the inference model is configured to output one or more values for determining one or more parameters or policies for network entities in the network, based on the predefined number of parameters.
  8. The method of claim 1, wherein the inference model is configured to output one or more values for predicting a performance indicator of the network or a state of the network, based on the predefined number of parameters.
  9. The method of claim 1, wherein the encoder model is trained by: inputting, to the encoder model, training network entity data for a single network entity; obtaining, from the encoder model, a probability distribution for the single network entity; obtaining a sample for the single network entity based on the probability distribution for the single network entity and Gaussian noise; reconstructing, using a decoder model, network entity data for the single network entity, based on the sample; calculating a loss function based on the network entity data reconstructed by the decoder model; and updating one or more weights of the encoder model and one or more weights of the decoder model, based on the loss function.
  10. The method of claim 1, wherein the inference model and the transformation model are trained by: generating, using the encoder model, a training network embedding for each network entity from training network entity data for at least one network entity; converting, using the transformation model, the training network embeddings into the predefined number of parameters; inputting, to the inference model, the parameters converted from the training network embeddings; obtaining, from the inference model, an output regarding the converted parameters; calculating a loss function based on the output of the inference model and based on the parameters converted from the training network embeddings; and updating the transformation model and the inference model based on the loss function.
  11. A computer-readable recording medium having recorded thereon a program for performing the method of claim 1 on a computer.
  12. An electronic device comprising: at least one processor; and memory storing one or more instructions, wherein the one or more instructions, when executed by the at least one processor individually or collectively, cause the electronic device to: obtain, from each of one or more network entities, network entity data associated with each network entity; generate, using an encoder model, network embeddings for the one or more network entities, based on the network entity data; convert, using a transformation model, the network embeddings into a predefined number of parameters; input the predefined number of parameters to an inference model; obtain, from the inference model, an output regarding the predefined number of parameters; and determine, based on the output of the inference model, one or more parameters associated with control of a network.
  13. The electronic device of claim 12, wherein the one or more instructions, when executed by the at least one processor individually or collectively, further cause the electronic device to: input, to the encoder model, network entity data obtained from a first network entity among the one or more network entities; obtain, using the encoder model, a probability distribution for the first network entity; and generate a network embedding for the first network entity by performing sampling based on the probability distribution for the first network entity.
  14. The electronic device of claim 13, wherein the one or more instructions, when executed by the at least one processor individually or collectively, further cause the electronic device to: extract a sample from the probability distribution for the first network entity; and generate the network embedding for the first network entity, based on the sample and Gaussian noise.
  15. The electronic device of claim 12, wherein the transformation model is based on self-attention, and wherein the one or more instructions, when executed by the at least one processor individually or collectively, further cause the electronic device to: combine the network embeddings; input the combined network embeddings to the transformation model; and obtain, from the transformation model, the predefined number of parameters.
  16. The electronic device of claim 12, wherein the network entity data is obtained periodically, and wherein the network embedding for each network entity is generated for each period in which the network entity data is obtained.
  17. The electronic device of claim 16, wherein the one or more instructions, when executed by the at least one processor individually or collectively, further cause the electronic device to: combine network embeddings generated for each period in which the network entity data is obtained; input, to the transformation model, the network embeddings combined for each period in which the network entity data is obtained; and obtain, from the transformation model, the predefined number of parameters for each period in which the network entity data is obtained.
  18. The electronic device of claim 12, wherein the inference model is configured to output one or more values for determining one or more parameters or policies for network entities in the network, based on the predefined number of parameters.
  19. The electronic device of claim 12, wherein the inference model is configured to output one or more values for predicting a performance indicator of the network or a state of the network, based on the predefined number of parameters.
  20. The electronic device of claim 12, wherein the encoder model is trained by: inputting, to the encoder model, training network entity data for a single network entity; obtaining a probability distribution for the single network entity from the encoder model; obtaining a sample for the single network entity based on the probability distribution for the single network entity and Gaussian noise; reconstructing, using a decoder model, network entity data for the single network entity based on the sample; calculating a loss function based on the network entity data reconstructed by the decoder model; and updating one or more weights of the encoder model and one or more weights of the decoder model, based on the loss function.
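Claims 3 and 9 (and their device counterparts, claims 14 and 20) describe drawing the network embedding from an encoder-produced probability distribution using Gaussian noise, and training the encoder/decoder pair with a loss computed on the reconstructed data. The sampling step reads like the standard reparameterization trick. Below is a hedged sketch under that reading; the Gaussian parameterization, dimensions, and squared-error loss are illustrative assumptions, since the claims do not fix a specific distribution or loss function.

```python
# Hypothetical illustration of the sampling (claim 3) and training-loss
# (claim 9) steps. The (mu, log_var) parameterization and mean-squared
# reconstruction loss are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(42)

def sample_embedding(mu, log_var, rng):
    """Claim 3: generate a network embedding from the probability
    distribution and Gaussian noise (reparameterization trick)."""
    eps = rng.standard_normal(mu.shape)   # Gaussian noise
    return mu + np.exp(0.5 * log_var) * eps

def reconstruction_loss(x, x_hat):
    """Claim 9: loss based on the decoder-reconstructed entity data."""
    return float(np.mean((x - x_hat) ** 2))

# Toy encoder output for a single network entity.
mu = np.zeros(8)
log_var = np.full(8, -2.0)                # small assumed variance
z = sample_embedding(mu, log_var, rng)    # the network embedding

# A deliberately imperfect "reconstruction" to exercise the loss.
x = rng.standard_normal(8)
x_hat = x + 0.1
loss = reconstruction_loss(x, x_hat)
print(round(loss, 2))  # 0.01
```

Sampling through noise rather than taking `mu` directly is what keeps the sampling step differentiable, so the encoder and decoder weights can both be updated from the reconstruction loss as claim 9 requires.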

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a by-pass continuation application of International Application No. PCT/KR2025/016565, filed on Oct. 20, 2025, which is based on and claims priority to Korean Patent Application No. 10-2024-0156468, filed on Nov. 6, 2024, and Korean Patent Application No. 10-2025-0048364, filed on Apr. 14, 2025, in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entireties.

BACKGROUND

1. Field

The disclosure relates to a wireless communication method and a wireless communication electronic device, and more particularly, to a method of optimizing a network by using a feature extracted from the network and an electronic device for performing the method.

2. Description of Related Art

Wireless communication technologies have been developed mainly for services targeting humans, such as voice calls, multimedia services, and data services. Since the development of 5th generation (5G) communication systems, the number of devices connected to communication networks has grown rapidly. Examples of devices connected to networks include vehicles, robots, drones, home appliances, displays, smart sensors connected to various infrastructures, construction machines, and factory equipment. Mobile devices are expected to evolve into various form factors, such as augmented reality glasses, virtual reality headsets, and hologram devices. In order to provide various services by connecting hundreds of billions of devices and things in the 6th generation (6G) era, there have been ongoing efforts to develop improved 6G communication systems.
For these reasons, 6G communication systems are referred to as 'beyond-5G systems'. 6G communication systems are expected to have a peak data rate of tera (1,000 giga)-level bps and a radio latency of less than 100 μsec, and thus will be 50 times as fast as 5G communication systems and have 1/10 their radio latency. To accomplish such a high data rate and ultra-low latency, implementing 6G communication systems in the terahertz band (e.g., 95 GHz to 3 THz) has been considered. Due to more severe path loss and atmospheric absorption in the terahertz bands than in the mmWave bands introduced in 5G, technologies capable of securing the signal transmission distance (i.e., coverage) will become more crucial. Major candidate technologies for securing coverage include radio frequency (RF) elements, antennas, novel waveforms having better coverage than orthogonal frequency division multiplexing (OFDM), beamforming and massive multiple-input multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), array antennas, and multi-antenna transmission technologies such as large-scale antennas. In addition, there has been ongoing discussion of new technologies for improving the coverage of terahertz-band signals, such as metamaterial-based lenses and antennas, orbital angular momentum (OAM), and reconfigurable intelligent surfaces (RIS).
Moreover, in order to improve spectral efficiency and overall network performance, the following technologies have been developed for 6G communication systems: a full-duplex technology enabling uplink and downlink transmissions to simultaneously use the same frequency resource; a network technology for utilizing satellites, high-altitude platform stations (HAPS), and the like in an integrated manner; an improved network structure for supporting mobile base stations and enabling network operation optimization and automation; a dynamic spectrum sharing technology using collision avoidance based on predicted spectrum usage; the use of artificial intelligence (AI) in wireless communication, applying AI from the design phase of 6G and internalizing end-to-end AI support functions to improve overall network operation; and a next-generation distributed computing technology for overcoming the limits of user equipment (UE) computing capability by using super-high-performance communication and computing resources (such as mobile edge computing (MEC) and clouds) reachable over the network.

In addition, through designing new protocols to be used in 6G communication systems, developing mechanisms for implementing a hardware-based security environment and safe use of data, and developing technologies for maintaining privacy, attempts continue to strengthen connectivity between devices, optimize the network, promote softwarization of network entities, and increase the openness of wireless communications. Research and development of 6G communication systems in hyper-connectivity, including 'person to machine' (P2M) as well as 'machine to machine' (M2M), will allow the next hyper-connected experience. Particularly, services such as truly immersive 'extended reality' (XR), high-fidelity mobile hologram, and digita