KR-20260067956-A - METHOD FOR OPTIMIZING NETWORK USING EXTRACTED FEATURE FROM NETWORK AND ELECTRONIC DEVICE PERFORMING THE SAME
Abstract
The present disclosure provides a method for optimizing a network using features extracted from the network, and an electronic device for performing the same. The method may include: obtaining network entity data associated with each network entity from each of one or more network entities; for each of the one or more network entities, generating a network embedding based on the network entity data associated with that network entity using an encoder model; converting the network embeddings into a predetermined number of parameters using a transformation model; inputting the predetermined number of parameters into an inference model; and determining one or more parameters associated with control of the network based on an output of the inference model.
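The claimed pipeline (encoder → transformation model → inference model) can be illustrated with a minimal numerical sketch. Everything below is an illustrative assumption, not taken from the patent: the function names, dimensions, and weight matrices are invented, and the transformation step is sketched as cross-attention with a fixed set of learned queries, which is one way a self-attention-based model can map a variable number of entity embeddings to a fixed number of parameters.

```python
# Hypothetical sketch of the claimed pipeline; all names and shapes are
# illustrative assumptions, not the patented implementation.
import numpy as np

rng = np.random.default_rng(0)

def encode(entity_data, w_mu, w_logvar):
    """Encoder model: map entity data to a Gaussian (mean, log-variance)."""
    return entity_data @ w_mu, entity_data @ w_logvar

def sample_embedding(mu, logvar):
    """Reparameterized sample: embedding = mu + sigma * Gaussian noise."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def transform(embeddings, queries, w_v):
    """Attention-style pooling: a fixed set of learned queries attends over
    a variable number of entity embeddings, then each attended vector is
    reduced to one scalar, yielding a fixed number of parameters."""
    keys = embeddings                       # (n_entities, d)
    values = embeddings @ w_v               # (n_entities, d)
    scores = queries @ keys.T / np.sqrt(keys.shape[1])
    att = softmax(scores)                   # (n_params, n_entities)
    return (att @ values).mean(axis=1)      # (n_params,)

def infer(params, w_out):
    """Inference head: map the fixed-size parameter vector to control values."""
    return np.tanh(params @ w_out)

d_in, d, n_params = 8, 4, 6
w_mu = rng.standard_normal((d_in, d))
w_logvar = rng.standard_normal((d_in, d)) * 0.1
queries = rng.standard_normal((n_params, d))
w_v = rng.standard_normal((d, d))
w_out = rng.standard_normal((n_params, 3))

def pipeline(entity_data_list):
    embs = []
    for x in entity_data_list:              # one embedding per network entity
        mu, logvar = encode(x, w_mu, w_logvar)
        embs.append(sample_embedding(mu, logvar))
    embs = np.stack(embs)                   # combine the embeddings
    params = transform(embs, queries, w_v)  # fixed length regardless of count
    return infer(params, w_out)

out3 = pipeline([rng.standard_normal(d_in) for _ in range(3)])
out7 = pipeline([rng.standard_normal(d_in) for _ in range(7)])
print(out3.shape, out7.shape)  # same output shape despite 3 vs. 7 entities
```

The key structural property the claims rely on is visible here: because the transformation model pools any number of embeddings into a predetermined number of parameters, the inference model's input size stays fixed as network entities come and go.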
Inventors
- 송주환
- 김대진
- 김현지
- 남유진
- 이상현
- 이아현
- 장서우
- 최민석
- 최회상
Assignees
- 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Dates
- Publication Date
- 20260513
- Application Date
- 20250414
- Priority Date
- 20241106
Claims (20)
- A method comprising: obtaining network entity data associated with each network entity from each of one or more network entities; for each of the one or more network entities, generating a network embedding based on the network entity data associated with that network entity using an encoder model (204); converting the network embeddings into a predetermined number of parameters using a transformation model (208); inputting the predetermined number of parameters into an inference model (210); and determining one or more parameters associated with control of the network based on an output of the inference model (210).
- The method of claim 1, wherein generating the network embedding using the encoder model (204) comprises: inputting network entity data obtained from a first network entity among the one or more network entities into the encoder model (204); obtaining a probability distribution for the first network entity from the encoder model (204); and generating a network embedding for the first network entity by performing sampling based on the probability distribution for the first network entity.
- The method of claim 2, wherein generating the network embedding for the first network entity comprises: extracting a sample from the probability distribution for the first network entity; and generating the network embedding for the first network entity based on the sample and Gaussian noise.
- The method of any one of claims 1 to 3, wherein the transformation model (208) is a self-attention-based model, and converting the network embeddings into the predetermined number of parameters comprises: combining the network embeddings; inputting the combined network embeddings into the transformation model (208); and obtaining the predetermined number of parameters from the transformation model (208).
- The method of any one of claims 1 to 4, wherein the network entity data is obtained periodically, and the network embedding for each network entity is generated at each period in which the network entity data is obtained.
- The method of claim 5, wherein converting the network embeddings into the predetermined number of parameters comprises: combining the network embeddings generated at each period in which the network entity data is obtained; inputting the combined network embeddings into the transformation model (208) at each period in which the network entity data is obtained; and obtaining the predetermined number of parameters from the transformation model (208) at each period in which the network entity data is obtained.
- The method of any one of claims 1 to 6, wherein the inference model (210) outputs one or more values for determining one or more parameters or policies for a network entity included in the network based on the predetermined number of parameters.
- The method of any one of claims 1 to 6, wherein the inference model (210) outputs one or more values for predicting a performance indicator of the network or a state of the network based on the predetermined number of parameters.
- The method of any one of claims 1 to 8, wherein the encoder model (204) is trained by: inputting training network entity data for a single network entity into the encoder model (204); obtaining a probability distribution for the single network entity from the encoder model (204); obtaining a sample for the single network entity based on the probability distribution for the single network entity and Gaussian noise; restoring the network entity data for the single network entity based on the sample using a decoder model (602); calculating a loss function (616) based on the network entity data restored by the decoder model (602); and updating one or more weights of the encoder model (204) and one or more weights of the decoder model (602) based on the loss function (616).
- The method of any one of claims 1 to 9, wherein the transformation model (208) and the inference model (210) are trained by: generating a training network embedding for each network entity from training network entity data for at least one network entity using the encoder model (204); converting the training network embeddings into the predetermined number of parameters using the transformation model (208); inputting the parameters converted from the training network embeddings into the inference model (210); calculating a loss function (706) based on an output of the inference model (210) produced from the converted parameters; and updating the transformation model (208) and the inference model (210) based on the loss function (706).
- A computer-readable recording medium having recorded thereon a program for causing a computer to perform the method of any one of claims 1 to 10.
- An electronic device (900) comprising: at least one processor (902) including processing circuitry; and a memory (904) comprising one or more storage media storing one or more instructions, wherein the one or more instructions, when executed by the at least one processor (902), cause the electronic device (900) to: obtain network entity data associated with each network entity from each of one or more network entities; for each of the one or more network entities, generate a network embedding based on the network entity data associated with that network entity using an encoder model (204); convert the network embeddings into a predetermined number of parameters using a transformation model (208); input the predetermined number of parameters into an inference model (210); and determine one or more parameters associated with control of the network based on an output of the inference model (210).
- The electronic device (900) of claim 12, wherein the one or more instructions, when executed by the at least one processor (902), further cause the electronic device (900) to: input network entity data obtained from a first network entity among the one or more network entities into the encoder model (204); obtain a probability distribution for the first network entity using the encoder model (204); and generate a network embedding for the first network entity by performing sampling based on the probability distribution for the first network entity.
- The electronic device (900) of claim 13, wherein the one or more instructions, when executed by the at least one processor (902), further cause the electronic device (900) to: extract a sample from the probability distribution for the first network entity; and generate the network embedding for the first network entity based on the sample and Gaussian noise.
- The electronic device (900) of any one of claims 12 to 14, wherein the transformation model (208) is a self-attention-based model, and the one or more instructions, when executed by the at least one processor (902), further cause the electronic device (900) to: combine the network embeddings; input the combined network embeddings into the transformation model (208); and obtain the predetermined number of parameters from the transformation model (208).
- The electronic device (900) of any one of claims 12 to 15, wherein the network entity data is obtained periodically, and the network embedding for each network entity is generated at each period in which the network entity data is obtained.
- The electronic device (900) of claim 16, wherein the one or more instructions, when executed by the at least one processor (902), further cause the electronic device (900) to: combine the network embeddings generated at each period in which the network entity data is obtained; input the combined network embeddings into the transformation model (208) at each period in which the network entity data is obtained; and obtain the predetermined number of parameters from the transformation model (208) at each period in which the network entity data is obtained.
- The electronic device (900) of any one of claims 12 to 17, wherein the inference model (210) outputs one or more values for determining one or more parameters or policies for a network entity included in the network based on the predetermined number of parameters.
- The electronic device (900) of any one of claims 12 to 17, wherein the inference model (210) outputs one or more values for predicting a performance indicator of the network or a state of the network based on the predetermined number of parameters.
- The electronic device (900) of any one of claims 12 to 19, wherein the encoder model (204) is trained by: inputting training network entity data for a single network entity into the encoder model (204); obtaining a probability distribution for the single network entity from the encoder model (204); obtaining a sample for the single network entity based on the probability distribution for the single network entity and Gaussian noise; restoring the network entity data for the single network entity based on the sample using a decoder model (602); calculating a loss function (616) based on the network entity data restored by the decoder model (602); and updating one or more weights of the encoder model (204) and one or more weights of the decoder model (602) based on the loss function (616).
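The encoder training recited in claims 9 and 20 (sample from a probability distribution via Gaussian noise, restore the data with a decoder, compute a loss on the restored data) matches the structure of a variational-autoencoder objective. The sketch below shows a single loss evaluation under that reading; the specific loss terms (mean-squared reconstruction error plus a KL regularizer), the linear encoder/decoder, and all dimensions are illustrative assumptions, since the claims only require "a loss function based on the restored network entity data". The weight update itself (backpropagation) is omitted.

```python
# Hypothetical VAE-style loss evaluation for the encoder (204) / decoder (602)
# training loop; all names, dimensions, and loss terms are assumptions.
import numpy as np

rng = np.random.default_rng(1)

d_in, d = 8, 4
w_enc_mu = rng.standard_normal((d_in, d)) * 0.1      # encoder: mean head
w_enc_logvar = np.zeros((d_in, d))                   # encoder: log-variance head
w_dec = rng.standard_normal((d, d_in)) * 0.1         # decoder weights

x = rng.standard_normal(d_in)      # training network entity data (one entity)

# Encoder: probability distribution for the single network entity
mu = x @ w_enc_mu
logvar = x @ w_enc_logvar

# Sample based on the distribution and Gaussian noise (reparameterization)
eps = rng.standard_normal(d)
z = mu + np.exp(0.5 * logvar) * eps

# Decoder model restores the network entity data from the sample
x_hat = z @ w_dec

# Loss: reconstruction error on the restored data, plus a KL term keeping
# the encoder's distribution close to a standard Gaussian
recon = np.mean((x - x_hat) ** 2)
kl = -0.5 * np.mean(1 + logvar - mu**2 - np.exp(logvar))
loss = recon + kl
print(loss)
# The encoder and decoder weights would then be updated by gradient
# descent on this loss, per the final step of claims 9 and 20.
```

After training, only the encoder is kept for the optimization pipeline: its mean/variance outputs define the probability distribution from which each entity's network embedding is sampled.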
Description
Method for Optimizing a Network Using Features Extracted from a Network and Electronic Device for Performing the Same

The present disclosure relates to a wireless communication method and a wireless communication electronic device, and more specifically, to a method for optimizing a network using features extracted from the network and an electronic device for performing the same. Looking back at the evolution of wireless communication through successive generations, technologies have been developed primarily for human-oriented services, such as voice, multimedia, and data. Following the commercialization of 5G (5th-generation) communication systems, an explosively growing number of connected devices is expected to join communication networks. Examples of networked objects include vehicles, robots, drones, home appliances, displays, smart sensors installed in various infrastructures, construction machinery, and factory equipment. Mobile devices are expected to evolve into various form factors, such as augmented reality glasses, virtual reality headsets, and holographic devices. In the 6G (6th-generation) era, efforts are underway to develop improved communication systems that connect hundreds of billions of devices and objects in order to provide diverse services. For this reason, 6G communication systems are being referred to as "beyond 5G" systems. In the 6G communication system, predicted to be realized around 2030, the maximum transmission speed is expected to reach 1 terabit per second (i.e., 1,000 gigabits per second) with a wireless latency of 100 microseconds (μs). In other words, compared to the 5G communication system, the transmission speed of the 6G communication system is 50 times faster, and the wireless latency is reduced to one-tenth. To achieve such high data transmission speeds and ultra-low latency, implementation of 6G communication systems in the terahertz band (e.g., the 95 GHz to 3 terahertz (3 THz) band) is under consideration.
In the terahertz band, path loss and atmospheric absorption are more severe than in the millimeter wave (mmWave) band introduced in 5G, so technologies that can guarantee signal reach, or coverage, are expected to grow in importance. As key technologies for ensuring coverage, radio frequency (RF) devices, antennas, new waveforms offering better coverage than orthogonal frequency division multiplexing (OFDM), beamforming, and multi-antenna transmission technologies such as massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, and large-scale antennas must be developed. In addition, new technologies such as metamaterial-based lenses and antennas, high-dimensional spatial multiplexing using orbital angular momentum (OAM), and reconfigurable intelligent surfaces (RIS) are being discussed to improve coverage of terahertz band signals. Furthermore, to improve frequency efficiency and system networks, several technologies are under development for 6G communication systems: full duplex, in which uplink and downlink simultaneously use the same frequency resources; network technology integrating satellites and high-altitude platform stations (HAPS); network structure innovation supporting mobile base stations and enabling optimization and automation of network operation; dynamic spectrum sharing through collision avoidance based on spectrum usage prediction; AI-based communication that applies artificial intelligence (AI) from the design stage and internalizes end-to-end AI support functions to realize system optimization; and next-generation distributed computing that delivers services whose complexity exceeds the limits of terminal computing capabilities by utilizing ultra-high-performance communication and computing resources (mobile edge computing (MEC), cloud, and the like).
In addition, efforts continue to further strengthen connectivity between devices, further optimize networks, promote the softwarization of network entities, and increase the openness of wireless communication through the design of new protocols for 6G communication systems, the implementation of hardware-based security environments, the development of mechanisms for the safe utilization of data, and the development of technologies for maintaining privacy. Through the research and development of such 6G communication systems, a new dimension of hyper-connected experience is expected to become possible via the hyper-connectivity of 6G, which encompasses not only connections between objects but also connections between people and objects. Specifically, 6G communication systems are projected to enable services such as truly immersive extended reality (truly immersive XR), high-fidelity mobile holograms, and digital replicas. Furthermore, services such as remote surgery, industrial automation, and emergency re