KR-20260067322-A - APPARATUS AND METHOD FOR LOW-POWER LOW-COST LOW-COMPLEXITY INTEGRATED VIRTUAL AND REAL TRANSMISSION AND RECEPTION IN WIRELESS COMMUNICATION SYSTEM

KR 20260067322 A

Abstract

The present disclosure generally relates to wireless communication systems and, more specifically, to an apparatus and method for low-power, low-cost, low-complexity integrated virtual-and-real transmission and reception in a wireless communication system. In a method of operating a terminal for XR service and obstacle avoidance according to the present disclosure, the terminal provides 3GPP sensing data and non-3GPP sensing data to a sensing RF map service, receives from the service a sensing result that includes RF environment mapping information, and provides an XR service while avoiding obstacles in the real environment based on the received sensing result. In addition, in a method of operating a TRP for UAV sensing according to the present disclosure, the TRP transmits on/off time information for an FSK-embedded backscattering function to a sensing target UAV, transmits a sensing signal at the set time, receives an FSK backscattering signal from the sensing target UAV, and estimates the radio propagation delay and movement speed of the sensing target UAV based on the received FSK backscattering signal.
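As a rough illustration of the terminal-to-service exchange described above, the sketch below models the data shapes a sensing RF map service might consume and produce. Every class, field, and function name here is hypothetical, invented for illustration; the disclosure does not specify these structures.

```python
from dataclasses import dataclass

# Hypothetical data shapes for the exchange described in the abstract;
# none of the names below come from the disclosure itself.

@dataclass
class SensingReport:
    rf_sensing: list      # 3GPP RF sensing samples
    imu: tuple            # non-3GPP: IMU (accel/gyro) reading
    headset_pose: tuple   # non-3GPP: headset attitude and position
    speed_mps: float      # non-3GPP: terminal speed

@dataclass
class RFEnvironmentMap:
    ran_entities: list    # RAN entity location information
    reflectors: list      # reflector information
    static_blockers: list # static blocking information
    blocking_events: list # wireless link blocking event information

def sensing_rf_map_service(report: SensingReport) -> RFEnvironmentMap:
    """Toy stand-in for the network-side sensing RF map service: fuses a
    terminal's report into RF environment mapping information."""
    # A real service would fuse data from many terminals and TRPs; here we
    # simply flag blockers reported in the RF sensing samples.
    blockers = [s for s in report.rf_sensing if s.get("blocked")]
    return RFEnvironmentMap(
        ran_entities=[(0.0, 0.0, 10.0)],   # assumed gNB position
        reflectors=[],
        static_blockers=[b["position"] for b in blockers],
        blocking_events=["LOS_blocked"] if blockers else [],
    )

report = SensingReport(
    rf_sensing=[{"blocked": True, "position": (3.0, 1.0, 0.0)}],
    imu=(0.0, 0.0, 9.8),
    headset_pose=(1.0, 2.0, 1.7),
    speed_mps=1.2,
)
rf_map = sensing_rf_map_service(report)
# The terminal would route the XR user around rf_map.static_blockers.
```

The terminal-side obstacle-avoidance logic would then consume `rf_map` while rendering the XR scene.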

Inventors

  • 장갑석
  • 김용선
  • 조원철
  • 고영조

Assignees

  • 한국전자통신연구원 (Electronics and Telecommunications Research Institute, ETRI)

Dates

Publication Date
2026-05-12
Application Date
2025-10-28
Priority Date
2024-11-05

Claims (20)

  1. A method of operating a terminal that supports virtual-and-real fusion for Extended Reality (XR) services and obstacle avoidance in a wireless communication system, the method comprising: providing 3rd Generation Partnership Project (3GPP) sensing data and non-3GPP sensing data to a sensing Radio Frequency (RF) map service; receiving a sensing result including RF environment mapping information from the sensing RF map service; and providing an XR service while avoiding obstacles in the real environment based on the received sensing result.
  2. The method of claim 1, wherein the 3GPP sensing data includes RF sensing data, and the non-3GPP sensing data includes at least one of Inertial Measurement Unit (IMU) sensor data, Red-Green-Blue (RGB) camera data, headset attitude and position data, speed data, or environment image data.
  3. The method of claim 1, wherein the sensing result includes RF environment mapping information comprising Radio Access Network (RAN) entity location information, reflector information, static blocking information, and wireless link blocking event information.
  4. The method of claim 1, further comprising transmitting control information for XR video rendering to an application server based on a tiled rendering architecture.
  5. The method of claim 1, further comprising providing a communication reference signal measurement value or report to the sensing RF map service.
  6. A method of operating a Transmission and Reception Point (TRP) for sensing an Unmanned Aerial Vehicle (UAV) in a wireless communication system, the method comprising: transmitting on/off time information for a Frequency Shift Keying (FSK)-embedded backscattering function to a sensing target UAV; transmitting a sensing signal at a set time; receiving an FSK backscattering signal from the sensing target UAV; and estimating the radio propagation delay and movement speed of the sensing target UAV based on the received FSK backscattering signal.
  7. The method of claim 6, further comprising performing target identification and clutter noise avoidance by setting a different frequency shift value for each sensing target UAV.
  8. The method of claim 6, further comprising transmitting sensing results from a plurality of TRPs to a sensing application server to estimate the location, movement path, shape, and size of the sensing target UAV.
  9. The method of claim 6, wherein the FSK backscattering signal is generated by shifting the frequency of the received signal by a preset frequency shift value.
  10. The method of claim 6, wherein the estimating of the radio propagation delay and movement speed achieves ultra-precision sensing accuracy.
  11. A terminal that supports virtual-and-real fusion for Extended Reality (XR) services and obstacle avoidance in a wireless communication system, the terminal comprising: a transceiver; and a processor operably connected to the transceiver, wherein the processor is configured to provide 3rd Generation Partnership Project (3GPP) sensing data and non-3GPP sensing data to a sensing Radio Frequency (RF) map service, receive a sensing result including RF environment mapping information from the sensing RF map service, and provide an XR service while avoiding obstacles in the real environment based on the received sensing result.
  12. The terminal of claim 11, wherein the 3GPP sensing data includes RF sensing data, and the non-3GPP sensing data includes at least one of Inertial Measurement Unit (IMU) sensor data, Red-Green-Blue (RGB) camera data, headset attitude and position data, speed data, or environment image data.
  13. The terminal of claim 11, wherein the sensing result comprises RF environment mapping information including Radio Access Network (RAN) entity location information, reflector information, static blocking information, and wireless link blocking event information.
  14. The terminal of claim 11, wherein the processor is further configured to transmit control information for XR video rendering to an application server based on a tiled rendering architecture.
  15. The terminal of claim 11, wherein the processor is further configured to provide a communication reference signal measurement value or report to the sensing RF map service.
  16. A Transmission and Reception Point (TRP) for Unmanned Aerial Vehicle (UAV) sensing in a wireless communication system, the TRP comprising: a transceiver; and a processor operably connected to the transceiver, wherein the processor is configured to transmit on/off time information for a Frequency Shift Keying (FSK)-embedded backscattering function to a sensing target UAV, transmit a sensing signal at a set time, receive an FSK backscattering signal from the sensing target UAV, and estimate the radio propagation delay and movement speed of the sensing target UAV based on the received FSK backscattering signal.
  17. The TRP of claim 16, wherein the processor is further configured to perform target identification and clutter noise avoidance by setting a different frequency shift value for each sensing target UAV.
  18. The TRP of claim 16, wherein the processor is further configured to transmit sensing results, together with a plurality of TRPs, to a sensing application server to estimate the location, movement path, shape, and size of the sensing target UAV.
  19. The TRP of claim 16, wherein the FSK backscattering signal is generated by shifting the frequency of the received signal by a preset frequency shift value.
  20. The TRP of claim 16, wherein the estimation of the radio propagation delay and movement speed achieves ultra-precision sensing accuracy.
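The TRP-side procedure of claims 6-10 can be sketched numerically. The toy simulation below uses assumed parameters (carrier frequency, sample rate, and the assigned FSK shift values are all invented for illustration, not taken from the patent): the TRP knows the on/off gating schedule it signalled to the UAV, locates the backscatter's spectral peak, identifies the UAV by the nearest assigned FSK shift, reads the radial speed from the residual Doppler, and reads the propagation delay from a circular correlation against the known gating.

```python
import numpy as np

# Simulation parameters -- assumed for illustration, not from the patent.
fs = 1_000_000            # sample rate, Hz
N = 1 << 16               # samples in one sensing window
fc = 3.5e9                # sensing carrier, Hz
c = 3.0e8                 # speed of light, m/s
t = np.arange(N) / fs

# Claim 7: each sensing-target UAV gets a distinct FSK shift, separating
# targets from each other and from zero-shift clutter.
assigned_shifts = {"UAV-1": 100_000.0, "UAV-2": 200_000.0}

# --- Simulated echo from UAV-1 -------------------------------------------
true_speed = 20.0                       # radial speed, m/s
f_doppler = 2 * true_speed * fc / c     # two-way Doppler shift, ~467 Hz
true_delay = 150                        # propagation delay, in samples
rng = np.random.default_rng(0)
env = rng.integers(0, 2, N).astype(float)   # known on/off backscatter gating
rx = np.roll(env, true_delay) * np.exp(
    2j * np.pi * (assigned_shifts["UAV-1"] + f_doppler) * t)

# --- TRP-side estimation --------------------------------------------------
# 1) The strongest spectral peak sits at (assigned shift + Doppler).
peak_freq = np.fft.fftfreq(N, 1 / fs)[np.argmax(np.abs(np.fft.fft(rx)))]

# 2) Identify the target by the nearest assigned FSK shift (claim 7).
target = min(assigned_shifts, key=lambda k: abs(peak_freq - assigned_shifts[k]))

# 3) The residual beyond the assigned shift is Doppler -> radial speed.
speed_est = (peak_freq - assigned_shifts[target]) * c / (2 * fc)

# 4) Propagation delay: circularly correlate the received on/off envelope
#    against the schedule signalled to the UAV in advance (claim 6).
corr = np.fft.ifft(np.fft.fft(np.abs(rx)) * np.conj(np.fft.fft(env))).real
delay_est = int(np.argmax(corr))
```

A real system would additionally fuse the delay and Doppler estimates from a plurality of TRPs at a sensing application server, as in claim 8, to recover the UAV's location, path, shape, and size.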

Description

Apparatus and Method for Low-Power, Low-Cost, Low-Complexity Integrated Virtual and Real Transmission and Reception in Wireless Communication System

The present disclosure generally relates to wireless communication systems and, more specifically, to an apparatus and method for low-power, low-cost, low-complexity virtual-and-real fusion transmission and reception in a wireless communication system.

With the advent of 6G mobile communication systems, streaming and gaming services utilizing Extended Reality (XR) are gaining attention. However, current XR services are limited to users in fixed locations, and there are clear technical limitations to enjoying such services while moving freely in real-world environments where various obstacles exist. New transmission and reception technologies are therefore required to provide seamless XR services while effectively avoiding real-world obstacles even while the user is on the move.

In the smart factory sector, camera-based positioning technology is primarily used for the precise control of autonomous robots (AGVs/AMRs), but it entails high system implementation and maintenance costs. Furthermore, assembly and painting operations on conveyor belts rely on basic Wi-Fi systems or wired robots, which lack flexibility and incur massive costs when equipment is modified. Accordingly, there is a need for wireless robot control technology that enables ultra-precise control while operating at low power.

Although Frequency Modulated Continuous Wave (FMCW) radar is widely used in autonomous vehicles, it has difficulty detecting objects that suddenly appear in non-line-of-sight conditions. Furthermore, because all vehicles use radar signals of the same specifications, mutual interference causes the "ghost target" phenomenon, in which non-existent objects are detected, seriously threatening safety. The development of low-cost, low-power automotive sensing and communication technologies that solve the ghost target problem and improve precision is therefore an urgent task.

Finally, the use of unmanned aerial vehicles (UAVs) is expected to surge, which could lead to various safety issues such as collisions over urban areas, unauthorized intrusion into restricted areas, and deviations from flight paths. To manage numerous UAVs safely and efficiently, ultra-precision, ultra-low-power transmission and reception technology is essential to precisely track the flight path of each UAV, prevent collision risks in advance, and detect unauthorized intrusions.

FIG. 1 illustrates a case of virtual-and-real fusion in which a terminal can enjoy a virtual XR service and avoid real-world obstacles even in a moving environment, according to one embodiment of the present disclosure.

FIG. 2 illustrates an example of controlling a mobile robot for autonomous driving, such as an Automated Guided Vehicle (AGV) or Autonomous Mobile Robot (AMR) in a smart factory, or for performing product assembly/painting on a conveyor belt, with the help of an ultra-precision, ultra-low-power wireless communication system according to one embodiment of the present disclosure.

FIG. 3 illustrates a case of transmission and reception between a wireless communication system and a vehicle that enables autonomous driving without ghost targets, with low complexity, low cost, low power, and high precision, according to one embodiment of the present disclosure.

FIG. 4 illustrates a transmission and reception case in which the flight path of an Unmanned Aerial Vehicle (UAV) is tracked, a collision is avoided, or a UAV intrusion is estimated with the help of an ultra-precision, ultra-low-power wireless communication system, according to one embodiment of the present disclosure.

FIG. 5 is a flowchart illustrating a method of operating a terminal that supports virtual-and-real fusion for Extended Reality (XR) services and obstacle avoidance according to one embodiment of the present disclosure.

FIG. 6 is a flowchart illustrating a method of operating a Transmission and Reception Point (TRP) for Unmanned Aerial Vehicle (UAV) sensing according to one embodiment of the present disclosure.

FIG. 7 is a diagram of an apparatus configuration according to one embodiment of the present disclosure.

The terms used in this disclosure are used merely to describe specific embodiments and are not intended to limit the scope of other embodiments. A singular expression may include a plural expression unless the context clearly indicates otherwise. Terms used herein, including technical or scientific terms, may have the same meaning as generally understood by those skilled in the art to which this disclosure pertains. Terms used in this disclosure that are defined in a general dictionary may be interpreted as having the same or similar meaning as in the context of the relevant technology, and are not to be interpreted in an ideal or ov