KR-20260066546-A - ELECTRONIC DEVICE FOR GUIDING POSITION OF OBJECT AND METHOD FOR OPERATING THE SAME
Abstract
A method of operating an electronic device may include: storing a video obtained by recording a video call with an external device, together with motion data of the electronic device; upon receiving a request from the external device to show a specific object again, detecting in the video a target object corresponding to the object included in the request; determining a reference timestamp at the point in time when the target object is detected in the video; determining, based on the motion data, a movement path from the current location of the electronic device to the location of the electronic device at the reference timestamp, and displaying a first user interface that guides movement to the location of the electronic device at the reference timestamp; and displaying a second user interface that guides capture of the target object.
Inventors
- Ngo The Bao
- Duong Thi Thanh Phuong
- Nguyen Minh Thuy
- Hoang Trung Kien
- Trinh Thi Thu Hien
- Nguyen Thi Thu Huong
- Vu Van Tinh
Assignees
- Samsung Electronics Co., Ltd.
Dates
- Publication Date: 2026-05-12
- Application Date: 2024-11-04
Claims (20)
- A method of operating an electronic device that guides the location of an object, the method comprising: storing a video obtained by recording a video call with an external device and motion data of the electronic device (S210); upon receiving a request from the external device to show a specific object again, detecting in the video a target object corresponding to the object included in the request (S220); determining a reference timestamp at the point in time when the target object is detected in the video (S230); determining, based on the motion data, a movement path from the current position of the electronic device to the position of the electronic device at the reference timestamp, and displaying a first user interface that guides movement along the movement path to the position of the electronic device at the reference timestamp (S240); and displaying a second user interface that guides capture of the target object (S250).
- The method of claim 1, wherein detecting the target object corresponding to the object included in the request in the video (S220) comprises: receiving the request from the external device to show the specific object again (S410); detecting in the video at least one candidate object corresponding to the object included in the request, and obtaining a confidence score of each candidate object at each of at least one timestamp at which the at least one candidate object is detected (S420); identifying, for each of the at least one candidate object, an image frame at the timestamp at which the confidence score is maximum (S430); transmitting the identified image frame of each of the at least one candidate object to the external device (S440); and upon receiving from the external device a user input selecting a candidate object from among the at least one candidate object, determining the selected candidate object as the target object based on the user input (S450).
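The candidate-selection logic of steps S420 to S430 can be illustrated with a short sketch. The snippet below is an illustration only: the function name `best_frames` and the `(object_id, timestamp, confidence)` tuple layout are assumptions, not part of the disclosure. For each detected candidate object it keeps the timestamp at which detection confidence peaks, i.e. the frame whose image would be sent to the external device in step S440.

```python
def best_frames(detections):
    """detections: list of (object_id, timestamp, confidence) tuples.

    Returns {object_id: (timestamp, confidence)} at peak confidence,
    one entry per candidate object.
    """
    best = {}
    for obj_id, ts, conf in detections:
        # Keep the detection with the highest confidence per object.
        if obj_id not in best or conf > best[obj_id][1]:
            best[obj_id] = (ts, conf)
    return best
```

In practice the confidence would come from an object detector run over the recorded video frames; here it is simply given as input.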
- The method of any one of claims 1 to 2, wherein determining the reference timestamp at the point in time when the target object is detected in the video (S230) comprises: obtaining a confidence score of the target object at each of a plurality of candidate timestamps at which the target object is detected in the video (S610); identifying at least one candidate timestamp, among the plurality of candidate timestamps, at which the confidence score of the target object is greater than or equal to a preset threshold score (S620); and determining, as the reference timestamp, the candidate timestamp among the at least one candidate timestamp at which the size of the corresponding target object is maximum (S630).
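A minimal sketch of the timestamp selection in steps S610 to S630, assuming detections are given as `(timestamp, confidence, bbox_area)` tuples (the function name, data layout, and default threshold are hypothetical): candidates below the threshold score are discarded, and among the remainder the timestamp with the largest object size is chosen.

```python
def reference_timestamp(candidates, threshold=0.5):
    """candidates: list of (timestamp, confidence, bbox_area) tuples.

    Filters out detections whose confidence is below `threshold` (S620),
    then returns the timestamp where the object appears largest (S630).
    Returns None if no candidate passes the threshold.
    """
    eligible = [c for c in candidates if c[1] >= threshold]
    if not eligible:
        return None
    # Pick the candidate with the maximum bounding-box area.
    return max(eligible, key=lambda c: c[2])[0]
```

The variant in the following claim differs only in taking the minimum area instead of the maximum.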
- The method of any one of claims 1 to 2, wherein determining the reference timestamp at the point in time when the target object is detected in the video (S230) comprises: obtaining a confidence score of the target object at each of a plurality of candidate timestamps at which the target object is detected in the video (S610); identifying at least one candidate timestamp, among the plurality of candidate timestamps, at which the confidence score of the target object is greater than or equal to a preset threshold score (S620); and determining, as the reference timestamp, the candidate timestamp among the at least one candidate timestamp at which the size of the corresponding target object is smallest (S630-1).
- The method of any one of claims 1 to 2, wherein determining the reference timestamp at the point in time when the target object is detected in the video (S230) comprises: determining, as the reference timestamp, the most recent candidate timestamp among a plurality of candidate timestamps at which the target object is detected in the video (S1510).
- The method of any one of claims 1 to 5, further comprising: determining whether the target object is captured based on whether the objects detected in the video match the image frame of the target object at the reference timestamp (S1040); and displaying a third user interface indicating that the target object is being captured, based on the target object being captured (S1050).
- The method of any one of claims 1 to 6, wherein displaying the first user interface that guides movement to the position of the electronic device at the reference timestamp (S240) comprises: determining, based on the motion data, the movement path from the current position of the electronic device to the position of the electronic device at the reference timestamp (S810); displaying the first user interface indicating the determined movement path and the real-time position of the electronic device (S820); and displaying a popup indicating proximity to the target object based on the distance between the real-time position of the electronic device and the position at the reference timestamp being less than or equal to a preset threshold distance (S830).
- The method of any one of claims 1 to 7, wherein displaying the second user interface that guides capture of the target object (S250) comprises: determining, within the same coordinate system, a first coordinate of the current position of the electronic device, a second coordinate of the position of the electronic device at the reference timestamp, and a third coordinate of the position of the target object (S1010); determining a movement direction and angle for capturing the target object based on the first coordinate, the second coordinate, and the third coordinate (S1020); and displaying the second user interface that guides movement of the electronic device based on the determined movement direction and angle (S1030).
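The geometric guidance of steps S1010 to S1030 can be sketched in two dimensions. This illustrative snippet simplifies the shared coordinate system of the disclosure to a 2D plane (the function name `guide_move` and the return convention are assumptions): it derives the remaining distance to the reference position and the heading angle from the reference position toward the target.

```python
import math

def guide_move(current, ref, target):
    """current, ref, target: (x, y) coordinates in one shared frame.

    Returns (distance from the current position to the reference
    position, heading angle in degrees from the reference position
    toward the target object).
    """
    # Distance still to travel to reach the reference position (S240).
    dist = math.hypot(ref[0] - current[0], ref[1] - current[1])
    # Direction the device should face to capture the target (S1020).
    angle = math.degrees(math.atan2(target[1] - ref[1],
                                    target[0] - ref[0]))
    return dist, angle
```

A full implementation would work with the device and Earth coordinate systems and Euler angles described with reference to FIGs. 11a to 11c.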
- The method of any one of claims 1 to 7, wherein displaying the second user interface that guides capture of the target object (S250) comprises: scanning the surrounding environment through a spherical camera of the electronic device based on the real-time position of the electronic device being adjacent to the position of the electronic device at the reference timestamp (S1610).
- The method of claim 9, wherein scanning the surrounding environment through the spherical camera (S1610) comprises: comparing a plurality of objects within a frame at the reference timestamp with a plurality of objects within a frame scanned through the spherical camera to obtain information regarding the matching objects (S1710); and terminating the scanning of the spherical camera based on the number of matching objects being greater than or equal to a preset threshold number according to the obtained information regarding the matching objects (S1730), the method further comprising: predicting the position of the target object based on the last frame scanned through the spherical camera and the obtained information regarding the matching objects (S1740); determining a movement direction for capturing the target object based on the predicted position of the target object (S1750); and providing a fourth user interface that guides movement of the electronic device based on the determined direction (S1760).
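The scan-termination condition of steps S1710 and S1730 amounts to counting matches between the objects in the reference frame and those in each scanned frame. A hedged sketch, assuming objects are represented as sets of labels (the function name `scan_until_match`, the set representation, and the default threshold are all illustrative assumptions):

```python
def scan_until_match(reference_objects, scanned_frames, threshold=3):
    """reference_objects: set of object labels in the frame at the
    reference timestamp. scanned_frames: iterable of label sets, one
    per frame scanned by the spherical camera.

    Stops at the first frame sharing at least `threshold` objects with
    the reference frame (S1730); returns (frame index, matched labels),
    or (None, empty set) if the scan never meets the threshold.
    """
    for i, frame in enumerate(scanned_frames):
        matched = reference_objects & frame
        if len(matched) >= threshold:
            return i, matched
    return None, set()
```

The matched labels returned here correspond to the "information regarding the matching objects" used in steps S1740 to S1760 to predict the target position.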
- The method of any one of claims 1 to 10, further comprising: obtaining an image of the object included in the request through an artificial intelligence model, based on the object included in the request not being detected in the stored video (S1820); and determining whether the target object is captured based on whether the objects detected in the video match the image obtained through the artificial intelligence model (S1830), wherein detecting the target object corresponding to the object included in the request in the video (S220-1) comprises: detecting in the video a large object that includes the object included in the request (S1830); and determining, as the target object, a surrounding object detected in the video that is included in the large object (S1840).
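The large-object fallback of steps S1830 to S1840 can be read as a bounding-box containment test. The sketch below uses a hypothetical helper `contained_targets` (not a name from the disclosure) with boxes given as `(x1, y1, x2, y2)` corners: it returns the detected objects whose boxes lie inside the larger object's box.

```python
def contained_targets(large_bbox, detections):
    """large_bbox: (x1, y1, x2, y2) box of the enclosing large object.
    detections: list of (label, (x1, y1, x2, y2)) detected objects.

    Returns the labels of objects fully contained in the large box,
    i.e. the surrounding-object candidates of step S1840.
    """
    lx1, ly1, lx2, ly2 = large_bbox
    return [label for (label, (x1, y1, x2, y2)) in detections
            if x1 >= lx1 and y1 >= ly1 and x2 <= lx2 and y2 <= ly2]
```

A real pipeline might relax strict containment to an intersection-over-area ratio to tolerate slightly loose detector boxes.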
- The method of any one of claims 1 to 11, wherein displaying the first user interface that guides movement to the position of the electronic device at the reference timestamp (S240) comprises: determining at least one section of the determined movement path in which the positional movement amount of the electronic device is less than or equal to a preset threshold (S1910); determining an average position in each of the determined at least one section (S1920); and replacing the path in each of the determined at least one section with the point of the corresponding average position (S1930).
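The path simplification of steps S1910 to S1930 collapses near-stationary sections of the recorded path into single averaged points. A minimal 2D sketch, assuming the path is a list of `(x, y)` positions and a movement threshold in the same units (both the representation and the default threshold are assumptions, not values from the disclosure):

```python
def _collapse(run):
    # Replace a run of near-stationary points with its average (S1920).
    if len(run) == 1:
        return run[0]
    return (sum(p[0] for p in run) / len(run),
            sum(p[1] for p in run) / len(run))

def smooth_path(path, threshold=0.1):
    """path: list of (x, y) positions sampled from motion data.

    Sections where each step is <= threshold (S1910) are replaced by
    a single averaged point (S1930); other points pass through.
    """
    if not path:
        return []
    out, run = [], [path[0]]
    for prev, cur in zip(path, path[1:]):
        step = ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        if step <= threshold:
            run.append(cur)      # still inside a low-movement section
        else:
            out.append(_collapse(run))
            run = [cur]
    out.append(_collapse(run))   # flush the final section
    return out
```

This removes jitter from moments when the user stood still during the call, so the first user interface shows a cleaner path.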
- An electronic device (1000) that guides the location of an object, comprising: a communication interface (110); a camera (120); a display (130); at least one processor (150) including processing circuitry; and a memory (140) storing a plurality of instructions, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000) to: store a video obtained by recording a video call with an external device and motion data of the electronic device (1000); upon receiving a request from the external device to show a specific object again, detect in the video a target object corresponding to the object included in the request; determine a reference timestamp at the point in time when the target object is detected in the video; determine, based on the motion data, a movement path from the current position of the electronic device (1000) to the position of the electronic device (1000) at the reference timestamp, and display a first user interface that guides movement along the movement path to the position of the electronic device (1000) at the reference timestamp; and display a second user interface that guides capture of the target object.
- The electronic device (1000) of claim 13, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000), in detecting the target object corresponding to the object included in the request in the video, to: detect in the video at least one candidate object corresponding to the object included in the request, and obtain a confidence score of each candidate object at each of at least one timestamp at which the at least one candidate object is detected; identify, for each of the at least one candidate object, the image frame at the timestamp at which the confidence score is maximum; transmit the identified image frame of each of the at least one candidate object to the external device; and upon receiving from the external device a user input selecting a candidate object from among the at least one candidate object, determine the selected candidate object as the target object based on the user input.
- The electronic device (1000) of any one of claims 13 to 14, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000), in determining the reference timestamp at the point in time when the target object is detected in the video, to: obtain a confidence score of the target object at each of a plurality of candidate timestamps at which the target object is detected in the video; identify at least one candidate timestamp, among the plurality of candidate timestamps, at which the confidence score of the target object is greater than or equal to a preset threshold score; and determine, as the reference timestamp, the candidate timestamp among the at least one candidate timestamp at which the size of the corresponding target object is maximum.
- The electronic device (1000) of any one of claims 13 to 14, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000), in determining the reference timestamp at the point in time when the target object is detected in the video, to: obtain a confidence score of the target object at each of a plurality of candidate timestamps at which the target object is detected in the video; identify at least one candidate timestamp, among the plurality of candidate timestamps, at which the confidence score of the target object is greater than or equal to a preset threshold score; and determine, as the reference timestamp, the candidate timestamp among the at least one candidate timestamp at which the size of the corresponding target object is smallest.
- The electronic device (1000) of any one of claims 13 to 14, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000), in determining the reference timestamp at the point in time when the target object is detected in the video, to determine, as the reference timestamp, the most recent candidate timestamp among a plurality of candidate timestamps at which the target object is detected in the video.
- The electronic device (1000) of any one of claims 13 to 17, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000) to determine whether the requested object is captured based on whether the objects in the captured video transmitted in real time match the image of the target object at the reference timestamp.
- The electronic device (1000) of any one of claims 13 to 18, wherein by executing the plurality of instructions individually or collectively, the at least one processor (150) causes the electronic device (1000) to: determine, within the same coordinate system, a first coordinate of the current position of the electronic device (1000), a second coordinate of the position of the electronic device (1000) at the reference timestamp, and a third coordinate of the position of the target object; determine a movement direction and angle for capturing the target object based on the first coordinate, the second coordinate, and the third coordinate; and display the second user interface guiding movement of the electronic device (1000) based on the determined movement direction and angle.
- A computer-readable recording medium having recorded thereon a program for executing, on a computer, the method of any one of claims 13 to 19.
Description
Electronic device for guiding the position of an object and method for operating the same

The present disclosure relates to an electronic device for guiding the location of an object, a method of operating the electronic device, and a computer-readable recording medium storing a computer program for operating the electronic device. Recently, owing to the development of various electronic and communication technologies, video call services have become as commonplace as voice call services, and the communication devices that support video call services are also becoming more diverse. For example, video call services are provided not only through mobile phones and smartphones but also through personal computers, laptops, and tablet PCs. By utilizing the camera functions of various electronic devices, users can conveniently capture necessary objects and share the captured objects with the other party.

- FIG. 1 is a conceptual diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure guiding the location of an object.
- FIG. 2 is a flowchart illustrating an operation method in which an electronic device according to one embodiment of the present disclosure guides the location of an object.
- FIG. 3 is a block diagram schematically showing the configuration of an electronic device according to one embodiment of the present disclosure.
- FIG. 4 is a flowchart illustrating a method for an electronic device according to one embodiment of the present disclosure to determine a target object corresponding to a requested object.
- FIG. 5a is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure receiving a request from an external device to show a specific object again.
- FIG. 5b is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure determining images of candidate objects corresponding to a requested object.
- FIG. 5c is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure transmitting images of candidate objects to an external device and receiving, from the external device, a user input selecting a candidate object to determine a target object.
- FIG. 6 is a flowchart illustrating a method for an electronic device according to one embodiment of the present disclosure to determine a reference timestamp at which a candidate object is detected.
- FIG. 7a is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure obtaining a confidence score at all timestamps at which a target object is detected.
- FIG. 7b is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure determining a reference timestamp among timestamps based on an obtained confidence score.
- FIG. 8 is a flowchart illustrating a method for an electronic device according to one embodiment of the present disclosure to guide movement to a requested object.
- FIG. 9a is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure obtaining a movement path from a position at a reference timestamp to a current position.
- FIG. 9b is a diagram showing an example of a movement path, acquired by an electronic device according to one embodiment of the present disclosure, from a position at a reference timestamp to a current position.
- FIG. 9c is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure providing a user interface that guides movement to a requested object.
- FIG. 9d is a diagram illustrating the operation of an electronic device according to one embodiment of the present disclosure providing a popup that indicates proximity to a requested object.
- FIG. 10 is a flowchart illustrating a method for an electronic device according to one embodiment of the present disclosure to provide a user interface that guides movement so that a requested object is photographed.
- FIG. 11a is a diagram illustrating the device coordinate system.
- FIG. 11b is a diagram illustrating the Earth coordinate system.
- FIG. 11c is a diagram illustrating Euler angles.
- FIG. 12 is a flowchart illustrating a method for an electronic device according to one embodiment of the present disclosure to acquire the position coordinates of a target object.
- FIG. 13 is a diagram illustrating the operation of a user interface that guides movement for an electronic device according to one embodiment of the present disclosure to photograph a requested object.
- FIG. 14a is a flowchart illustrating a method for an electronic device according to one embodiment of the present disclosure to determine a reference timestamp at which a candidate object is detected.
- FIG. 14b is a diagram illustrating the operation of an electronic device according to one embodiment of