KR-20260067847-A - DATA PRELOAD METHOD AND AN ELECTRONIC DEVICE PERFORMING THE METHOD

KR 20260067847 A

Abstract

A method for preloading data is disclosed. The method may include the steps of: identifying a current process ID and a virtual address to which the data is to be loaded based on the identification of a data load request; converting the virtual address into a physical address based on a page table; if the current process ID is the same as a previous process ID, predicting the physical address of the next data to be used in the process corresponding to the current process ID based on the physical address; and preloading the next data into the main memory of the electronic device based on the predicted physical address.
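The abstract's flow can be sketched in code. This is a hypothetical, simplified illustration of the claimed steps — identify the process ID and virtual address, translate to a physical address via a page table, and, when the process ID matches the previous one, predict and preload the next physical page. The names (`PreloadEngine`, `predict_next_page`) and the flat dictionary page table are illustrative assumptions, not taken from the patent.

```python
PAGE_SIZE = 4096  # assumed 4 KiB pages

class PreloadEngine:
    def __init__(self, page_table, predict_next_page):
        self.page_table = page_table            # {(pid, vpn): pfn}
        self.predict_next_page = predict_next_page
        self.prev_pid = None
        self.preloaded = set()                  # physical pages already in main memory

    def on_load_request(self, pid, virtual_addr):
        # Identify the virtual page number and translate it via the page table.
        vpn, offset = divmod(virtual_addr, PAGE_SIZE)
        pfn = self.page_table[(pid, vpn)]
        phys_addr = pfn * PAGE_SIZE + offset
        # Only predict when the same process issued the previous load.
        if pid == self.prev_pid:
            next_pfn = self.predict_next_page(pfn)
            if next_pfn not in self.preloaded:
                self.preloaded.add(next_pfn)    # stand-in for a storage-to-DRAM copy
        self.prev_pid = pid
        return phys_addr
```

A trivial next-page predictor (e.g. `lambda pfn: pfn + 1`) stands in here for the MLP-based prediction model described in the claims.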

Inventors

  • 김주신 (Kim Ju-sin)

Assignees

  • Samsung Electronics Co., Ltd. (삼성전자주식회사)

Dates

Publication Date
2026-05-13
Application Date
2024-11-06

Claims (20)

  1. An electronic device (2000) for performing a data preload method, comprising: a system-on-chip (2400) including at least one processor; a storage (2300) storing instructions; and a main memory (2200) onto which the instructions are loaded, wherein, by executing the instructions with the at least one processor in the system-on-chip (2400), the electronic device (2000) is configured to: identify, based on identification of a data load request, a current process ID and a virtual address to which data is to be loaded; convert the virtual address into a physical address based on a page table stored in the storage; when the current process ID is the same as a previous process ID, predict, based on the physical address, a physical address of next data to be used in a process corresponding to the current process ID; and preload the next data into the main memory (2200) based on the predicted physical address.
  2. The electronic device of claim 1, wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to: convert a virtual page number extracted from the virtual address into a physical page number based on the page table; input the physical page number into a prediction model to predict a physical page number of the next data; and convert the predicted physical page number into a physical address to obtain the predicted physical address.
  3. The electronic device of any one of claims 1 to 2, wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to: identify whether the next data is in the main memory; and when the next data is not in the main memory, preload the next data from the storage into the main memory based on the predicted physical address.
  4. The electronic device of claim 2, wherein the prediction model comprises a multi-layer perceptron (MLP) and is trained on training data in which a physical page number of current data is used as an input and a physical page number of the next data is used as a label.
  5. The electronic device of claim 4, wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to: identify, for processes executed in an interleaved manner according to scheduling, physical addresses of data loaded within the same process based on the process ID; and generate the training data based on a physical address pattern of the physical addresses of the same process.
  6. The electronic device of claim 5, wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to: collect a usage history of the electronic device by a user; generate a usage pattern for one or more applications based on the usage history; and generate the physical address pattern per process based on the usage pattern.
  7. The electronic device of claim 5, further comprising a communication interface, wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to: transmit the training data to a server through the communication interface; and receive, from the server through the communication interface, parameters of the prediction model trained using the training data.
  8. The electronic device of claim 7, wherein the prediction model, periodically updated based on a defined period, is received from the server.
  9. The electronic device of claim 2, wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to execute one or more processes by repeatedly predicting physical addresses for subsequent data sequences.
  10. The electronic device of claim 2, wherein the system-on-chip includes an NPU, and wherein, by executing the instructions with the at least one processor in the system-on-chip, the electronic device is configured to predict the physical page number of the next data with the prediction model using the NPU.
  11. A data preload method performed by an electronic device, the method comprising: identifying (S310), based on identification of a data load request, a current process ID and a virtual address to which data is to be loaded; converting (S320) the virtual address into a physical address based on a page table; when the current process ID is the same as a previous process ID, predicting (S330), based on the physical address, a physical address of next data to be used in a process corresponding to the current process ID; and preloading (S340) the next data into a main memory of the electronic device based on the predicted physical address.
  12. The method of claim 11, wherein converting the virtual address into the physical address comprises converting a virtual page number extracted from the virtual address into a physical page number, and wherein predicting the physical address of the next data comprises: inputting the physical page number into a prediction model to predict a physical page number of the next data; and converting the predicted physical page number into a physical address to obtain the predicted physical address.
  13. The method of any one of claims 11 to 12, wherein preloading the next data comprises: identifying whether the next data is in the main memory; and when the next data is not in the main memory, preloading the next data from storage into the main memory based on the predicted physical address.
  14. The method of claim 12, wherein the prediction model comprises a multi-layer perceptron (MLP) and is trained on training data in which a physical page number of current data is used as an input and a physical page number of the next data is used as a label.
  15. The method of claim 14, further comprising: identifying, for processes executed in an interleaved manner according to scheduling, physical addresses of data loaded within the same process based on the process ID; and generating the training data based on a physical address pattern of the physical addresses of the same process.
  16. The method of claim 15, further comprising: collecting a usage history of the electronic device by a user; generating a usage pattern for one or more applications based on the usage history; and generating the physical address pattern per process based on the usage pattern.
  17. The method of claim 15, further comprising: transmitting the training data to a server; and receiving, from the server, parameters of the prediction model trained using the training data.
  18. The method of claim 17, wherein receiving the parameters of the prediction model from the server comprises receiving parameters of the prediction model that are periodically updated from the server based on a defined period.
  19. The method of claim 12, further comprising executing one or more processes by repeatedly predicting physical addresses for subsequent data sequences.
  20. The method of claim 12, wherein predicting the physical page number of the next data comprises predicting the physical page number of the next data with the prediction model using an NPU.
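Claims 5 and 15 describe generating training data by grouping loads from interleaved (cross-scheduled) processes by process ID, then forming (current page, next page) pairs within each process. A minimal sketch of that grouping step follows; the function name and the flat `(pid, page_number)` trace format are illustrative assumptions, not from the patent.

```python
from collections import defaultdict

def build_training_pairs(load_trace):
    """load_trace: list of (pid, physical_page_number) tuples in load order."""
    # Separate the interleaved trace into per-process page sequences.
    per_process = defaultdict(list)
    for pid, pfn in load_trace:
        per_process[pid].append(pfn)
    # Within each process, consecutive pages form (input, label) pairs
    # for the MLP: current page number -> next page number.
    pairs = []
    for pages in per_process.values():
        pairs += list(zip(pages, pages[1:]))
    return pairs
```

For example, a trace interleaving two processes yields only within-process pairs, so page jumps caused by the scheduler switching processes never pollute the labels.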

Description

Data Preload Method and Electronic Device Performing the Method

The present disclosure relates to a method for preloading data by predicting the data to be loaded based on a prediction model, and to an electronic device performing the method. Data preloading is an optimization technique used to accelerate processing in computing tasks that involve data processing. While preparing necessary data in advance can reduce processing delays, it also increases memory usage and carries the risk of loading unnecessary data. Data preloading is effective for predictable processes, but its usefulness is limited for processes that are difficult to predict due to numerous variables. Accordingly, there is a need to optimize the performance of electronic devices running unpredictable processes in various environments by efficiently predicting and preloading data.

FIG. 1 illustrates data preloading by an electronic device according to one embodiment of the present disclosure.
FIG. 2 illustrates processes executed in an electronic device according to one embodiment of the present disclosure.
FIG. 3 is a flowchart illustrating the operation of an electronic device preloading data according to one embodiment of the present disclosure.
FIG. 4 is a block diagram illustrating the configuration of an electronic device according to one embodiment of the present disclosure.
FIG. 5 is a block diagram illustrating the PPB configuration of an electronic device according to one embodiment of the present disclosure.
FIG. 6 illustrates the operation of an electronic device predicting a physical address and preloading data according to one embodiment of the present disclosure.
FIG. 7 illustrates the operation of an electronic device converting a virtual address into a physical address according to one embodiment of the present disclosure.
FIG. 8 illustrates the operation of an electronic device preloading data according to one embodiment of the present disclosure.
FIG. 9 illustrates the operation of an electronic device training a prediction model for predicting a physical address according to one embodiment of the present disclosure.
FIG. 10 illustrates the operation of an electronic device generating training data for a prediction model according to one embodiment of the present disclosure.
FIG. 11 illustrates a prediction model used by an electronic device to predict a physical address according to one embodiment of the present disclosure.
FIG. 12 illustrates the operation of an electronic device collecting usage patterns of the electronic device according to one embodiment of the present disclosure.
FIG. 13 is a block diagram illustrating the configuration of an electronic device according to one embodiment of the present disclosure.
FIG. 14 is a block diagram illustrating the configuration of an electronic device according to one embodiment of the present disclosure.

The terms used in this specification will be briefly explained, and the present disclosure will then be described in detail. In the present disclosure, the expression "at least one of a, b, or c" may refer to "a," "b," "c," "a and b," "a and c," "b and c," "all of a, b, and c," or variations thereof. The terms used in this disclosure have been selected as widely used, general terms in view of their functions within this disclosure; however, these terms may vary depending on the intent of those skilled in the art, legal precedent, the emergence of new technologies, and so on. In specific cases, terms have been selected at the applicant's discretion, and their meanings are described in detail in the relevant explanatory sections. Therefore, terms used in this disclosure should be defined not merely by their names but based on their meanings and the overall content of this disclosure. Singular expressions may include plural expressions unless the context clearly indicates otherwise. Terms used herein, including technical or scientific terms, may have the same meaning as generally understood by those skilled in the art. Terms including ordinal numbers, such as "first" or "second," may be used to describe various components, but those components should not be limited by such terms; they are used solely to distinguish one component from another. When a part of the specification is described as "comprising" a certain component, this means that, unless specifically stated otherwise, other components are not excluded and may additionally be included. Furthermore, terms such as "part" or "module" refer to a unit that processes at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software. Emb