CN-115357619-B - Data caching method and device, server and storage medium

CN115357619B

Abstract

The disclosure relates to the technical field of data caching, and in particular to a data caching method, a data caching apparatus, a server, and a storage medium. The data caching method comprises: sending a first data request message to a first database based on a first timing task, the first timing task comprising periodically requesting a specific amount of original data from the first database; obtaining the original data sent by the first database and processing it to obtain cache data; sending the cache data to a second database for caching; sending a second data request message to the second database based on a second timing task, the second timing task comprising periodically requesting a specific amount of cache data of a specific type from the second database; and obtaining the cache data sent by the second database and updating it into a local cache. The method shortens the response time for requested data, reduces direct access to the database under high concurrency, and relieves pressure on the database.
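The two timing tasks described in the abstract form a small pipeline: task one pulls raw data from the first database, processes it, and caches it by type in the second database; task two pulls typed cache data from the second database into a local cache. The sketch below illustrates that flow under stated assumptions — the databases, field names (`id`, `type`), and in-memory stand-ins are all hypothetical, since the patent names no concrete storage technologies, and in a real deployment each function would run on a scheduler rather than be called once.

```python
from collections import defaultdict, deque

# Hypothetical stand-ins for the two databases and the local cache.
first_db = [{"id": i, "type": "gift" if i % 2 else "follow"} for i in range(100)]
second_db = defaultdict(deque)   # type -> queue of cache data (one queue per type)
local_cache = defaultdict(list)  # type -> most recently fetched cache data

def first_timing_task(batch_size=10):
    """Periodically request a specific amount of raw data from the first
    database, process it, and cache it in the second database."""
    raw = first_db[:batch_size]               # "request a specific amount"
    for item in raw:                          # "process the original data":
        second_db[item["type"]].append(item)  # here, classify each piece by type

def second_timing_task(data_type="gift", batch_size=5):
    """Periodically request a specific amount of a specific type of cached
    data from the second database and update it into the local cache."""
    queue = second_db[data_type]
    local_cache[data_type] = [queue[i] for i in range(min(batch_size, len(queue)))]

first_timing_task()
second_timing_task()
print(len(local_cache["gift"]))  # 5
```

Front-end requests are then served from `local_cache`, so the high-concurrency read path never queries the databases directly.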

Inventors

  • RAN QISHAN
  • LIANG JINGFAN

Assignees

  • Beijing Dajia Internet Information Technology Co., Ltd. (北京达佳互联信息技术有限公司)

Dates

Publication Date
2026-05-12
Application Date
2022-08-23

Claims (8)

  1. A data caching method, comprising: sending a first data request message to a first database based on a first timing task, the first timing task comprising periodically requesting a specific amount of original data from the first database, wherein the specific amount is determined based on the scroll rate at which the data is displayed at the front end; acquiring the original data sent by the first database and processing the original data to obtain cache data; sending the cache data to a second database for caching, wherein the second database is configured to store different types of cache data in queues, different types of cache data corresponding to different queues, and each queue corresponding to a maximum cache threshold; if the current storage amount of a queue exceeds the maximum cache threshold corresponding to that queue, removing the data stored in the queue earliest, so that after removal the storage amount of the queue is smaller than or equal to the maximum cache threshold corresponding to the queue; sending a second data request message to the second database based on a second timing task, the second timing task comprising periodically requesting a specific amount of cache data of a specific type from the second database; and obtaining the cache data sent by the second database and updating the cache data into a local cache.
  2. The method according to claim 1, further comprising: receiving a data display request; and acquiring the data requested by the data display request from the local cache and displaying the data at the front end.
  3. The method according to claim 1, wherein processing the original data to obtain cache data comprises classifying each piece of data in the original data.
  4. A data caching apparatus, comprising: a first sending unit configured to send a first data request message to a first database based on a first timing task, the first timing task comprising periodically requesting a specific amount of original data from the first database, the specific amount being determined based on the scroll rate at which the data is displayed at the front end; a first acquisition unit configured to acquire the original data sent by the first database and process the original data to obtain cache data; a second sending unit configured to send the cache data to a second database for caching, wherein the second database is configured to store different types of cache data in queues, different types of cache data corresponding to different queues, and each queue corresponding to a maximum cache threshold; if the current storage amount of a queue exceeds the maximum cache threshold corresponding to that queue, the data stored in the queue earliest is removed, so that after removal the storage amount of the queue is smaller than or equal to the maximum cache threshold corresponding to the queue; a third sending unit configured to send a second data request message to the second database based on a second timing task, the second timing task comprising periodically requesting a specific amount of cache data of a specific type from the second database; and a second acquisition unit configured to acquire the cache data sent by the second database and update the cache data into a local cache.
  5. The apparatus according to claim 4, further comprising: a receiving unit configured to receive a data display request; and a display unit configured to acquire the data requested by the data display request from the local cache and display the data at the front end.
  6. The apparatus according to claim 4, wherein the first acquisition unit is configured to classify each piece of data in the original data.
  7. A server, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the method according to any one of claims 1-3.
  8. A storage medium storing instructions which, when executed by a processor of a server, enable the server to perform the method according to any one of claims 1-3.
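Claim 1's per-queue eviction rule — when a queue's current storage exceeds its maximum cache threshold, remove the earliest-stored data until the queue is back within the threshold — behaves like a capped FIFO queue. A minimal sketch, with `cache_with_threshold` as a hypothetical helper name:

```python
from collections import deque

def cache_with_threshold(queue: deque, item, max_threshold: int) -> None:
    """Append new cache data; if the queue exceeds its maximum cache
    threshold, evict the earliest-stored entries (claim 1's removal rule)."""
    queue.append(item)
    while len(queue) > max_threshold:
        queue.popleft()  # remove the data stored in the queue earliest

q = deque()
for i in range(7):
    cache_with_threshold(q, i, max_threshold=5)
print(list(q))  # [2, 3, 4, 5, 6]
```

In Python, `deque(maxlen=5)` implements the same oldest-first eviction automatically; the explicit loop above just makes the threshold check visible.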

Description

Data caching method and device, server and storage medium

Technical Field

The disclosure relates to the technical field of data caching, and in particular to a data caching method, a data caching apparatus, a server, and a storage medium.

Background

A ticker animation is an animation effect, realized through programming, in which messages are played in a scrolling fashion; the data displayed by a ticker animation is ticker data. In the related art, ticker data is generally stored in a database. When the detailed information of a piece of ticker data is to be displayed, the original data must first be retrieved from the database, then processed according to the settings of the front-end page, and finally displayed on that page. However, querying the database to acquire ticker data has a long response time, and frequent queries in a high-concurrency scenario put excessive pressure on the database, reducing its availability and in turn the performance of the whole system.

Disclosure of Invention

The disclosure provides a data caching method, a data caching apparatus, a server, and a storage medium, so as to at least solve the problems of long response time for requested data and heavy database pressure in data caching methods in the related art.
The technical scheme of the present disclosure is as follows.

According to a first aspect of an embodiment of the present disclosure, there is provided a data caching method, comprising: sending a first data request message to a first database based on a first timing task, the first timing task comprising periodically requesting a specific amount of original data from the first database; acquiring the original data sent by the first database and processing the original data to obtain cache data; sending the cache data to a second database for caching; sending a second data request message to the second database based on a second timing task, the second timing task comprising periodically requesting a specific amount of cache data of a specific type from the second database; and obtaining the cache data sent by the second database and updating the cache data into a local cache.

Optionally, the method further comprises: receiving a data display request; and acquiring the data requested by the data display request from the local cache and displaying the data at the front end.

Optionally, processing the original data to obtain cache data comprises classifying each piece of data in the original data.

Optionally, the second database is configured to store different types of cache data in queues, different types of cache data corresponding to different queues.

Optionally, each queue corresponds to a maximum cache threshold, and the second database is further configured to: if the current storage amount of a queue exceeds the maximum cache threshold corresponding to that queue, remove the data stored in the queue earliest, so that after removal the storage amount of the queue is smaller than or equal to the maximum cache threshold.

Optionally, the specific amount is determined based on the scroll rate at which the data is displayed at the front end.
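The disclosure states that the specific amount fetched per timing cycle is determined by the scroll rate of the front-end display, but gives no formula. One plausible sizing rule — purely an illustrative assumption, with `batch_size_from_scroll_rate` and the safety `margin` as hypothetical names — is to fetch enough items to cover one fetch interval of scrolling, plus headroom:

```python
import math

def batch_size_from_scroll_rate(items_per_second: float,
                                fetch_interval_s: float,
                                margin: float = 1.2) -> int:
    """Hypothetical sizing rule: request enough items per timing-task run
    to cover one fetch interval of front-end scrolling, plus a margin so
    the ticker never runs dry between runs."""
    return math.ceil(items_per_second * fetch_interval_s * margin)

# A ticker scrolling 2 items/s, refreshed every 10 s, needs ~24 items per fetch.
print(batch_size_from_scroll_rate(2.0, 10.0))  # 24
```

Tying the batch size to the scroll rate keeps the caches small while ensuring each layer holds at least one display interval's worth of data.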
According to a second aspect of an embodiment of the present disclosure, there is provided a data caching apparatus, comprising: a first sending unit configured to send a first data request message to a first database based on a first timing task, the first timing task comprising periodically requesting a specific amount of original data from the first database; a first acquisition unit configured to acquire the original data sent by the first database and process the original data to obtain cache data; a second sending unit configured to send the cache data to a second database for caching; a third sending unit configured to send a second data request message to the second database based on a second timing task, the second timing task comprising periodically requesting a specific amount of cache data of a specific type from the second database; and a second acquisition unit configured to acquire the cache data sent by the second database and update the cache data into a local cache.

Optionally, the apparatus further comprises: a receiving unit configured to receive a data display request; and a display unit configured to acquire the data requested by the data display request from the local cache and display the data at the front end.

Optionally, the first acquisition unit is configured to classify each piece of data in the original data.

According to a third aspect of the present disclosure, there is provided a server comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the data caching method according to the first aspect.
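The optional display path — receive a data display request, then serve it from the local cache — is where the latency and database-pressure benefits materialize, since the read path never touches either database. A minimal sketch, with `handle_display_request` as a hypothetical handler name:

```python
def handle_display_request(local_cache: dict, data_type: str) -> list:
    """Serve a front-end display request directly from the local cache;
    an empty list means the second timing task has not yet populated
    cache data of the requested type."""
    return local_cache.get(data_type, [])

cache = {"gift": [{"id": 1}, {"id": 3}]}
print(handle_display_request(cache, "gift"))    # [{'id': 1}, {'id': 3}]
print(handle_display_request(cache, "follow"))  # []
```

Because the local cache is refreshed asynchronously by the second timing task, display requests return immediately even under high concurrency, at the cost of data being at most one refresh interval stale.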