
Add cache mechanism to line count estimation to reduce backend load #44978

Closed
hawkingrei opened this issue Jun 27, 2023 · 0 comments · Fixed by #44979

hawkingrei commented Jun 27, 2023

Enhancement

Currently, every call to the line count estimation feature sends a request to the backend, which puts avoidable load on it. To address this, we can add a cache mechanism that stores line count estimation results locally, reducing the number of requests sent to the backend.

Solution

We can add a local cache that stores line count estimation results. When line count estimation is called, first check the local cache for a corresponding result; if one exists, return it directly from the cache. Otherwise, send a request to the backend and store the result in the local cache.
To avoid serving stale data, each cache entry is given an expiration time; once that time passes, the entry expires and the next call sends a fresh request to the backend for the latest line count estimate.

Implementation Details

Add a cache in the frontend code to store line count estimation results;
On each call, check the cache for a corresponding result;
If one exists, return it directly from the cache;
If not, request the result from the backend and store it in the cache;
Give each cache entry an expiration time; once it expires, the next call sends a new request to the backend for the latest line count estimate.

Expected Outcome

By adding a cache mechanism, we reduce the number of requests sent to the backend, lowering backend load and improving system performance and stability. It also shortens user-facing wait time, improving the user experience.
