A concurrency toolbox for C++11, including a cached thread pool executor, a shared timed mutex, a fair semaphore, and several other utilities.
A 4-way set-associative cache with 1024 sets for a 32-bit, byte-addressable computer.
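For reference, the address split such a cache implies can be sketched as follows. The 64-byte line size is an assumption for illustration (the repo may use a different one); with 1024 sets it leaves a 16-bit tag on a 32-bit address.

```cpp
#include <cstdint>

// Address breakdown for a 4-way set-associative cache with 1024 sets
// on a 32-bit byte-addressable machine. The 64-byte line size is an
// assumed parameter: 6 offset bits + 10 index bits + 16 tag bits.
constexpr uint32_t kSets       = 1024;
constexpr uint32_t kOffsetBits = 6;    // log2(64-byte line)
constexpr uint32_t kIndexBits  = 10;   // log2(kSets)

constexpr uint32_t setIndex(uint32_t addr) {
    return (addr >> kOffsetBits) & (kSets - 1);
}

constexpr uint32_t tag(uint32_t addr) {
    return addr >> (kOffsetBits + kIndexBits);  // remaining 16 bits
}
```

The associativity (4 ways) does not appear in the address split; it only determines how many tags are compared within the selected set.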
Implementation of a modern last-level cache (LLC) based on "Perceptron Learning for Reuse Prediction", a neural-network-inspired idea in which the predictor is trained on a smaller, independent sampler cache using a set of features.
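A hedged sketch of the core idea: several tables of small saturating weights are indexed by hashed features, the selected weights are summed, and the sum is compared with a threshold to predict reuse; training from sampler-cache outcomes nudges the weights up or down. Table count, sizes, and thresholds here are illustrative, not the paper's or the repo's.

```cpp
#include <array>
#include <cstdint>

// Illustrative perceptron reuse predictor: feature tables of saturating
// weights, summed and compared against a threshold. Parameters assumed.
struct PerceptronPredictor {
    static constexpr int kTables = 3, kEntries = 256;
    static constexpr int kThreshold = 0, kMaxW = 31, kMinW = -32;
    std::array<std::array<int, kEntries>, kTables> w{};  // zero-initialized

    int sum(const std::array<uint32_t, kTables>& feat) const {
        int s = 0;
        for (int t = 0; t < kTables; ++t)
            s += w[t][feat[t] % kEntries];   // one weight per feature table
        return s;
    }

    bool predictReuse(const std::array<uint32_t, kTables>& feat) const {
        return sum(feat) >= kThreshold;
    }

    // Train on the outcome observed in the sampler cache: push weights
    // up when the line was reused, down when it was not (saturating).
    void train(const std::array<uint32_t, kTables>& feat, bool reused) {
        for (int t = 0; t < kTables; ++t) {
            int& wi = w[t][feat[t] % kEntries];
            if (reused  && wi < kMaxW) ++wi;
            if (!reused && wi > kMinW) --wi;
        }
    }
};
```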
Implementation of the Cortex-A53 memory system using a virtual memory simulator, revealing key steps such as instruction fetch, address generation and computation, tag searches in caches and TLBs, and virtual-to-physical address translation.
Using Belady's algorithm for improved cache replacement
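Belady's (OPT) policy evicts the resident block whose next use lies farthest in the future, which requires knowing the full future reference string; it is unrealizable online but gives the miss-rate lower bound other policies are measured against. A minimal sketch of the victim choice (function and parameter names are illustrative):

```cpp
#include <cstddef>
#include <unordered_set>
#include <vector>

// Belady's OPT victim selection: among resident blocks, pick the one
// reused farthest in the future, or one never referenced again.
int chooseVictim(const std::unordered_set<int>& resident,
                 const std::vector<int>& future, std::size_t pos) {
    int victim = -1;
    std::size_t farthest = 0;
    for (int block : resident) {
        std::size_t next = future.size();          // sentinel: never reused
        for (std::size_t i = pos; i < future.size(); ++i)
            if (future[i] == block) { next = i; break; }
        if (next == future.size()) return block;   // ideal victim
        if (next >= farthest) { farthest = next; victim = block; }
    }
    return victim;
}
```

A real simulator would precompute next-use positions instead of rescanning the trace for each candidate.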
💫 A feature complete LRU cache implementation in C++
A native Node module supporting an LRU (least-recently-used) cache.
Nachos (Not Another Completely Heuristic Operating System) is an educational operating system.
The simulator can implement one or two levels of cache with an LRU / FIFO / Optimal replacement policy and a non-inclusive / inclusive cache eviction policy.
In this repository, we designed an LRU (Least Recently Used) cache.
A naive LRU caching service implemented in C++
INFORMATION_SCHEMA plugin to aggregate unreferenced page information in the InnoDB Old Sublist
Simple and reliable LRU cache for c++ based on hashmap and linkedlist
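The hashmap-plus-linked-list design these repos describe can be sketched as follows: the list holds entries in recency order (most recent at the front) and the map points each key at its list node, giving O(1) lookup, move-to-front, and eviction. Interface names are illustrative, not taken from any repo above.

```cpp
#include <cstddef>
#include <list>
#include <unordered_map>
#include <utility>

// Classic LRU cache: doubly linked list for recency order plus a
// hash map from key to list node. All operations are O(1) expected.
template <typename K, typename V>
class LruCache {
    std::size_t cap_;
    std::list<std::pair<K, V>> items_;  // MRU at front, LRU at back
    std::unordered_map<K, typename std::list<std::pair<K, V>>::iterator> pos_;

public:
    explicit LruCache(std::size_t cap) : cap_(cap) {}

    void put(const K& key, const V& value) {
        auto it = pos_.find(key);
        if (it != pos_.end()) {                    // update in place + touch
            it->second->second = value;
            items_.splice(items_.begin(), items_, it->second);
            return;
        }
        if (items_.size() == cap_) {               // evict LRU from the back
            pos_.erase(items_.back().first);
            items_.pop_back();
        }
        items_.emplace_front(key, value);
        pos_[key] = items_.begin();
    }

    const V* get(const K& key) {                   // nullptr on miss
        auto it = pos_.find(key);
        if (it == pos_.end()) return nullptr;
        items_.splice(items_.begin(), items_, it->second);  // touch
        return &it->second->second;
    }
};
```

`std::list::splice` moves a node without invalidating iterators, which is what lets the map keep stable handles into the list.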
A project for Advanced Operating System(CS604) that implements ARC cache replacement policy.
A header-only C++17 LRU cache template class that lets you define the key, the value, and optionally the map type. Uses a doubly linked list and a std::unordered_map-style container to provide fast insert, delete, and update. No dependencies other than the C++ standard library. The goal was to create a fast LRUCache header-only library and to avoid an…