To set some context, an LRU cache holds at most n items and keeps them in memory as long as possible to maximize the chance of a cache hit. In other words, you budget a fixed amount of memory for caching and try to maximize efficiency. TLRU is the same, but items additionally have an explicit TTL and can therefore age out. So expiry is only really applicable to TLRU.
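To make the distinction concrete, here is a minimal illustrative sketch in Python (the library under discussion is .NET; the class name `TinyTlru` and its shape are invented for illustration, not BitFaster.Caching's implementation): a bounded map with LRU eviction, where reads also check a per-item TTL.

```python
import time
from collections import OrderedDict

class TinyTlru:
    """Illustrative TLRU sketch (not the library's implementation):
    bounded capacity with LRU eviction, plus a per-item TTL so entries age out."""

    def __init__(self, capacity, ttl_seconds, clock=time.monotonic):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self.clock = clock            # injectable so tests can fake time
        self._items = OrderedDict()   # key -> (value, written_at), LRU first

    def get(self, key):
        entry = self._items.get(key)
        if entry is None:
            return None
        value, written = entry
        if self.clock() - written > self.ttl:
            del self._items[key]      # expired: treat as a miss
            return None
        self._items.move_to_end(key)  # mark as most recently used
        return value

    def put(self, key, value):
        if key in self._items:
            self._items.move_to_end(key)
        elif len(self._items) >= self.capacity:
            self._items.popitem(last=False)  # evict least recently used
        self._items[key] = (value, self.clock())
```

With an infinite TTL this degenerates to plain LRU, which is why expiry is only meaningful for the TLRU variant.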
I had considered adding a Trim method similar to MemoryCache.Trim(int percent) that could be hooked up to a background thread or some other event in the program like inactivity or memory pressure. It would work slightly differently for each cache:
- LRU.Trim(int itemCount): trim the n least recently used items.
- TLRU.Trim(int itemCount): trim all e expired items, then the remaining n-e least recently used items, if any.
- TLRU.Expire(): would then be implemented as TLRU.Trim(0), removing only expired items.

You would then be able to explicitly choose whether to spin up a thread or hook into some other existing process, and how aggressively to trim.
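The proposed TLRU.Trim semantics can be sketched as a standalone function (a hedged Python illustration of the idea described above, not the library's API; the `items` layout is assumed): expired entries are removed first, then least-recently-used survivors until itemCount total removals are reached.

```python
from collections import OrderedDict

def trim(items, item_count, ttl, now):
    """Sketch of the proposed TLRU.Trim(itemCount).

    `items` is an OrderedDict of key -> (value, written_at), ordered from
    least to most recently used. Removes all expired entries, then enough
    LRU entries to reach item_count total removals. Returns the number
    of entries removed."""
    removed = 0
    # First pass: drop every expired entry, regardless of item_count.
    for key in [k for k, (_, written) in items.items() if now - written > ttl]:
        del items[key]
        removed += 1
    # Second pass: if fewer than item_count were expired, fall back to
    # evicting from the least-recently-used end.
    while removed < item_count and items:
        items.popitem(last=False)
        removed += 1
    return removed
```

Because expired removals count toward itemCount, calling `trim(items, 0, ...)` removes only expired entries, which is exactly the proposed Expire().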
The main reason Trim() doesn't exist is because I have never needed it in practice. Do you have a situation in mind where this is a deal breaker and LRU eviction doesn't work?
from bitfaster.caching.
> Do you have a situation in mind where this is a deal breaker

I'm just looking for a complete replacement for MemoryCache, which is slower than yours.