
@isaacs/ttlcache

The time-based use-recency-unaware cousin of lru-cache

Usage

Essentially, this is the same API as lru-cache, but it does not do LRU tracking, and is bound primarily by time, rather than space. Since entries are not purged based on recency of use, it can save a lot of extra work managing linked lists, mapping keys to pointers, and so on.

TTLs are millisecond granularity.

If a capacity limit is set, then the soonest-expiring items are purged first, to bring it down to the size limit.

Iteration is in order from soonest expiring until latest expiring.

If multiple items are expiring in the same ms, then the soonest-added items are considered "older" for purposes of iterating and purging down to capacity.

A TTL must be set for every entry; a default can be provided in the constructor.

Custom size calculation is not supported. Max capacity is simply the count of items in the cache.

const TTLCache = require('@isaacs/ttlcache')
const cache = new TTLCache({ max: 10000, ttl: 1000 })

// set some value
cache.set(1, 2)

// 999 ms later
cache.has(1) // returns true
cache.get(1) // returns 2

// 1000 ms later
cache.get(1) // returns undefined
cache.has(1) // returns false

Caveat Regarding Timers and Graceful Exits

On Node.js, this module uses the Timeout.unref() method to prevent its internal setTimeout calls from keeping the process running indefinitely. However, on other systems such as Deno, where the setTimeout method does not return an object with an unref() method, the process will stay open as long as any unexpired entry exists in the cache.

You may call cache.cancelTimer() to clear the timeout and allow the process to exit normally. Be advised that canceling the timer in this way will of course prevent anything from expiring.

API

const TTLCache = require('@isaacs/ttlcache') or import TTLCache from '@isaacs/ttlcache'

Default export is the TTLCache class.

new TTLCache({ ttl, max = Infinity, updateAgeOnGet = false, checkAgeOnGet = false, noUpdateTTL = false, dispose, noDisposeOnSet = false })

Create a new TTLCache object.

  • max The max number of items to keep in the cache. Must be a positive integer or Infinity; defaults to Infinity (ie, limited only by TTL, not by item count).

  • ttl The max time in ms to store items. Overridable on the set() method. Must be a positive integer or Infinity (see note below about immortality hazards). If undefined in constructor, then a TTL must be provided in each set() call.

  • updateAgeOnGet Should the age of an item be updated when it is retrieved? Defaults to false. Overridable on the get() method.

  • checkAgeOnGet Check the TTL whenever an item is retrieved with get(). If the item is past its ttl, but the timer has not yet fired, then delete it and return undefined. By default, the cache will return a value if it has one, even if it is technically beyond its TTL.

  • noUpdateTTL Should setting a new value for an existing key leave the TTL unchanged? Defaults to false. Overridable on the set() method. (Note that TTL is always updated if the item is expired, since that is treated as a new set() and the old item is no longer relevant.)

  • dispose Method called with (value, key, reason) when an item is removed from the cache. Called once the item is fully removed from the cache. It is safe to re-add it at this point, but note that adding when reason is 'set' can result in infinite recursion if noDisposeOnSet is not specified.

    Disposal reasons:

    • 'stale' TTL expired.
    • 'set' Overwritten with a new different value.
    • 'evict' Removed from the cache to stay within capacity limit.
    • 'delete' Explicitly deleted with cache.delete() or cache.clear()
  • noDisposeOnSet Do not call dispose() method when overwriting a key with a new value. Defaults to false. Overridable on set() method.

When used as an iterator, like for (const [key, value] of cache) or [...cache], the cache yields the same results as the entries() method.

cache.size

The number of items in the cache.

cache.set(key, value, { ttl, noUpdateTTL, noDisposeOnSet } = {})

Store a value in the cache for the specified time.

ttl and noUpdateTTL optionally override defaults on the constructor.

Returns the cache object.

cache.get(key, {updateAgeOnGet, checkAgeOnGet, ttl} = {})

Get an item stored in the cache. Returns undefined if the item is not in the cache (including if it has expired and been purged).

If updateAgeOnGet is true, then re-add the item into the cache with the updated ttl value. All options default to the settings on the constructor.

If checkAgeOnGet is true, then an item will be deleted if it is found to be beyond its TTL, which can happen if the setTimeout timer has not yet fired to trigger its expiration.

Note that using updateAgeOnGet can effectively simulate a "least-recently-used" type of algorithm, by repeatedly updating the TTL of items as they are used. However, if you find yourself doing this, consider using lru-cache, as it is much more optimized for an LRU use case.

cache.getRemainingTTL(key)

Return the remaining time before an item expires. Returns 0 if the item is not found in the cache or is already expired.

cache.has(key)

Return true if the item is in the cache.

cache.delete(key)

Remove an item from the cache.

cache.clear()

Delete all items from the cache.

cache.entries()

Return an iterator that walks through each [key, value] from soonest expiring to latest expiring. (Items expiring at the same time are walked in insertion order.)

Default iteration method for the cache object.

cache.keys()

Return an iterator that walks through each key from soonest expiring to latest expiring.

cache.values()

Return an iterator that walks through each value from soonest expiring to latest expiring.

cache.cancelTimer()

Clear the internal timer, and stop automatically expiring items when their TTL expires.

This allows the process to exit normally on Deno and other platforms that lack Node's Timeout.unref() method.

Internal Methods

You should never call these; they are managed automatically.

purgeStale

Internal

Removes items which have expired. Called automatically.

purgeToCapacity

Internal

Removes soonest-expiring items when the capacity limit is reached. Called automatically.

dispose

Internal

Called when an item is removed from the cache and should be disposed. Set this on the constructor options.

setTimer

Internal

Called when an item with a TTL is added. Ensures that only one timer is set up at a time. Called automatically.

Algorithm

The cache uses two Map objects. The first maps item keys to their expiration time, and the second maps item keys to their values. Then, a null-prototype object uses the expiration time as keys, with the value being an array of all the keys expiring at that time.

This leverages a few important features of modern JavaScript engines for fairly good performance:

  • Map objects are highly optimized for referring to arbitrary values by arbitrary keys.
  • Objects with solely integer-numeric keys are iterated in sorted numeric order rather than insertion order, and insertions in the middle of the key ordering are still very fast. This is true of all modern JS engines tested at the time of this module's creation, but most particularly V8 (the engine in Node.js).

When it is time to prune, we can always walk the null-prototype object in iteration order, deleting items until we come to the first key greater than the current time.

Thus, the start time doesn't need to be tracked, only the expiration time. When an item age is updated (either explicitly on get(), or by setting to a new value), it is deleted and re-inserted.
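The layout described above can be sketched in a few lines. This is a simplified model of the data-structure idea, not the module's actual code:

```javascript
// Simplified model of the internal layout (illustrative only).
const data = new Map()                  // key -> value
const expirationMap = new Map()         // key -> expiration time
const expirations = Object.create(null) // expiration time -> [keys]

function set(key, value, expiration) {
  data.set(key, value)
  expirationMap.set(key, expiration)
  if (!expirations[expiration]) expirations[expiration] = []
  expirations[expiration].push(key)
}

set('a', 1, 2000)
set('b', 2, 1000)

// Integer-numeric keys iterate in ascending order, so walking
// `expirations` visits the soonest-expiring buckets first.
console.log(Object.keys(expirations)) // [ '1000', '2000' ]
```

Pruning then amounts to walking `expirations` in iteration order and stopping at the first bucket whose time is still in the future.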

Immortality Hazards

It is possible to set a TTL of Infinity, in which case an item will never expire. As it does not expire, its TTL is not tracked, and getRemainingTTL() will return Infinity for that key.

If you do this, then the item will never be purged. Create enough immortal values, and the cache will grow to consume all available memory. If you find yourself doing this, it's probably better to use a different data structure, such as a Map or a plain object, to store values: it will perform better, and the hazards will be more obvious.

ttlcache's People

Contributors

dlebech, edsondewes, irudoy, isaacs, marcbachmann, smacker, stimulcross


ttlcache's Issues

Too slow

for (const exp in this.expirations) takes over 200ms to get the first entry when 1e6+ entries exist.

Potential issues in CI environment

After adding this package to our repository we have started seeing non-deterministic timeouts in our CI pipeline. I don't have any certainty yet, but are there any known issues with this package causing CI test runs to deadlock in some way? We are running in GitLab CI with Node.

I am opening this for now just to start a discussion. As we investigate this issue I will provide additional information.

Feature request: dispose function should be passed the time when an entry would have naturally expired

I have a use case where I need to know, if an entry is evicted because the cache is full, what time that entry would have naturally expired. Thus, I would like to change the signature of the dispose callback function from:

dispose(value, key, reason)

to

dispose(value, key, reason, expirationTime)

where expirationTime is the time, in milliseconds since the epoch, when the entry would have naturally expired. Of course, if reason is 'stale' then expirationTime isn't that useful, but if reason is 'set', 'evict' or 'delete' then there are use cases where expirationTime could be valuable.

Right now I can work around this by just setting the value to an object that includes the expiration time, but it would be cleaner/more accurate if I could get the expirationTime directly in the dispose method.

I looked through the source and this should be an easy change. I'm happy to submit a PR for it if you think this feature request is a good idea, but wanted to check with you first. I think this change should be considered backwards compatible as it's just passing an additional parameter to dispose (previously existing dispose functions would just ignore the expirationTime).

Thanks for a great and useful library!

[BUG] cannot graceful stop due to leaked Timeout

Reproduce

import TTLCache from "https://esm.sh/@isaacs/[email protected]";

const cache = new TTLCache({
  max: 10000,
  ttl: 1000,
})

cache.set('a', 'b', {ttl: 180000})
cache.clear()

Actual

$ time deno run reproduce.ts
real    3m

Expected

$ time deno run reproduce.ts
real    0m

Source

ttlcache/index.js

Lines 78 to 83 in 7317c57

if (!this.expirations[expiration]) {
  const t = setTimeout(() => this.purgeStale(), ttl)
  /* istanbul ignore else - affordance for non-node envs */
  if (t.unref) t.unref()
  this.expirations[expiration] = []
}

Also

Node.js also cannot gracefully stop due to that leaked Timeout.

purgeStale skips purge due to expiration ceil (assumption) making cache use stale values

I am trying to set up API rate limiter using ttlcache:

const requestsCountByKeyCache = new TTLCache({
  ttl: thresholdPeriodInSeconds * 1000,
  noUpdateTTL: true,
});

The idea is that requests keep coming in, and I count them using some key, e.g. API_TOKEN_xxxx. I do not refresh TTL and let them expire, value will reset automatically.

And yet I am hitting rate limit errors. I've set limit to 3 and sending 2 request per second, here is the log:

2022-06-29T06:04:41.811Z API_TOKEN_my.app 1
2022-06-29T06:04:41.812Z API_TOKEN_my.app 2
2022-06-29T06:04:42.812Z purgeStale
2022-06-29T06:04:42.812Z purge done, current value: 2 new data Map(0) {}
2022-06-29T06:04:42.815Z API_TOKEN_my.app 1
2022-06-29T06:04:42.817Z API_TOKEN_my.app 2
2022-06-29T06:04:43.809Z API_TOKEN_my.app 3
2022-06-29T06:04:43.811Z API_TOKEN_my.app 4
2022-06-29T06:04:43.814Z purgeStale
2022-06-29T06:04:43.814Z purgeStale, exp > n 8470 8468.831500053406
2022-06-29T06:04:44.809Z API_TOKEN_my.app 5
2022-06-29T06:04:44.810Z API_TOKEN_my.app 6
2022-06-29T06:04:45.810Z purgeStale
2022-06-29T06:04:45.811Z purge done, current value: 6 new data Map(0) {}

purgeStale, exp > n 8470 8468.831500053406 <- here we skip resetting the data, and it seems the next purge will only be queued upon the next set. And no checks are done while getting a value from the cache.

Thank you very much for your work.

typescript support

The project does not contain any TS declaration file; types are just inferred from the JS code. Are there any plans to support it?

Calling `.clear()` while elements are in the cache causes an exception when they are subsequently attempted to be purged as stale

TypeError: this.expirations[exp] is not iterable
    at TTLCache.purgeStale (node_modules/@isaacs/ttlcache/index.js:204:41)
    at Timeout._onTimeout (node_modules/@isaacs/ttlcache/index.js:79:41)
    at listOnTimeout (node:internal/timers:564:17)
    at process.processTimers (node:internal/timers:507:7)

for (const key of this.expirations[exp]) {

TTL = Infinity

Hello and thanks for this package.

I'd like to propose that TTL could be set to Infinity pretty much same as for the max config param.
This would let setting some items as "pinned" into a TTL cache mechanism.

Thanks again,
Marco

Iterating entries/keys/values in dispose() function results in undefined entry for the stale element

In the purgeStale() method, you can see it deletes map entries from this.data and this.expirationMap before calling this.dispose(), but it leaves the entry in this.expirations until after that call finishes. This has to be refactored somehow to also remove the entry from this.expirations before calling this.dispose().

See expiration logic here:

for (const key of this.expirations[exp]) {

See iteration logic here:

*entries() {

With the current behavior, when I iterate elements in my dispose() function, the first element returns as undefined.

Feature request: Control over stale entries being emitting before or after collection removal

First, thank you so much for your fast resolution of #16 and #17. I see that dispose is emitting 'stale' only after it is fully removed from the collection now.

With that said, is there any possibility of implementing at least one of the two:

  • A flag that can dictate whether 'stale' is emitted before or after removal from the collection?
  • A 'pre-stale' event that emits before removal from the collection

For now, I can just collect the value removed from the argument in my dispose() function, but having it in the collection for simple iteration would be nice in my use case.

Module not found: Can't resolve 'perf_hooks'

I tried to integrate this ttlcache into a Next.js (Typescript) project and receive this error:

Module not found: Can't resolve 'perf_hooks' in '[...]/node_modules/@isaacs/ttlcache'

Steps To Reproduce

  • Create a fresh Next.js (Typescript) application
yarn create next-app my-app --ts
  • Install @isaacs/ttlcache package
yarn add @isaacs/ttlcache
  • Open _app.tsx and create an instance of TTLCache
import "../styles/globals.css";
import type { AppProps } from "next/app";
import TTLCache from "@isaacs/ttlcache";

const cache = new TTLCache({ max: 10, ttl: 1000 });

function MyApp({ Component, pageProps }: AppProps) {
  return <Component {...pageProps} />;
}

export default MyApp;
  • Run yarn dev, open http://localhost:3000 and observe

warn - ./node_modules/@isaacs/ttlcache/index.js
Module not found: Can't resolve 'perf_hooks' in '[...]/node_modules/@isaacs/ttlcache'

bug: deleting an immortal entry causes an error

When I try to delete an immortal entry (set with Infinity TTL) I get the following error:

"C:\Program Files\nodejs\node.exe" C:\Dev\ttlcache\test.js
C:\Dev\ttlcache\index.js:164
        this.expirations[current] = exp.filter(k => k !== key)
                                        ^

TypeError: Cannot read properties of undefined (reading 'filter')
    at TTLCache.delete (C:\Dev\ttlcache\index.js:164:41)
    at Object.<anonymous> (C:\Dev\ttlcache\test.js:6:25)
    at Module._compile (node:internal/modules/cjs/loader:1149:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1203:10)
    at Module.load (node:internal/modules/cjs/loader:1027:32)
    at Module._load (node:internal/modules/cjs/loader:868:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:81:12)
    at node:internal/main/run_main_module:23:47

Node.js v18.10.0

Process finished with exit code 1

Code to reproduce the issue:

const TtlCache = require('@isaacs/ttlcache');
const cache = new TtlCache({ttl: Infinity});
cache.set('key', 'val');
cache.delete('key');

Entries with TTL set to Infinity are not present in the expirations object, so the exp const can be undefined, which causes the error.

So I fixed it and added a test for such a case. I will link a PR.

cache expiration does not work

Although the readme contain this example:

const TTLCache = require('@isaacs/ttlcache')
const cache = new TTLCache({ max: 10000, ttl: 1000 })

// set some value
cache.set(1, 2)

// 999 ms later
cache.has(1) // returns true
cache.get(1) // returns 2

// 1000 ms later
cache.get(1) // returns undefined
cache.has(1) // returns false

it does not work. cache.get(1) still returns the item and cache.has(1) returns true even after the expiration timeout. IMO it can't even work, as neither the get() nor the has() function checks the expirationMap; they only check the data map.

The getRemainingTTL() function correctly returns 0 when the item expires, but calling this function does not remove the item from the cache either.

Potential race condition issues

Hello,

I apologize for the brief nature of this issue report. Unfortunately, we have limited information to provide, and we are encountering difficulties in reproducing the issue locally. Our usage of the cache is within the context of a Nest microservice, where we have instantiated the cache four times. Recently, we have started experiencing the following problem:

TypeError: this.expirations[exp] is not iterable
    at TTLCache.purgeStale (/app/node_modules/.pnpm/@[email protected]/node_modules/@isaacs/ttlcache/index.js:277:40)
    at Timeout._onTimeout (/app/node_modules/.pnpm/@[email protected]/node_modules/@isaacs/ttlcache/index.js:71:12)
    at listOnTimeout (node:internal/timers:569:17)
    at process.processTimers (node:internal/timers:512:7)

Upon inspecting the source code of the library, it appears that this.expirations undergoes significant mutations, which may result in undefined key values. However, beyond this observation, we do not have additional insights to contribute. We understand that this may not be the most informative issue for you, but we wanted to report it nonetheless.

Thank you for your attention to this matter.

keys function can't return key of Infinity ttl

Hi isaacs,

Here is an example:

When I call cache.keys(), it can't get the key if its ttl is Infinity.
How do I get the correct keys?

Thanks,

const TTLCache = require('@isaacs/ttlcache')
const cache = new TTLCache({ max: 10000, ttl: 1000 });

// set some value
cache.set(1, 2, { ttl: Infinity});
console.log(cache.has(1)); // returns true
console.log(Array.from(cache.keys())); // []



cache.set(1, 2, { ttl: 1000 });
console.log(cache.has(1)); // returns true
console.log(Array.from(cache.keys())); // [1]

Add `maxSize` `maxEntrySize` and `sizeCalculation` options

Hello!
Firstly, thank you for creating all these cool packages.

I've been searching for an in-memory cache primarily focused on TTLs. LRU caching doesn't align with my needs, as I require a cache that, when full, prioritizes evicting entries nearing their expiration over the least recently used ones. That's the main reason I prefer using this package instead of node-lru-cache.

However, another crucial feature I need is the ability to set constraints on the storage size, specifically in bytes.

I noticed that node-lru-cache offers options like maxSize, maxEntrySize, and sizeCalculation, which would be perfect. Unfortunately, these aren't available in this library, and I believe they are important additions.

Would you be open to adding this feature? I would be more than willing to help with a PR if needed.

Module not found: Can't resolve 'perf_hooks'

Hi @isaacs, following up #8 (comment), this is my minimal setup to reproduce the warning. Thank you so much for your help.

Steps to reproduce

  1. Create a fresh Next.js application
npx create-next-app@latest --typescript myapp
  2. Install @isaacs/ttlcache
npm install @isaacs/ttlcache
  3. Edit myapp/pages/_app.tsx
import '../styles/globals.css'
import type { AppProps } from 'next/app'

import TTLCache from '@isaacs/ttlcache'
const cache = new TTLCache({ max: 10000, ttl: 1000 })

function MyApp({ Component, pageProps }: AppProps) {
  return <Component {...pageProps} />
}

export default MyApp
  4. Observe

warn - ./node_modules/@isaacs/ttlcache/index.js
Module not found: Can't resolve 'perf_hooks' in '/.../myapp/node_modules/@isaacs/ttlcache'
