Triple-Whale/distributed-lru-cache

Distributed LRU Cache

A library for all distributed in-memory caching needs.

Supported Backends

  • Redis

Supported Functions

  • Get
  • Set
  • Has
  • Del
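To illustrate the LRU semantics behind these functions, here is a minimal in-memory sketch (illustrative only — this is not the library's implementation, which is backed by Redis). It relies on the fact that a JavaScript Map iterates in insertion order, so re-inserting a key on access marks it as most recently used:

```typescript
// Minimal LRU sketch using a Map's insertion order to track recency.
class SimpleLRU<V> {
  private map = new Map<string, V>();
  constructor(private capacity: number) {}

  get(key: string): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    // Re-insert to move the key to the "most recently used" end.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key: string, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict the least recently used entry (first key in iteration order).
      const oldest = this.map.keys().next().value as string;
      this.map.delete(oldest);
    }
  }

  has(key: string): boolean {
    return this.map.has(key);
  }

  del(key: string): boolean {
    return this.map.delete(key);
  }
}

const lru = new SimpleLRU<number>(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');    // 'a' becomes most recently used
lru.set('c', 3); // capacity exceeded: 'b' (least recently used) is evicted
console.log(lru.has('b')); // false
console.log(lru.has('a')); // true
```

The distributed version applies the same eviction policy, but stores entries in a shared backend so multiple processes see a consistent cache.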

Installation

yarn add @crushers-lab/distributed-lru-cache

OR

npm i @crushers-lab/distributed-lru-cache

Basic Usage

import LRUCache, {RedisBackend} from '@crushers-lab/distributed-lru-cache';
import {createClient} from 'redis';

const client = createClient({
    url: 'redis://localhost:6379',
});

const cache = await LRUCache.create({
    ttl: 60000,
    backend: {
        client: new RedisBackend(client, {
            prefix: '',
            ttl: 10000,
            database: 10,
        }),
        expiryListener: true,
    },
});

await cache.set('key', 'value');
const value = await cache.get('key');
const exists = await cache.has('key');
await cache.del('key');

Read More
