
fix(lru): don't clear an already cached key on set #725

Closed
wants to merge 1 commit into from

Conversation

@jviide (Contributor) commented Jul 6, 2024

This pull request modifies how the LRU cache implementation handles setting a key that's already in the cache. Previously, if an already cached key was set again, then it was always cleared from the cache:

> const cache = new LRUCache();
> cache.set(1, 1);
> cache.set(1, 2);
> cache.get(1);
undefined

This pull request's modified version returns 2 for cache.get(1). The newly set key is still bumped to the last place in the LRU queue. There are two new tests to check this behavior.

The current semver implementation might never trigger this behavior, in which case this pull request is just future-proofing.
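The before-and-after behavior can be sketched with a minimal Map-based LRU (class name, `max` default, and eviction details are illustrative, not the actual semver internals):

```javascript
// Minimal sketch of an LRU cache built on Map insertion order.
// The fix: `set` on an existing key deletes and re-inserts the
// entry (bumping it to most-recently-used) instead of clearing
// it and leaving it unset.
class LRUCache {
  constructor (max = 1000) {
    this.max = max
    this.map = new Map()
  }

  get (key) {
    const value = this.map.get(key)
    if (value === undefined) {
      return undefined
    }
    // Bump the key to most-recently-used by re-inserting it.
    this.map.delete(key)
    this.map.set(key, value)
    return value
  }

  set (key, value) {
    // Delete first so the re-insert moves the key to the end of
    // the Map's insertion order (most-recently-used position).
    this.map.delete(key)
    if (this.map.size >= this.max) {
      // Evict the least-recently-used entry, i.e. the first key
      // in insertion order.
      const firstKey = this.map.keys().next().value
      this.map.delete(firstKey)
    }
    this.map.set(key, value)
    return this
  }
}

const cache = new LRUCache()
cache.set(1, 1)
cache.set(1, 2)
console.log(cache.get(1)) // 2
```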

@jviide jviide requested a review from a team as a code owner July 6, 2024 23:40
@wraithgar (Member)

This is a great example of practically dead code that unit tests could never catch. The way classes/range.js uses this lib is to call get and, if the key is in there, return it. If it's not, it creates a new one and sets it. All of the code between creating the range and setting it is synchronous, so no other event loop action can happen in between. It's currently impossible for set to even be called on an entry that already exists.
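That get-or-create pattern can be sketched as follows (paraphrased; `getRange`, the plain Map, and the placeholder object are illustrative stand-ins, not the actual classes/range.js code):

```javascript
// Sketch of the get-or-create pattern described above. Because
// everything between the cache miss and the `set` is synchronous,
// `set` is only ever reached for keys that are not already cached.
const cache = new Map() // stand-in for the LRU instance

function getRange (raw) {
  const cached = cache.get(raw)
  if (cached) {
    return cached
  }
  // Synchronous construction: nothing else can insert `raw`
  // between the miss above and the set below.
  const range = { raw } // placeholder for `new Range(raw)`
  cache.set(raw, range)
  return range
}

console.log(getRange('^1.2.3') === getRange('^1.2.3')) // true
```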

Tests that only interacted with ranges would have shown that some of this code was unreachable from the surface area of the library. Unfortunately those tests don't exist, and unless someone wanted to put in that work we have what we have today.

With all that in mind, I propose that the real optimization here is to remove the first this.delete(key) altogether. It's a no-op during the current actual use case of this LRU code.

@wraithgar (Member) commented on lines +24 to 25 (`this.delete(key)`), suggesting that the line be removed.

@jviide (Contributor, Author) replied Jul 13, 2024

Removing the this.delete(key) would turn the LRU cache implementation into a mixture of LRU and FIFO caches. It may not be relevant with the module's current usage pattern. However, it could lead to a surprise down the road if the cache module's usage is expanded.

I suggest either keeping the delete call, removing the delete and calling the cache implementation something other than an LRU, or just closing this pull request as unnecessary (since it does not affect currently observable behavior, as you pointed out). I ran the benchmarks, and removing or keeping the delete call makes no measurable difference there, though there are currently no benchmarks that heavily exercise the .set method.
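The LRU-versus-FIFO distinction can be seen directly with plain Maps, since Map.prototype.set on an existing key updates the value without moving the key in insertion order (a sketch of the eviction-order consequence, not the module's code):

```javascript
// Without a delete-and-reinsert, an updated key keeps its original
// (oldest) position in the Map's insertion order, so it remains the
// eviction candidate even though it was just touched -- FIFO-like
// behavior rather than LRU.
const fifoLike = new Map()
fifoLike.set('a', 1)
fifoLike.set('b', 2)
fifoLike.set('a', 10) // value updated, but 'a' stays first in order
console.log([...fifoLike.keys()]) // [ 'a', 'b' ]

// With the delete, the update bumps 'a' to most-recently-used.
const lruLike = new Map()
lruLike.set('a', 1)
lruLike.set('b', 2)
lruLike.delete('a')
lruLike.set('a', 10)
console.log([...lruLike.keys()]) // [ 'b', 'a' ]
```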

@wraithgar (Member)

Gonna err on the side of caution then and close this as not necessary. Thanks for the time spent, all the same.

@wraithgar wraithgar closed this Jul 16, 2024
@jviide jviide deleted the lru-fix branch July 16, 2024 17:38