
operation to introspect the state of the cache #35

Open
tychoish opened this issue Dec 25, 2023 · 3 comments
Comments

@tychoish

I think number of keys is probably enough, but there's all sorts of information that might be useful (size in bytes, counters for hits/misses).

I'm mostly thinking about the inevitable future when we're trying to understand where memory is allocated in production and I can already see the impulse to blame the memoize cache.

@dermesser
Owner

As I have stated in other issues here, while I'm open to feature requests, some of them sound like you are expecting too much from an automatic memoization annotation :-) Your use case may be better served by a custom struct that implements the necessary caching and observability features; if you have a bunch of functions to memoize, there may be a pattern that allows a single implementation.

Otherwise there will be a proliferation of magically appearing, automatically named functions around the original function, all doing some stuff, and that seems less preferable than a clear and explicit implementation.
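For illustration, the custom-struct approach suggested above can be sketched with just the standard library. This is a hypothetical design, not part of the memoize crate; all names (`CountingCache`, `get_or_insert_with`, `stats`) are made up for this example:

```rust
use std::collections::HashMap;
use std::sync::Mutex;

// Inner state: the memoized entries plus the hit/miss counters
// requested in this issue.
struct CacheInner {
    entries: HashMap<u64, u64>,
    hits: u64,
    misses: u64,
}

// A hand-rolled memoization cache with an explicit observability surface.
struct CountingCache {
    inner: Mutex<CacheInner>,
}

impl CountingCache {
    fn new() -> Self {
        CountingCache {
            inner: Mutex::new(CacheInner {
                entries: HashMap::new(),
                hits: 0,
                misses: 0,
            }),
        }
    }

    // Memoize `f`, counting a hit when the key is already cached
    // and a miss (plus one call to `f`) when it is not.
    fn get_or_insert_with(&self, key: u64, f: impl FnOnce(u64) -> u64) -> u64 {
        let mut inner = self.inner.lock().unwrap();
        if let Some(&v) = inner.entries.get(&key) {
            inner.hits += 1;
            v
        } else {
            inner.misses += 1;
            let v = f(key);
            inner.entries.insert(key, v);
            v
        }
    }

    // Introspection: (number of keys, hits, misses).
    fn stats(&self) -> (usize, u64, u64) {
        let inner = self.inner.lock().unwrap();
        (inner.entries.len(), inner.hits, inner.misses)
    }
}

fn main() {
    let cache = CountingCache::new();
    let square = |x: u64| x * x;
    cache.get_or_insert_with(3, square); // miss, computes 9
    cache.get_or_insert_with(3, square); // hit
    cache.get_or_insert_with(4, square); // miss, computes 16
    let (keys, hits, misses) = cache.stats();
    assert_eq!((keys, hits, misses), (2, 1, 2));
    println!("keys={} hits={} misses={}", keys, hits, misses);
}
```

One struct like this can serve several memoized functions if they share key and value types, which is roughly the "single implementation" pattern mentioned above; byte-size accounting could be layered on by tracking entry sizes at insert time.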

@tychoish
Author

I really am only interested in using this for a handful of functions, and while I agree that more complex caching systems have their place for the more complex use cases you outline, I don't think that approach makes sense here.

I definitely understand and respect the desire to limit scope, and if you think the package is feature complete and should be frozen, then by all means say so. Having said that, propagating a length (say) or other easily accessible information from the cache seems reasonable.

@gzsombor

I have a similar requirement, so I've implemented a "size" function here: #38
