Shared singleton for use with Spark #246
Comments
I think if you want to share some state between deserialised objects, you'll need some kind of global state, or deserialise a … If you'd go with the global state, I think it's exactly as you write: you need some kind of cache. But maybe serialising the function to create a …
Thanks for the response. I played a little and implemented a sharedSingleton that keeps the instances in a … The rest is quite similar to how the ProxyingScope is implemented.
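The original snippet is not preserved here, but the description above (instances kept in a shared collection, resolved per class, similar in spirit to MacWire's ProxyingScope) could be sketched roughly like this. All names are hypothetical, and the real implementation may differ:

```scala
import java.util.concurrent.ConcurrentHashMap
import scala.reflect.ClassTag

// Hypothetical sketch of a shared-singleton scope: instances are keyed by
// runtime class in a JVM-wide concurrent map, so every deserialised task
// running in the same executor JVM resolves the same instance.
object SharedSingleton {
  private val instances = new ConcurrentHashMap[Class[_], AnyRef]()

  def apply[T <: AnyRef](create: => T)(implicit ct: ClassTag[T]): T =
    instances
      .computeIfAbsent(ct.runtimeClass, _ => create) // create at most once per class
      .asInstanceOf[T]
}
```

`computeIfAbsent` guarantees the factory runs at most once per key even under concurrent access, which is exactly the property needed when many tasks race to obtain the client.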
When using Spark with external resources such as a database, a fairly common pattern is to share the database client between tasks so that the connection pool is shared as well. Otherwise, with a large number of tasks/threads, the database connections are exhausted, which leads to issues when scaling.
This raises some complications when using such an object, as it must behave as a singleton shared between the threads that receive the serialized objects.
Any idea on how to do this with MacWire? Any pattern that can be used?
A simple example:
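The example itself did not survive in this copy; a minimal sketch of the kind of setup described, with hypothetical names (`DatabaseConfig`, `DatabaseClient`, `TaskLogic`), might look like this:

```scala
// Hypothetical illustration of the problem: each deserialised task object
// builds its own DatabaseClient, so each one opens its own connection pool.
class DatabaseConfig(val url: String) extends Serializable

class DatabaseClient(val config: DatabaseConfig) {
  // Stands in for opening a real connection pool.
  val pool: AnyRef = new Object
}

class TaskLogic(config: DatabaseConfig) extends Serializable {
  // @transient so the client is not serialised with the closure; it is
  // rebuilt lazily after deserialisation -- once per task object, which is
  // the pool-exhaustion problem described above.
  @transient lazy val client: DatabaseClient = new DatabaseClient(config)
}
```

With this shape, two tasks deserialised in the same executor JVM each get their own `DatabaseClient`, hence their own pool.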
So the idea would be to have something instead of wire, or beside it, that would make it use a single instance. I was thinking of implementing a shared-singleton Scope that picks the instance from a concurrent collection; would this be the best way to do it?
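The idea of something used beside wire that picks the instance from a concurrent collection could be sketched as follows. This is only an assumption about the intended shape, with a hypothetical `SharedScope` name and string keys; it is not MacWire's actual API:

```scala
import java.util.concurrent.ConcurrentHashMap

// Hedged sketch: a scope object used at the wiring site that hands out one
// instance per key from a concurrent collection shared across the JVM.
object SharedScope {
  private val instances = new ConcurrentHashMap[String, AnyRef]()

  def apply[T <: AnyRef](key: String)(create: => T): T =
    instances.computeIfAbsent(key, _ => create).asInstanceOf[T]
}

// At the wiring site, instead of a bare wire[DatabaseClient] one might write
// something like (assuming MacWire is in scope):
//   lazy val client = SharedScope("db-client")(wire[DatabaseClient])
```

Keying by an explicit string rather than by class makes it possible to keep several distinct shared instances of the same type, at the cost of the caller having to manage key uniqueness.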