LocalCache: limit allowed blob size #69
Conversation
Since the 900 MB limit is specific to the sqlite cache and I would like to keep the magic numbers in one place rather than four, could you move the if statement to …? I'm also happy to refactor that.
Sounds good. I'll work on that.
Codecov Report
Attention: Patch coverage is …
Additional details and impacted files@@ Coverage Diff @@
## main #69 +/- ##
==========================================
- Coverage 79.58% 79.41% -0.17%
==========================================
Files 30 30
Lines 2243 2254 +11
==========================================
+ Hits 1785 1790 +5
- Misses 458 464 +6
@rly Is this what you had in mind?
Yes, awesome. Thanks for the change!
On PyPI as 0.3.7.
I found a situation where the chunk size was larger than 1 GB and the sqlite save failed. This particular dataset was not chunked; this should not be an issue for most Dandisets, where chunk sizes are reasonable.
So with this PR I limit the size of chunks in the local cache to 900 MB.
See https://www.sqlite.org/limits.html
(Despite what limits.html says, 1000 MB doesn't work whereas 900 MB does. This is likely because SQLite's default SQLITE_MAX_LENGTH is 1,000,000,000 bytes, which is less than 1000 MiB (1,048,576,000 bytes) but more than 900 MiB.)
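A minimal sketch of the kind of size guard described above, assuming a hypothetical constant name (`MAX_ALLOWED_BLOB_SIZE`) and helper (`can_cache`); the actual names in the PR may differ:

```python
# SQLite's default SQLITE_MAX_LENGTH is 1,000,000,000 bytes (~953.7 MiB).
# 1000 MiB = 1,048,576,000 bytes exceeds that limit, while
# 900 MiB = 943,718,400 bytes fits, consistent with the observed behavior.
MAX_ALLOWED_BLOB_SIZE = 900 * 1024 * 1024  # hypothetical constant name


def can_cache(chunk: bytes) -> bool:
    """Return True if a chunk is small enough to store as a sqlite blob."""
    return len(chunk) <= MAX_ALLOWED_BLOB_SIZE
```

Keeping the threshold in a single module-level constant (rather than repeating the magic number at each call site) matches the "one place rather than four" request in the review comment above.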
For LindiH5ZarrStore and LindiReferenceFileSystemStore, I display a warning if the chunk size is too large and skip storing it in the local cache. For LindiRemfile, I raise an exception because this should never happen in that context.
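The warn-and-skip versus raise behavior can be sketched as follows. This is an illustrative pattern, not the PR's actual code; the function name, the `raise_on_oversize` flag, and the `max_size` parameter are all hypothetical:

```python
import warnings

MAX_ALLOWED_BLOB_SIZE = 900 * 1024 * 1024  # hypothetical; mirrors the PR's 900 MB limit


def store_chunk(cache: dict, key: str, chunk: bytes, *,
                raise_on_oversize: bool,
                max_size: int = MAX_ALLOWED_BLOB_SIZE) -> bool:
    """Store a chunk in the local cache, or skip/raise if it is too large.

    raise_on_oversize=False mimics the LindiH5ZarrStore /
    LindiReferenceFileSystemStore behavior (warn and skip);
    raise_on_oversize=True mimics LindiRemfile (raise, since an
    oversized chunk should never occur in that context).
    """
    if len(chunk) > max_size:
        if raise_on_oversize:
            raise ValueError(f"Chunk for {key} is too large for the local cache")
        warnings.warn(f"Chunk for {key} is too large; skipping local cache")
        return False
    cache[key] = chunk
    return True
```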
I added a test.