Hi,
Just to make sure, is this the repo of the npm package https://www.npmjs.com/package/s3fs?
If yes, here is the issue and a suggestion:
When calling readdir / readdirp we only need to pass the path. If the directory has more than 1,000 files (the limit on how many keys AWS returns per request), listAllObjectsFiles calls itself to fetch the next marker. When the directory holds a huge number of files, this recursion causes a memory exception, because it keeps recursing for as long as data.IsTruncated is set.
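For context, this is roughly what marker-based paging looks like against the raw AWS SDK. This is a minimal sketch using the SDK v2 ListObjects call, not s3fs internals; the function, bucket, and prefix names are my own illustration. An iterative loop like this also avoids growing the call stack:

```js
// Sketch: iterative marker-based paging with the AWS SDK v2 ListObjects API.
// listAllKeys is an illustrative name, not an s3fs function.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function listAllKeys(bucket, prefix) {
  const keys = [];
  let marker; // undefined on the first request
  for (;;) {
    const data = await s3.listObjects({
      Bucket: bucket,
      Prefix: prefix,
      Marker: marker // resume listing after this key
    }).promise();
    data.Contents.forEach(function (obj) { keys.push(obj.Key); });
    if (!data.IsTruncated) break;
    // Without a Delimiter, S3 omits NextMarker; fall back to the last key.
    marker = data.NextMarker || data.Contents[data.Contents.length - 1].Key;
  }
  return keys;
}
```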
Could you add 2 options (a usage sketch follows below):
- Accept a "Marker" property, so the listing can start from a specific file key.
- An option to fetch a single set of results; when true, the inner data.IsTruncated check is skipped. Default = false, so the existing behavior is kept.

I assume this is relevant to all the functions that use listAllObjectsFiles, like rmdir / rmdirp.
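Here is a sketch of how the proposed options could be used. The option names `marker` and `singlePage` are hypothetical; they are not part of the current s3fs API:

```js
// Hypothetical usage of the two proposed options; `marker` and `singlePage`
// do not exist in s3fs today, and the bucket/path names are placeholders.
const S3FS = require('s3fs');
const fsImpl = new S3FS('my-bucket');

fsImpl.readdir('some/dir/', {
  marker: 'some/dir/file-1000.txt', // start listing after this key
  singlePage: true                  // return one page (up to 1,000 keys) and
                                    // skip the internal IsTruncated recursion
}, function (err, files) {
  if (err) throw err;
  // `files` holds at most one page; the caller pages explicitly by passing
  // the last returned key back as the next `marker`.
  console.log(files);
});
```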
Relevant docs can be found at:
https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjects.html