list objects of huge directories limit files #1

Open

al-oath opened this issue Mar 25, 2020 · 0 comments

al-oath commented Mar 25, 2020

Hi,

Just to make sure: is this the repository for the npm package https://www.npmjs.com/package/s3fs?

If yes, here is the issue and a suggestion:
When calling readdir / readdirp we only pass the path. If the directory contains more than 1,000 files (the per-request limit on how many objects AWS returns), listAllObjectsFiles calls itself to fetch the next marker.

When the directory holds a huge number of files, this recursion causes a memory exception, because it keeps fetching pages for as long as data.IsTruncated is set.
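
For illustration, this is roughly what that unbounded pattern looks like. It is only a sketch, not the library's actual code, and it assumes the aws-sdk v2 `listObjects` call described in the linked docs:

```ts
import { S3 } from "aws-sdk";

const s3 = new S3();

// Sketch of the recursive "list everything" pattern described above:
// every page is appended to one array, and recursion continues while
// IsTruncated is set, so a prefix with millions of keys accumulates
// all of them in memory at once.
async function listAllObjects(
  bucket: string,
  prefix: string,
  marker?: string,
  acc: S3.Object[] = []
): Promise<S3.Object[]> {
  const data = await s3
    .listObjects({ Bucket: bucket, Prefix: prefix, Marker: marker })
    .promise();

  acc.push(...(data.Contents ?? []));

  if (data.IsTruncated) {
    // NextMarker is only returned when a Delimiter is used; otherwise
    // the last returned key serves as the marker for the next page.
    const next =
      data.NextMarker ?? data.Contents?.[data.Contents.length - 1]?.Key;
    return listAllObjects(bucket, prefix, next, acc);
  }
  return acc;
}
```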

It would help if you could add two options (a sketch follows the list):

  1. Accept a "Marker" option so the listing can start from a specific file key.
  2. An option to fetch only one page of results; if true, the inner data.IsTruncated check is skipped. Default = false, so the existing behavior is unchanged.
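
A rough sketch of how the two options could work together. The option names `marker` and `singlePage` are hypothetical and not part of the current s3fs API:

```ts
import { S3 } from "aws-sdk";

const s3 = new S3();

// Hypothetical option names, for illustration only.
interface ListOptions {
  marker?: string;      // option 1: start listing from this key
  singlePage?: boolean; // option 2: fetch one page, skip the IsTruncated recursion
}

async function listObjectsPaged(
  bucket: string,
  prefix: string,
  opts: ListOptions = {}
): Promise<{ keys: string[]; nextMarker?: string }> {
  const data = await s3
    .listObjects({ Bucket: bucket, Prefix: prefix, Marker: opts.marker })
    .promise();

  const keys = (data.Contents ?? []).map((obj) => obj.Key!);
  const next = data.IsTruncated
    ? data.NextMarker ?? keys[keys.length - 1]
    : undefined;

  // Default (singlePage falsy): keep recursing, matching today's behavior.
  if (!opts.singlePage && next !== undefined) {
    const rest = await listObjectsPaged(bucket, prefix, { marker: next });
    return { keys: keys.concat(rest.keys) };
  }

  // singlePage === true: return one page plus the marker so the caller can resume.
  return { keys, nextMarker: next };
}
```

A caller could then walk a huge directory with bounded memory by passing the returned nextMarker back in on the next call, instead of holding every key at once.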

I assume this is relevant to all the functions that use listAllObjectsFiles, such as rmdir / rmdirp.

Relevant docs can be found in:
https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjects.html
