PUT bulk object potential issue #594
How are you transferring the object from S3 to BP? Are you downloading the object from S3 locally and then sending it to BP, or are you piping the stream directly from the S3 download into the BP put call?
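For instance, the "download locally first" variant would look roughly like the sketch below, with `putToBlackPearl` as a hypothetical stand-in for the actual SDK put call:

```scala
import java.io.InputStream
import java.nio.file.{Files, Path, StandardCopyOption}

// Hypothetical stand-in for the actual ds3_java_sdk put call.
def putToBlackPearl(key: String, source: Path): Unit = ???

// "Download locally first": stage the S3 stream to a temp file, which
// yields a seekable source. For 64 GB+ objects this staging cost is
// exactly what piping the stream directly is meant to avoid.
def stageAndPut(key: String, s3Stream: InputStream): Unit = {
  val tmp = Files.createTempFile("s3-stage-", ".bin")
  try {
    Files.copy(s3Stream, tmp, StandardCopyOption.REPLACE_EXISTING)
    putToBlackPearl(key, tmp)
  } finally Files.deleteIfExists(tmp)
}
```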
The stream of bytes comes directly from S3.
Regarding the max upload size, is setting it on the request like this the correct approach?

```scala
ds3Client.putBulkJobSpectraS3(
  new PutBulkJobSpectraS3Request(
    "bucket",
    List(new Ds3Object(ds3Object.key, ds3Object.size))
  ).withMaxUploadSize(ds3Object.size)
)
```

I'm asking because the behaviour is different than expected: we managed to PUT objects bigger than 64 GB using the raw request above.
**Streaming Strategy**

Since you have non-seekable input streams, you need to run the job with the streaming strategy rather than the default random-access behaviour.

**Max Upload Size**

Setting the max upload size raises the cap on blob size, so an object larger than the 64 GB default is not split into multiple blobs.
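As a sketch only, a helper-based job could be set up like this; the `WriteJobOptions`/`startWriteJob` names follow the helper API referenced in this thread, the key, size, and bucket are placeholders, and exact signatures and package paths should be checked against the 5.4.0 Javadoc:

```scala
import java.nio.channels.FileChannel
import java.nio.file.Paths

import com.spectralogic.ds3client.Ds3ClientBuilder
import com.spectralogic.ds3client.helpers.Ds3ClientHelpers
import com.spectralogic.ds3client.helpers.options.WriteJobOptions
import com.spectralogic.ds3client.models.bulk.Ds3Object

import scala.jdk.CollectionConverters._

val ds3Client = Ds3ClientBuilder.fromEnv().build()
val helpers   = Ds3ClientHelpers.wrap(ds3Client)

val objectKey  = "big-object"              // hypothetical key
val objectSize = 100L * 1024 * 1024 * 1024 // e.g. 100 GB, past the 64 GB default

// Raise the blob-size cap so the object is not split at 64 GB.
val options = WriteJobOptions.create().withMaxUploadSize(objectSize)

val job = helpers.startWriteJob(
  "bucket",
  List(new Ds3Object(objectKey, objectSize)).asJava,
  options
)

// The builder hands the job one channel per object. A seekable channel
// (here, a local file) suits the default strategy; a non-seekable S3
// stream is what forces the streaming strategy described above.
job.transfer(new Ds3ClientHelpers.ObjectChannelBuilder {
  override def buildChannel(name: String) = FileChannel.open(Paths.get(name))
})
```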
We set the max upload size in the options, along with other properties, and then used the helper to start the job and run the transfer.
We did everything as in the Javadoc, and the write would still fail at the 64 GB default. We are currently not using the helper, and it works as expected. Is there any test for this scenario using the helper class?
Hello,
We've been having some issues transferring an object from S3 to BlackPearl using this library, version 5.4.0.
We are transferring using the following `ObjectChannelBuilder`:
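As an illustration of the shape of such a builder (ours wraps a non-seekable S3 stream rather than a local file), a minimal sketch, assuming a hypothetical `/staging` directory:

```scala
import java.nio.channels.{FileChannel, SeekableByteChannel}
import java.nio.file.Paths

import com.spectralogic.ds3client.helpers.Ds3ClientHelpers.ObjectChannelBuilder

// Illustrative only: backs each object with a local file, which gives
// the SeekableByteChannel the helper expects. A raw S3 download stream
// cannot satisfy this seekable contract, hence the issue below.
val channelBuilder = new ObjectChannelBuilder {
  override def buildChannel(key: String): SeekableByteChannel =
    FileChannel.open(Paths.get("/staging", key))
}
```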
It usually happens when we are transferring files bigger than 64 GB.
I configured the max upload size, but it didn't change the behaviour. (I assume this might not affect the blob size.)
Please see the stack trace below; any feedback is welcome.