
The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed. #27

Open
Fares92 opened this issue Sep 9, 2020 · 19 comments

Comments


Fares92 commented Sep 9, 2020

When using multipart copy, in the middle of the copy action I get this error message: "The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed." I use log messages to show errors, so some parts were copied successfully, but after this error the whole action was aborted. Does anyone have an idea?
[attached screenshots: logs, error log, foreach, copyPart]

@jeffbski-rga

I am getting this error as well.


Fares92 commented Oct 14, 2020

@jeffbski-rga I fixed it. I'm using a Lambda function on AWS and its memory is limited, so after many loops the memory fills up and the system loses track of the file, which is why we get the error "The specified upload does not exist". So I set a dynamic part size that depends on the object size, like this 👍
```js
'use strict';

// Always split the copy into 10 parts, whatever the object size.
const part = Math.trunc(size / 10);

const options = {
    source_bucket: inputBucket,
    object_key: sourceFile,
    destination_bucket: bucket,
    copied_object_name: outputKey,
    object_size: size,
    copy_part_size_bytes: part,
    copied_object_permissions: 'bucket-owner-full-control',
};
console.log('options', options);
console.log('copying to output bucket', bucket, sourceFile, outputKey, inputBucket);

return s3Module.copyObjectMultipart(options);
```

So there are always only 10 loops and it works. I can help you if you post a screenshot of your error.
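A related caveat: S3 multipart operations require every part except the last to be at least 5 MB, so `size / 10` becomes too small for objects under roughly 50 MB. A minimal sketch of the same idea with a clamp, where `MIN_PART_SIZE` is an assumed name, not from the snippet above:

```js
// Keep 10 parts, but never drop below S3's 5 MB minimum part size.
const MIN_PART_SIZE = 5 * 1024 * 1024; // assumed constant
const part = Math.max(Math.trunc(size / 10), MIN_PART_SIZE);
```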

@hajjimurad

It happens for me as well, and I can reproduce the error with small files. From what I can see, the error happens if the file size is less than the copy_part_size_bytes parameter.


Fares92 commented Nov 20, 2020

@hajjimurad The problem is that the Lambda function has limited memory (512 MB max), so if the for loop runs too many iterations the Lambda function runs out of memory, and that's why it returns this error. I solved it by limiting the number of loops to 10: I divide the size by 10 so we always have 10 parts.

@hajjimurad

@Fares92 in my case I was getting the error in a regular script, not in Lambda.

@jeffbski-rga

@Fares92 I figured out that my problem was using it with files smaller than 5 MB, which apparently isn't supported by multipart copy. So locally I use a normal S3 copy for files smaller than 5 MB and multipart copy for larger files.
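For reference, a minimal sketch of that dispatch, assuming aws-sdk v2 and the option names from the snippet earlier in this thread; `copyAnySize` and `FIVE_MB` are assumed names:

```js
const FIVE_MB = 5 * 1024 * 1024;

// Route small objects to a plain copy and large ones to multipart copy.
function copyAnySize(s3, s3Module, options) {
    if (options.object_size < FIVE_MB) {
        // Single-request copy; no multipart bookkeeping needed.
        return s3.copyObject({
            Bucket: options.destination_bucket,
            Key: options.copied_object_name,
            CopySource: `${options.source_bucket}/${options.object_key}`,
        }).promise();
    }
    return s3Module.copyObjectMultipart(options);
}
```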


jox commented Feb 10, 2022

The 5MB minimum file size limit is self-imposed by this module. It's a bug. I fixed it and made a pull request: #45

@namdiemefo

I am still getting this error.


jox commented Mar 5, 2022

@namdiemefo do you mean you applied my fix and you still get the error?

@namdiemefo

Yes, I did, and I'm still getting the error.

I followed the tutorials at https://gist.github.com/nick-ChenZe/71670af034744ee3fe9d801de632836d and
https://medium.com/@vishnuit18/uploading-files-to-aws-s3-bucket-using-nodejs-8c399eea2d19 and still got the same issue.

@namdiemefo

```js
// Assumes aws-sdk v2 with `s3`, `fs`, and `bucket_name` already set up, e.g.:
// const AWS = require('aws-sdk');
// const fs = require('fs');
// const s3 = new AWS.S3();

const maxUploadTries = 3;
const multipartMap = { Parts: [] };
let partNum = 0;
let numPartsLeft = 0;
let startTime;

function startUpload(req, res, next) {
    // Generate the object key once and reuse it everywhere. Regenerating
    // `Date.now()` per part gives each part a different Key, and a part
    // request whose Key doesn't match the upload's Key fails with
    // "NoSuchUpload: The specified upload does not exist".
    const key = `${Date.now().toString()}.mp4`;
    const params = {
        Key: key,
        Bucket: bucket_name,
    };

    const buffer = fs.readFileSync('/Users/namdiemefo/Desktop/naemo/tag/tagging-server/tagging-panel/app/services/MFMFCVSKWARAUNITED.mp4');

    const chunkSize = Math.pow(1024, 2) * 10; // 10 MB per part
    numPartsLeft = Math.ceil(buffer.length / chunkSize);
    console.log(buffer.length, numPartsLeft);
    startTime = new Date();

    s3.createMultipartUpload(params, function (err, multipart) {
        if (err) {
            console.log('Error!', err);
            return;
        }
        console.log('Got upload ID', multipart.UploadId);

        // Grab each chunkSize slice of the buffer and upload it as a part
        for (let rangeStart = 0; rangeStart < buffer.length; rangeStart += chunkSize) {
            partNum++;
            const end = Math.min(rangeStart + chunkSize, buffer.length);
            const partParams = {
                Body: buffer.slice(rangeStart, end),
                Bucket: bucket_name,
                Key: key,
                PartNumber: String(partNum),
                UploadId: multipart.UploadId,
            };

            // Send a single part
            console.log('Uploading part: #', partParams.PartNumber, ', Range start:', rangeStart);
            uploadPart(multipart, partParams);
        }
    });
}

function uploadPart(multipart, partParams, tryNumber) {
    const tryNum = tryNumber || 1;
    console.log(`try ${tryNum}, parts left: ${numPartsLeft}`);

    s3.uploadPart(partParams, function (err, data) {
        if (err) {
            console.log('multiErr, upload part error:', err);
            if (tryNum < maxUploadTries) {
                console.log('Retrying upload of part: #', partParams.PartNumber);
                uploadPart(multipart, partParams, tryNum + 1);
            } else {
                console.log('Failed uploading part: #', partParams.PartNumber);
            }
            return;
        }

        multipartMap.Parts[partParams.PartNumber - 1] = {
            ETag: data.ETag,
            PartNumber: Number(partParams.PartNumber),
        };
        console.log('Completed part', partParams.PartNumber);
        console.log('mData', data);

        // Complete only once every part has been uploaded; numPartsLeft is
        // shared module state so the decrement is visible across callbacks.
        if (--numPartsLeft > 0) return;

        const doneParams = {
            Bucket: partParams.Bucket,
            Key: partParams.Key,
            MultipartUpload: multipartMap,
            UploadId: multipart.UploadId,
        };

        console.log('Completing upload...');
        completeMultipartUpload(doneParams);
    });
}

function completeMultipartUpload(doneParams) {
    s3.completeMultipartUpload(doneParams, function (err, data) {
        if (err) {
            console.log('An error occurred while completing the multipart upload');
            console.log(err);
        } else {
            const delta = (new Date() - startTime) / 1000;
            console.log('Completed upload in', delta, 'seconds');
            console.log('Final upload data:', data);
        }
    });
}
```
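Worth noting: any part request issued after a multipart upload has been aborted or completed fails with exactly the NoSuchUpload error in this issue's title. A sketch of cleaning up after a permanently failed part, where `abortUpload` is an assumed helper built on aws-sdk v2's abortMultipartUpload:

```js
// Abort the multipart upload so it doesn't linger as an orphaned upload.
// After this call, further uploadPart requests with this UploadId will fail
// with "NoSuchUpload: The specified upload does not exist".
function abortUpload(partParams) {
    s3.abortMultipartUpload({
        Bucket: partParams.Bucket,
        Key: partParams.Key,
        UploadId: partParams.UploadId,
    }, function (err) {
        if (err) console.log('Abort failed:', err);
    });
}
```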


jox commented Mar 5, 2022

May I doubt that you applied my fix? It seems like you didn't even use the aws-s3-multipart-copy module.

Why not use a gist or something to post your code?

https://gist.github.com/

@namdiemefo

My bad, I used the aws-sdk. Here is the full code: https://gist.github.com/namdiemefo/4db23767b9be11857383c5e3d5378114


jox commented Mar 5, 2022

My advice: use the aws-s3-multipart-copy module.


namdiemefo commented Mar 5, 2022 via email

@chiragdev8062

Any solution on this? I'm getting the same error: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.

@pablogeek

Same error here, any solution?

@aguerrah9

I don't know if it helps anyone, but here is what I did. In the S3 console -> bucket -> Permissions -> CORS, I added this:

```json
{
    ...
    "ExposeHeaders": [
        "ETag"
    ],
    ...
}
```
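For reference, the same change can be applied programmatically; a minimal sketch assuming aws-sdk v2, with placeholder origins, methods, and headers:

```js
// Expose the ETag response header via the bucket's CORS configuration so
// browser clients can read it when assembling multipart uploads.
s3.putBucketCors({
    Bucket: bucket_name,
    CORSConfiguration: {
        CORSRules: [{
            AllowedHeaders: ['*'],                  // placeholder
            AllowedMethods: ['GET', 'PUT', 'POST'], // placeholder
            AllowedOrigins: ['*'],                  // placeholder: restrict in production
            ExposeHeaders: ['ETag'],
        }],
    },
}, function (err) {
    if (err) console.log('putBucketCors failed:', err);
});
```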

@ramonpaolo

I got the same error 😞
