
http: IncomingMessage emits 'end' after 'close' #29295

Closed
ronag opened this issue Aug 24, 2019 · 5 comments
Labels
http Issues or PRs related to the http subsystem.

Comments

@ronag
Member

ronag commented Aug 24, 2019

'end' can be emitted after 'close'. This can cause e.g. pipeline() to fail with
ERR_STREAM_PREMATURE_CLOSE.

See nxtedition#1 for a repro test.
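A minimal sketch (mine, not from the linked repro) of the ordering that pipeline()/finished() rely on: on a well-behaved stream, 'end' fires before 'close'. The bug reported here is that http.IncomingMessage could emit them in the opposite order, so the finish-detection logic concludes the stream was destroyed before ending.

```typescript
import { Readable } from "stream";

// On a well-behaved Readable, the observed order is 'end' then 'close'.
const events: string[] = [];
const r = Readable.from(["a", "b"]);

r.on("end", () => events.push("end"));
r.on("close", () => {
  events.push("close");
  console.log(events.join(",")); // expected: "end,close"
});
r.resume(); // drain the stream so 'end' can fire
```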

@ronag
Member Author

ronag commented Aug 24, 2019

Possibly related #27916 @lpinca

@ronag
Member Author

ronag commented Aug 24, 2019

If this is resolved, I believe 779a05d can be re-applied inside a 'close' listener.

@ronag
Member Author

ronag commented Aug 24, 2019

Slightly related to addaleax@9aedf72 @addaleax

@addaleax addaleax added the http Issues or PRs related to the http subsystem. label Aug 24, 2019
@ronag
Member Author

ronag commented Oct 6, 2019

There are PRs for this.

@ronag ronag closed this as completed Oct 6, 2019
@gustavomassa

gustavomassa commented Nov 8, 2019

@ronag Sorry to bother you, but I think I'm facing this issue.

Node Version: 10.15.3
Linux Mint 19.3 x64

I'm testing Node stream backpressure with pipeline() and a transform stream, streaming data from a MongoDB cursor incrementally instead of loading the entire cursor into RAM.
I read from the MongoDB cursor, use a transform stream to stringify the objects, and pipe the result to express.response (http).
I'm receiving the error "Premature close", but the data was sent correctly.
Should I just ignore the premature close error?


export class MongoCursorTransform extends Transform {
    private firstChunk: boolean;

    constructor() {
        super({ readableObjectMode: true, writableObjectMode: true });
        this.firstChunk = true;
    }

    _transform(chunk, encoding, callback) {
        // Open the JSON array on the first document; every document is
        // followed by a comma, which _flush() terminates below.
        if (this.firstChunk) {
            this.firstChunk = false;
            callback(null, '[' + JSON.stringify(chunk) + ',');
        } else {
            callback(null, JSON.stringify(chunk) + ',');
        }
    }

    _flush(callback) {
        // Pad the trailing comma with an empty object and close the array.
        callback(null, '{}]');
    }
}
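For reference, a quick standalone sanity check of the transform (the class is restated compactly here so the snippet runs on its own): streaming two documents should produce a parseable JSON array whose last element is the `{}` padding object emitted by `_flush`.

```typescript
import { Readable, Transform } from "stream";

// Compact restatement of MongoCursorTransform from the comment above,
// so this sketch is self-contained.
class MongoCursorTransform extends Transform {
  private firstChunk = true;
  constructor() {
    super({ readableObjectMode: true, writableObjectMode: true });
  }
  _transform(chunk: any, _enc: any, cb: (e?: Error | null, d?: any) => void) {
    const json = JSON.stringify(chunk) + ",";
    cb(null, this.firstChunk ? "[" + json : json);
    this.firstChunk = false;
  }
  _flush(cb: (e?: Error | null, d?: any) => void) {
    cb(null, "{}]"); // pad the trailing comma with {} and close the array
  }
}

let out = "";
Readable.from([{ a: 1 }, { b: 2 }])
  .pipe(new MongoCursorTransform())
  .on("data", (c: any) => (out += c))
  .on("end", () => console.log(out)); // [{"a":1},{"b":2},{}]
```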

// Note: onCancel and .timeout() are not standard Promise features; they
// indicate a cancellable promise library such as Bluebird.
const operation = new Promise((resolve, reject, onCancel) => {
    onCancel(() => {
        if (cursor) cursor.destroy();
        if (res) res.destroy();
    });

    // NODE STREAM PIPELINE TEST
    const res = reqParams['res'];
    res.once('error', function (err) {
        Webbuffet.logError(err);
        //reject(Webbuffet.error(Status.FAILED, err.message));
        return Webbuffet.error(Status.FAILED, err.message);
    });
    res.writeHead(200, {
        'Content-Type': 'application/json',
        'Transfer-Encoding': 'chunked'
    });

    const cursor = this.collection(this.processCollectionName).aggregate(
        aggregationPipeline,
        { maxTimeMS: timeout, allowDiskUse: false, cursor: { batchSize: 0 } }
    );
    pipeline(
        <any>cursor,
        new MongoCursorTransform(),
        res,
        (err) => {
            if (err) {
                reject(Webbuffet.error(Status.FAILED, err.message));
            } else {
                resolve(Webbuffet.success(true));
            }
        }
    );
});

return operation.timeout(Webbuffet.getRemainingTimeout(reqParams.start, reqParams.timeout));

As a workaround, I just ignore errors with code 'ERR_STREAM_PREMATURE_CLOSE':

const cursor = this.collection(this.processCollectionName).aggregate(
    aggregationPipeline,
    { maxTimeMS: timeout, allowDiskUse: false, cursor: { batchSize: 0 } }
);
pipeline(
    <any>cursor,
    new MongoCursorTransform(),
    res,
    (err) => {
        if (err && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
            reject(Webbuffet.error(Status.FAILED, err.message));
        } else {
            resolve(Webbuffet.success(true));
        }
    }
);

Another question, not related to this issue: is there a way to know when I'm receiving the last chunk inside the transform stream, before the end/finish events?
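As far as I know there is no built-in API that flags the last chunk, but one common workaround is to hold each chunk back until the next one arrives: when `_flush` runs, the held chunk is guaranteed to be the final one. A sketch with a hypothetical `LastChunkAware` transform (names are my own):

```typescript
import { Readable, Transform } from "stream";

// Holds one chunk back; by the time _flush runs, the held chunk is the last.
class LastChunkAware extends Transform {
  private pending: any = null;
  constructor() {
    super({ objectMode: true });
  }
  _transform(chunk: any, _enc: any, cb: () => void) {
    if (this.pending !== null) this.push(this.pending); // release previous
    this.pending = chunk; // hold the newest chunk back
    cb();
  }
  _flush(cb: () => void) {
    if (this.pending !== null) {
      this.push({ ...this.pending, last: true }); // tag the final chunk
    }
    cb();
  }
}

const seen: object[] = [];
Readable.from([{ id: 1 }, { id: 2 }, { id: 3 }])
  .pipe(new LastChunkAware())
  .on("data", (c: object) => seen.push(c))
  .on("end", () => console.log(JSON.stringify(seen)));
// [{"id":1},{"id":2},{"id":3,"last":true}]
```

The same one-chunk-lookbehind trick would let MongoCursorTransform close the JSON array right after the real last document, instead of appending the `{}` padding object.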
