Making several http requests using keep-alive mode generates this trace: Warning: Possible EventEmitter memory leak detected. #9268


Closed
arnaud-soulard opened this issue Oct 25, 2016 · 20 comments
Labels
http Issues or PRs related to the http subsystem.

Comments

@arnaud-soulard

  • Version: v6.9.1
  • Platform: CentOS release 6.8 (Final)
  • Subsystem:

When making 20 http requests using keep-alive mode with node v6.9.1, I get the following trace:
Warning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit

With node v4.4.7, this trace is not displayed.

It could be (from http://www.jongleberry.com/understanding-possible-eventemitter-leaks.html) that a listener has not been removed
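For context, Node prints this warning whenever more than ten listeners for the same event accumulate on a single emitter. A minimal sketch that reproduces the message (illustrative only, not from the issue):

```js
// Attaching an 11th listener for one event on a single EventEmitter
// triggers the same warning that the keep-alive requests produce.
const EventEmitter = require('events');

const emitter = new EventEmitter();
for (let i = 0; i < 11; i++) {
  // Nothing removes the previous listener, just as nothing removes the
  // 'timeout' listener from a reused keep-alive socket.
  emitter.on('timeout', () => {});
}
// (node:1234) Warning: Possible EventEmitter memory leak detected.
// 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
```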

@arnaud-soulard
Author

httpRequests.js.zip

@arnaud-soulard
Author

I attached httpRequests.js, which shows the warning when executed with node v6.9.1 on my platform.

@bnoordhuis
Member

Can you trim down your test case so it doesn't depend on a third-party module? You're welcome (and encouraged) to paste it inline.

@bnoordhuis added the labels http (Issues or PRs related to the http subsystem) and unconfirmed on Oct 25, 2016
@arnaud-soulard
Author

arnaud-soulard commented Oct 25, 2016

```js
// make several http requests to reproduce the warning with node v6:
// (node:19604) Warning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit

var asyncMapSeries = require('async/mapSeries');
var url = require('url');
var http = require('http');
var https = require('https');

var httpAgent = new http.Agent({ keepAlive: true });
var httpsAgent = new https.Agent({ keepAlive: true });

function request(requestUrl, requestOptions, data, callback) {
    var options = url.parse(requestUrl);

    options.agent = ((options.protocol === 'https:') ? httpsAgent : httpAgent);

    // Add supp. options
    if (requestOptions) {
        var suppOptions = ((typeof requestOptions === 'string') ? JSON.parse(requestOptions) : requestOptions);
        if (suppOptions.query) {
            options.path += ((options.path.indexOf('?') > 0) ? '&' : '?') + suppOptions.query;
        }
        Object.assign(options, suppOptions);
    }

    // Add content-length header
    if (!options.headers) {
        options.headers = {};
    }
    if (data) {
        options.headers['Content-Length'] = data.length;
    }

    // Send request
    var t0 = Date.now();
    var httpRequest = ((options.protocol === 'https:') ? https.request : http.request);
    var req = httpRequest(options, function(res) {
        var chunks = [];
        res.setEncoding('utf8');
        res.on('data', function (chunk) {
            chunks.push(chunk);
        });
        res.on('end', function() {
            callback(null, {method:options.method, url:requestUrl, status:res.statusCode, timeTaken:Date.now()-t0, response:chunks.join('')});
        });
    }).on('error', function(err) {
        callback(requestUrl + ': ' + err.message);
    });

    if (options.timeout) {
        req.setTimeout(options.timeout, function() {
            console.error('HTTP Request timeout ('+options.timeout+' ms): '+requestUrl);
            req.abort();
        });
    }

    if (data) {
        req.write(data);
    }
    req.end();
}

var requestUrl = 'http://example.com/foo.html';
var requestOptions = {
    method: 'GET',
    timeout: 10000,
    headers: {
        'User-Agent': 'Mozilla/5.0'
    }
};
var requestData = undefined;

var array = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18];
//var array = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19];

asyncMapSeries(array,
    function(arrayEntry, asyncCallback) {
        request(requestUrl, requestOptions, requestData,
            function(err, result) {
                asyncCallback(err, result);
            }
        );
    },
    function(err, results) {
        if (err) {
            console.log('request error:', err);
            process.exit(1);
        } else {
            console.log('all requests done');
            process.exit(0);
        }
    }
);
```

@gibfahn
Member

gibfahn commented Oct 25, 2016

@arnaud-soulard I edited your comment to format it as js, hope that's ok!

@arnaud-soulard
Author

Thank you very much!

@bnoordhuis
Member

@arnaud-soulard When I said 'third-party module', I was referring to async/mapSeries. Can you remove that from your snippet?

@Foosvald

Foosvald commented Nov 1, 2016

@bnoordhuis I've removed the third-party module from the example and added some memory-usage logging, because we think we're experiencing the same problem in our setup.

This is the code we ran:

```js
var url = require('url');
var http = require('http');

var httpAgent = new http.Agent({ keepAlive: true });

function request(callback) {
  var options = url.parse('http://example.com/foo.html');

  options.agent = httpAgent;
  options.method = 'GET';
  options.timeout = 10000;

  var req = http.request(options, function (res) {
    var chunks = [];
    res.setEncoding('utf8');
    res.on('data', function (chunk) {
      chunks.push(chunk);
    });
    res.on('end', function () {
      callback(null, res.statusCode);
    });
  }).on('error', function (err) {
    callback(err.message);
  });

  req.end();
}

function asyncMapSeries(times, fn) {
  let p = Promise.resolve();
  for (let i = 0; i < times; i++) {
    p = p.then(fn);
  }
  return p;
}

asyncMapSeries(100000,
    function () {
      return new Promise((resolve, reject) => {
        request(function (err, result) {
              if (err) {
                return reject(err)
              }
              return resolve(result)
            }
        );
      })
    }
).then((result) => {
  console.log('All done.')
}).catch((err) => {
  console.log("err", err)
});

setInterval(() => {
  gc();
  console.log(process.memoryUsage());
}, 5000);
```

And just as @arnaud-soulard reported, we get a memory-leak warning and an actual memory leak in 6.9.1 when using timeouts and keepAlive. This is the output from 6.9.1:

6.9.1 timeout and keepAlive

```
/v6.9.1/bin/node --expose-gc index.js
(node:815) Warning: Possible EventEmitter memory leak detected. 11 timeout listeners added. Use emitter.setMaxListeners() to increase limit
{ rss: 105869312, heapTotal: 89165824, heapUsed: 56382952 }
{ rss: 111697920, heapTotal: 93360128, heapUsed: 56923360 }
{ rss: 112099328, heapTotal: 94408704, heapUsed: 57153376 }
{ rss: 80588800, heapTotal: 64000000, heapUsed: 57040856 }
{ rss: 80707584, heapTotal: 65048576, heapUsed: 57403400 }
{ rss: 80904192, heapTotal: 65048576, heapUsed: 57549568 }
{ rss: 81403904, heapTotal: 64000000, heapUsed: 57743496 }
{ rss: 81383424, heapTotal: 65048576, heapUsed: 57941816 }
{ rss: 81604608, heapTotal: 66097152, heapUsed: 58125928 }
{ rss: 80445440, heapTotal: 67145728, heapUsed: 58306784 }
{ rss: 81043456, heapTotal: 66097152, heapUsed: 58459384 }
{ rss: 80822272, heapTotal: 67145728, heapUsed: 58625736 }
```

As you can see, we get the warning and then heapUsed keeps increasing.

When running with only keepAlive but no timeout we get no warning and no leak:

6.9.1 only keepAlive

```
/v6.9.1/bin/node --expose-gc index.js
{ rss: 105848832, heapTotal: 89165824, heapUsed: 56170352 }
{ rss: 110899200, heapTotal: 93360128, heapUsed: 56508960 }
{ rss: 79167488, heapTotal: 61902848, heapUsed: 56555168 }
{ rss: 78831616, heapTotal: 61902848, heapUsed: 56299032 }
{ rss: 79888384, heapTotal: 61902848, heapUsed: 56444200 }
{ rss: 79937536, heapTotal: 64000000, heapUsed: 56421264 }
{ rss: 80007168, heapTotal: 64000000, heapUsed: 56417296 }
{ rss: 80273408, heapTotal: 62951424, heapUsed: 56434856 }
{ rss: 79929344, heapTotal: 62951424, heapUsed: 56423256 }
{ rss: 80056320, heapTotal: 62951424, heapUsed: 56455912 }
{ rss: 80089088, heapTotal: 64000000, heapUsed: 56422672 }
```

We tried pinpointing which Node version introduced the leak and found that it's present in 6.8.0 but not in 6.7.0.

6.7.0 timeout and keepAlive

```
/v6.7.0/bin/node --expose-gc index.js
{ rss: 105660416, heapTotal: 89165824, heapUsed: 56142048 }
{ rss: 111349760, heapTotal: 93360128, heapUsed: 56983552 }
{ rss: 79482880, heapTotal: 61902848, heapUsed: 57037944 }
{ rss: 79208448, heapTotal: 62951424, heapUsed: 56775624 }
{ rss: 80134144, heapTotal: 64000000, heapUsed: 56923136 }
{ rss: 80490496, heapTotal: 62951424, heapUsed: 56885816 }
{ rss: 80240640, heapTotal: 62951424, heapUsed: 56902400 }
{ rss: 80277504, heapTotal: 62951424, heapUsed: 56919968 }
{ rss: 80281600, heapTotal: 62951424, heapUsed: 56914312 }
```

Could it have been introduced in c9b59e8?

@grahamj

grahamj commented Nov 2, 2016

I'm seeing this too. We moved from 4.x to 6.9 the other day and have been seeing crashes due to memory growth. I spent today chasing it down and found that simply turning off keepAlive when creating our Agent (which we pass to https://github.com/request/request with maxSockets: 10) stops the leak.

What's interesting is that I still get the Possible EventEmitter memory leak detected warning, just no leak.
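A sketch of that workaround, assuming the request library's `agent` option (the names here are illustrative, not from the comment above):

```js
var http = require('http');
var request = require('request');

// Sketch of the workaround: create the Agent without keepAlive so each
// request gets a fresh socket and no 'timeout' listeners pile up on it.
var agent = new http.Agent({ keepAlive: false, maxSockets: 10 });
var client = request.defaults({ agent: agent });

client.get('http://example.com/foo.html', function (err, res, body) {
  // handle response
});
```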

@evantorrie
Contributor

The problem is due to this commit: ee7af01

Specifically, these lines of code:

```js
if (req.timeout) {
  socket.once('timeout', () => req.emit('timeout'));
}
```

In the case of keep-alive, the socket is reused, so each time a new listener `() => req.emit('timeout')` is added, a fresh closure holding a reference to the current `req` object is created and retained.
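Schematically, the mechanism looks like this (a simplified sketch, not the actual `_http_client.js` source):

```js
// Simplified sketch of the leak (not the actual _http_client.js code).
// With keepAlive, the agent hands the *same* socket to request after
// request, and each request attaches one more closure:
function onSocket(req, socket) {
  if (req.timeout) {
    // socket.once() only detaches the listener after a 'timeout' event
    // actually fires; on a healthy connection it never fires, so the
    // closures (each keeping its req reachable) pile up.
    socket.once('timeout', () => req.emit('timeout'));
  }
}
```

Note that `once` is no protection here: it removes the listener only when the event fires, and on a healthy keep-alive connection it never does.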

This has caused major memory-leak problems for our application, and it appears to have been present since 6.8.1: https://github.com/nodejs/node/tree/v6.8.1

@karlbohlmark

@evantorrie Yup, that's what my pull request addresses. I'm a little surprised that a memory leak in the LTS isn't given more priority.

We had a bunch of services crash in production after upgrading to 6.9.1.

@oiime

oiime commented Nov 18, 2016

@evantorrie Same problem here. Right now I bypass it by not using setTimeout on the request and implementing my own timer that resolves without waiting for the request. I definitely think it should be fixed in LTS, as this kind of bug tends to show up only in production and can be quite a headache to debug.
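For reference, that workaround can look roughly like this (a sketch with illustrative names, based on the description above):

```js
var http = require('http');

// Sketch of the workaround: enforce the timeout with an external timer
// instead of req.setTimeout, so no 'timeout' listener is ever added to
// the shared keep-alive socket.
function requestWithTimeout(options, timeoutMs, callback) {
  var finished = false;

  function finish(err, result) {
    if (finished) return;
    finished = true;
    clearTimeout(timer);
    callback(err, result);
  }

  var req = http.request(options, function (res) {
    var chunks = [];
    res.setEncoding('utf8');
    res.on('data', function (chunk) { chunks.push(chunk); });
    res.on('end', function () { finish(null, chunks.join('')); });
  }).on('error', function (err) {
    finish(err);
  });

  var timer = setTimeout(function () {
    req.abort(); // also triggers 'error', but finish() guards re-entry
    finish(new Error('request timed out after ' + timeoutMs + ' ms'));
  }, timeoutMs);

  req.end();
}
```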

@targos
Member

targos commented Jan 8, 2017

Closing as the fix in #9440 landed.
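For anyone reading along, the general shape of such a fix is to keep a named reference to the listener and detach it when the request is done with the socket, so a reused keep-alive socket starts each request clean. A sketch of the pattern (not the actual patch from #9440):

```js
// Sketch of the pattern only (not the actual patch): use a named
// listener and remove it once the request has finished with the socket.
function listenOnTimeout(req, socket) {
  function emitTimeout() {
    req.emit('timeout');
  }
  socket.on('timeout', emitTimeout);

  req.once('response', function (res) {
    res.once('end', function () {
      // The socket may go back into the agent's free pool; make sure
      // this request's closure does not linger on it.
      socket.removeListener('timeout', emitTimeout);
    });
  });
}
```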

@targos targos closed this as completed Jan 8, 2017
@andrasq

andrasq commented Jan 9, 2017

@targos @italoacasas I don't know the process: does landing #9440 also get it into LTS? I don't see the ticket on the lts-watch-v6.x tagged list (because it's closed, I presume), but the change is not in 6.x yet.

@arnaud-soulard v6.7.0 did not have this bug, so you could try that. We rolled back to 6.2.2 but are considering 6.7.0 as well.

@targos
Member

targos commented Jan 9, 2017

The PR is labeled lts-watch-v6.x so the fix will eventually be cherry-picked to 6.x.

@andrasq

andrasq commented Jan 9, 2017

Thanks @targos!

@pkoretic

pkoretic commented Feb 7, 2017

Still not cherry-picked in 6.9.5, for those interested. Maybe somebody could speed this up? This is a bad bug for an LTS release, really. @targos

@targos
Member

targos commented Feb 7, 2017

It will be in the next release. See #10974

@Removed-5an

Is it possible that this is fixed in 6.10 but not in 7.5? If it really is fixed in 6.10, I can run my application on Node 6.10.

@karlbohlmark

@5an1ty This was merged in 6.10.
