
Store Compiled Packet Parsers in a global cache - Resolves #722 #723



Merged
1 commit merged into sidorares:master on Feb 5, 2018

Conversation

@aikar (Contributor) commented Feb 2, 2018

See #722 for details

When a connection is closed, any compiled parsers created for that connection
are lost, as the connection itself is removed.

However, Node/V8 internals appear to hold onto every function that was created.

This can result in a memory leak when connections are cycled and new
parsers are compiled over and over again.

This change creates a global cache store, since nothing about the generated
code is connection specific.

This also completes a TODO to use an LRU cache, though I set a high default limit
so as to avoid re-triggering the leak if the LRU is too small.

An API is provided for anyone who needs to control the maximum size, as
well as an API to clear the parser cache. A rough sketch of the idea follows below.
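
To make the idea concrete, here is a minimal sketch of such a global store, not the actual mysql2 implementation: a module-level map used as an LRU and shared by all connections, plus the two control functions mentioned above. The names `getParser`, `setMaxCache`, `clearCache` and the default size are illustrative assumptions, not the library's real API.

```js
'use strict';

// Module-level store: every connection shares it, so closing a connection
// no longer orphans compiled functions that V8 keeps alive anyway.
const parserCache = new Map(); // Map preserves insertion order -> cheap LRU
let maxCacheSize = 15000;      // illustrative "high value"; the PR only says the default is large

function getParser(key, compile) {
  if (parserCache.has(key)) {
    const parser = parserCache.get(key);
    // Refresh recency: move the entry to the end of the insertion order.
    parserCache.delete(key);
    parserCache.set(key, parser);
    return parser;
  }
  const parser = compile();
  parserCache.set(key, parser);
  if (parserCache.size > maxCacheSize) {
    // Evict the least recently used entry (the first key in insertion order).
    parserCache.delete(parserCache.keys().next().value);
  }
  return parser;
}

// Control APIs: cap the cache size, or drop everything that has been compiled.
function setMaxCache(size) {
  maxCacheSize = size;
}

function clearCache() {
  parserCache.clear();
}

module.exports = { getParser, setMaxCache, clearCache };
```

Because the store lives at module scope, connection churn no longer multiplies compiled functions; at worst the LRU evicts the least recently used parser once the cap is exceeded.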



@sidorares (Owner)

Looking good! I'll try to review carefully later (I'm away for the next 3 days), as it's quite a core part.

Can you think of some tests that could cover the cache?

Also, it would be good to know what the cause of the original leak was.

@aikar (Contributor, Author) commented Feb 2, 2018

It would be too difficult to test the original case: the speed of the leak varied, and it could take hours to become noticeable. Nor can you easily identify it in code, since the leak happens at the native level.

As for testing the change in general, this was mainly moving code around / changing where it's stored.
The caching behavior is essentially the same as before: is it in the cache? If yes, use it; otherwise, create it and cache it. Except it's now one global bucket instead of two buckets per connection.
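
For illustration only, building on the sketch above (none of these names or paths are mysql2's actual API): the lookup keys off the result-set shape and parser options rather than the connection, so two connections producing the same columns share one compiled function.

```js
const { getParser } = require('./parser_cache'); // the sketch module above (illustrative path)

// The key is built only from connection-independent inputs: column metadata
// and parser options. Any connection that yields the same shape hits the
// same cache entry.
function keyFor(fields, options) {
  const shape = fields.map(f => `${f.name}:${f.columnType}:${f.flags}`).join('/');
  return `${shape}#nest=${options.nestTables ? 1 : 0}#array=${options.rowsAsArray ? 1 : 0}`;
}

function getRowParser(fields, options) {
  return getParser(keyFor(fields, options), () => {
    // Stand-in for the real "generate and compile a row parser" step.
    return (packet) => packet;
  });
}
```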

I don't think we really need to bother implementing tests to check LRU behavior, heh; the LRU library already tests that.

Nor do I have the time to even try right now; spending days on this issue was already more than my boss wanted as it was... He's going to be happy to hear it's resolved.

Just to add an update: the (really abusive, connection-spamming) process has been running for an hour now with zero signs of leaking.

@sidorares merged commit b41de75 into sidorares:master on Feb 5, 2018