lots of goroutines hanging in handleOutgoingMessage #399

Closed · whyrusleeping opened this issue Dec 1, 2014 · 10 comments
Labels
kind/bug A bug in existing code (including security flaws)

Comments

@whyrusleeping
Member

I have a node (A) that has added a file and another (B) that is catting it. Node A's RAM usage spikes massively until it runs out of memory. Sending a panic before the out-of-memory happens shows a massive number of goroutines waiting to send their messages in handleOutgoingMessage. Not entirely sure why this is happening yet, but it's probably a bug.
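For context, here is a minimal sketch (not the actual go-ipfs code; the sender type and channel are invented) of the pattern the goroutine dump suggests: if every outgoing message is handed to its own goroutine that blocks until a single writer drains it, a slow or stalled consumer lets blocked goroutines and their payloads pile up without bound, which would explain the RAM spike.

```go
package main

import (
	"fmt"
	"runtime"
	"time"
)

type message struct {
	payload []byte
}

type sender struct {
	outgoing chan message // drained by a single (possibly slow) writer
}

// handleOutgoingMessage parks until the writer accepts the message. If the
// writer falls behind, every call leaves a goroutine blocked here, keeping
// its payload alive -- the "lots of goroutines hanging" symptom above.
func (s *sender) handleOutgoingMessage(m message) {
	s.outgoing <- m
}

func main() {
	s := &sender{outgoing: make(chan message)} // unbuffered, and nobody reads

	// Fire-and-forget: nothing bounds how many sends are in flight at once.
	for i := 0; i < 10000; i++ {
		go s.handleOutgoingMessage(message{payload: make([]byte, 1024)})
	}

	time.Sleep(100 * time.Millisecond) // let the goroutines park
	fmt.Println("goroutines blocked in handleOutgoingMessage:", runtime.NumGoroutine()-1)
}
```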

@whyrusleeping whyrusleeping added the kind/bug A bug in existing code (including security flaws) label Dec 1, 2014
@btc
Contributor

btc commented Dec 2, 2014

[pasted screenshot: pasted_image_12_1_14__4_33_pm]

@whyrusleeping
Member Author

I like all of those points. The errChan is a likely suspect, in my opinion.

@whyrusleeping
Member Author

Although, upon review, I don't see that happening anywhere.

@btc
Contributor

btc commented Dec 2, 2014

i don't see that happening anywhere

Don't see what? 😕

@whyrusleeping
Member Author

The errChan being sent on; I'm not seeing that case happen in my tests.

@btc
Contributor

btc commented Dec 2, 2014

Ah I see

https://gist.github.com/maybebtc/c6034877dadbee2b97de

3255 instances of handleOutgoingMessage and 3000 instances of sync.(*Mutex).Lock(...). The lock is held to Get from LevelDB.

This doesn't begin to explain why the muxer's outgoing pipe is blocked up so severely.
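For what it's worth, a rough sketch of why the stacks could look like that (assumed shape only, not the real blockstore code): if a single mutex is held for the full duration of a LevelDB Get, every other caller parks in sync.(*Mutex).Lock while one goroutine waits on disk, so thousands of blocked Lock frames are what you'd expect under load.

```go
package main

import (
	"sync"
	"time"
)

type blockstore struct {
	mu sync.Mutex
}

// get holds the lock across the whole (slow) read, so concurrent callers
// serialize behind it instead of overlapping their I/O.
func (bs *blockstore) get(key string) []byte {
	bs.mu.Lock()
	defer bs.mu.Unlock()
	time.Sleep(10 * time.Millisecond) // stand-in for a LevelDB read
	return []byte("block for " + key)
}

func main() {
	bs := &blockstore{}
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			bs.get("k") // at any instant, most of these sit in Mutex.Lock
		}()
	}
	wg.Wait()
}
```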

@whyrusleeping
Member Author

I think one thing we might be able to do to help is not make handleOutgoingMessage its own goroutine. Or at least limit the number allowed in flight at a time.
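As a rough illustration of the "limit the number allowed in flight" idea (not the actual change; maxInFlight and limitedSender are invented names), a buffered channel can serve as a counting semaphore that blocks producers once the limit is reached:

```go
package main

import "sync"

const maxInFlight = 16

type limitedSender struct {
	sem chan struct{} // counting semaphore
	wg  sync.WaitGroup
}

func newLimitedSender() *limitedSender {
	return &limitedSender{sem: make(chan struct{}, maxInFlight)}
}

// handleOutgoingMessage blocks the *producer* once maxInFlight sends are
// already running, instead of letting goroutines accumulate without bound.
func (ls *limitedSender) handleOutgoingMessage(send func()) {
	ls.sem <- struct{}{} // acquire; blocks when the limit is reached
	ls.wg.Add(1)
	go func() {
		defer func() {
			<-ls.sem // release
			ls.wg.Done()
		}()
		send()
	}()
}

func main() {
	ls := newLimitedSender()
	for i := 0; i < 1000; i++ {
		ls.handleOutgoingMessage(func() { /* write message to the wire */ })
	}
	ls.wg.Wait()
}
```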

@btc
Contributor

btc commented Dec 2, 2014

one thing we might be able to do to help is not make handleOutgoingMessage its own goroutine

agreed. 👍 to rate-limiting producers. 904e9d5

@jbenet
Member

jbenet commented Dec 5, 2014

  1. Yeah, should continue. We should probably get rid of the max message size, though, or make it a very large limit (100MB?) just to catch runoff problems. What's the realistic largest message we'd send through here? A large block?

My guess would be 5).
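As a rough illustration of the "very large limit just to catch runoff problems" option (the 100MB figure comes from the comment above; the function and error names are invented):

```go
package main

import (
	"errors"
	"fmt"
)

const maxMessageSize = 100 << 20 // 100 MB: a generous ceiling, only to catch runaways

var errMessageTooLarge = errors.New("message exceeds maximum size")

// checkMessageSize rejects messages above the cap instead of dropping the
// check entirely.
func checkMessageSize(n int) error {
	if n > maxMessageSize {
		return errMessageTooLarge
	}
	return nil
}

func main() {
	fmt.Println(checkMessageSize(256 << 10)) // a large block: <nil>
	fmt.Println(checkMessageSize(1 << 30))   // runaway: error
}
```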

@whyrusleeping
Member Author

Fixed in 1026244
