io.js 3.0.0 memory leak #2308
See Unitech/pm2#1500 for the bug report filed with PM2 before I realized it was being caused by io.js.
With io.js 2.4.0 and 2.5.0 there were no memory leaks for that module. Perhaps it could have something to do with the new Buffer implementation?
I've also got an app I'm test-running on 3.0.0, and after 4 hours it uses double the memory it used on 2.5.0. And it still seems to be climbing on 3.0.0.
@jbergstroem Sorry, I'm just boarding a flight to SFO now.
Is there a test case that can be used to reproduce the memory leak?
Not that I can think of, unless you already have a system running PM2. Up until the server machine crashes there are no error messages or logs to track it down with. It is most certainly an issue only on io.js 3.0.0, though. I reverted back to 2.5.0 and memory usage has been stable for almost 10 hours now.
fwiw, you can't assume that high memory usage and early memory growth are a memory leak; the way V8 handles memory and GC changes over time, so you may be dealing with a "normal" memory profile. It'd be good to have data for a longer run.
Based on the original report in Unitech/pm2#1500, it seems that the growth rate is 30MB/min and the app runs out of memory in a couple of hours. This does smell like a memory leak, but it would be very hard to debug, or to know whether the leak is actually in io.js, without a test case that shows the problem. At the least, can you grab and compare heap snapshots and see what kinds of objects are growing?
Well, I changed nothing other than upgrading io.js, and tried out old, current and pre-release versions of PM2. I even tried downgrading my dependencies to the versions I had before my last upgrade. When viewed with htop, I saw 500+ MB (and it kept growing until the server crashed) for each process with 3.0.0, versus a stable 40-60MB per process with 2.5.0. I wouldn't call a 1000% memory increase with up to 2MB per second of continuous growth at idle a "normal" curve. Perhaps it is completely normal if the bug is in PM2 instead, and previous versions' GC still managed to collect it somehow while 3.0.0 doesn't. My actual application code and processes seem unaffected; only the PM2 processes have such high memory usage. The PM2 processes also handle all the cluster and load balancing stuff, which could be more affected by the new Buffer implementation than my code is. I honestly don't know, but I'm willing to give you any data I can possibly provide.
To people experiencing the same issue: the best way to drive this issue forward is more metrics over time and a reduced proof of concept we can all test.
I'm sorry, but I honestly have no idea how to take a heap snapshot of the individual PM2 daemons.
@novacrazy try importing heapdump and sending the process a unix signal.
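For anyone following along, this is roughly how node-heapdump gets used; treat it as a sketch, since the exact snapshot filename and signal behaviour depend on the heapdump version installed:

    // Load heapdump inside the process you want to inspect. On POSIX systems
    // the module registers a SIGUSR2 handler that writes a
    // heapdump-<timestamp>.heapsnapshot file to the current working directory.
    var heapdump = require('heapdump');

    // A snapshot can also be triggered programmatically:
    heapdump.writeSnapshot('/tmp/before.heapsnapshot', function (err, filename) {
      if (err) throw err;
      console.log('heap snapshot written to', filename);
    });

From a second terminal, kill -USR2 <pid> triggers a dump, and the resulting .heapsnapshot files can be loaded and compared in Chrome DevTools.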
Alright, I injected
Before you upload the heapdump, be aware that it contains the full JavaScript heap and may include private information (such as user data, keys, etc.) and the source code of your JavaScript functions.
@ofrobots Very good advice, thank you. Also, @bnoordhuis, heapdump fails to compile with io.js 3.0.0: http://pastebin.com/RVnY5Mrs. Not sure if that is my fault or part of the last V8 upgrade.
I just closed out a similar issue 30 seconds ago. :-) I don't think that's an issue with node-heapdump but with V8. It's a C++11 project and you'll need at least gcc 4.8 or clang 3.4. You could get away with older versions until now, but no more.
Ah, my gcc was at 5.1 but g++ was still at 4.6 for some reason. Fixed now, thanks. Continuing on...
It's going much slower now that I'm actually looking at it. Go figure. However, I'm not really seeing any net gain in JavaScript memory usage. Although the io.js 3.0.0 process is already about four times as large as the 2.5.0 version, the observable V8 heap sizes are the same.
I let the process grow to about 450MB before I killed it, and took another heap snapshot before I did. The heap was actually smaller than it was when the process was around 150MB. In all cases (I took about ten snapshots), the observable V8 heaps of the PM2 daemons never exceeded 11MB, so the memory must be allocated somewhere else.
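That observation (RSS several times larger than the visible V8 heap) is exactly what a plain process.memoryUsage() log makes visible, and it points at allocations outside the JS heap (Buffers, native/addon memory, allocator fragmentation) that heap snapshots cannot show. A minimal sketch one could drop into the suspect process:

    // Log resident set size vs. V8 heap every 10 seconds. If rss keeps
    // climbing while heapTotal/heapUsed stay flat, the growth is happening
    // outside the JavaScript heap.
    setInterval(function () {
      var m = process.memoryUsage();
      console.log('rss=%dMB heapTotal=%dMB heapUsed=%dMB',
        Math.round(m.rss / 1048576),
        Math.round(m.heapTotal / 1048576),
        Math.round(m.heapUsed / 1048576));
    }, 10000);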
@rvagg, @bnoordhuis do you have some sort of writeup/discussion available as to what you did to move away from smalloc? thanks!
@JerrySievert Use typed arrays, basically. If you look at the changes to src/node.cc in 70d1f32 (warning: big commit), you'll see that we swapped out smalloc in favor of typed arrays.
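For readers wondering what that change looks like from JavaScript land: in 3.x a Buffer is an ES6 typed array view over an ArrayBuffer rather than externally allocated memory. A quick check (this reflects my understanding of the 3.0.0 behaviour, so verify on your own build):

    // On io.js >= 3.0.0 both lines should print true: Buffer is now a
    // Uint8Array subclass backed by an ArrayBuffer. On 2.x both print false,
    // since Buffers used externally allocated memory and buf.buffer was undefined.
    var buf = new Buffer(16);
    console.log(buf instanceof Uint8Array);
    console.log(buf.buffer instanceof ArrayBuffer);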
thanks @bnoordhuis, that helps a lot.
Looks like I've just encountered this. Test case and details are in the issue over at mscdex/busboy#92. I'm thinking it must be stream/GC related, and I can confirm that on io.js 2.5.0 the process stays at around 60 MB, as opposed to 3.0.0, which bloats up to the uploaded file size, seemingly making V8 unable to garbage collect any of it.
Looks alright to me, no leaks. If there is a memory leak of some kind and it's not some GC quirk, it's probably outside the JS heap. @silverwind Quick check: what happens when you pass …? If there is a way to easily reproduce it, that would be appreciated.
@bnoordhuis should be pretty easy to reproduce with this:

    wget https://gist.githubusercontent.com/silverwind/10f076397348a64a72a5/raw/4d9b3d0a0f43a8ca85620822ac755fd40f25fae6/busboy-memory-test.js
    npm i busboy
    iojs busboy-memory-test

Upload a 1GB file in a second terminal:

    dd if=/dev/zero of=bigfile bs=1 count=0 seek=1073741824
    curl -F filedata=@bigfile http://localhost:6666/upload

The script will write …
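The gist above is the authoritative test case; it isn't reproduced here, but an equivalent repro looks roughly like the sketch below (based on the busboy ~0.2 constructor API of that era and the port from the curl command, not on the actual gist contents):

    // Sketch: accept a multipart upload with busboy and watch process memory.
    var http = require('http');
    var Busboy = require('busboy');

    http.createServer(function (req, res) {
      if (req.method === 'POST' && req.url === '/upload') {
        var busboy = new Busboy({ headers: req.headers });
        busboy.on('file', function (fieldname, file) {
          // Draining the stream keeps memory flat. A later comment in this
          // thread suggests the original test case left the stream
          // unconsumed, which makes the process buffer the whole upload.
          file.resume();
        });
        busboy.on('finish', function () {
          console.log('upload done, rss=%dMB',
            Math.round(process.memoryUsage().rss / 1048576));
          res.end('ok\n');
        });
        req.pipe(busboy);
      } else {
        res.end();
      }
    }).listen(6666);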
In a few places dynamic memory was passed to the Buffer::New() overload that makes a copy of the input, not the one that takes ownership. This commit is a band-aid to fix the memory leaks. Longer term, we should look into using C++11 move semantics more effectively.
Fixes: #2308
PR-URL: #2352
Reviewed-By: Fedor Indutny <[email protected]>
Reviewed-By: Trevor Norris <[email protected]>

The circular dependency problem that put them there in the first place is no longer an issue. Move them out of the public node_buffer.h header and into the private node_internals.h header.
Fixes: #2308
PR-URL: #2352
Reviewed-By: Fedor Indutny <[email protected]>
Reviewed-By: Trevor Norris <[email protected]>

Rename the three argument overload of Buffer::New() to Buffer::Copy() and update the code base accordingly. The reason for renaming is to make it impossible to miss a call site. This coincidentally plugs a small memory leak in crypto.getAuthTag().
Fixes: #2308
PR-URL: #2352
Reviewed-By: Fedor Indutny <[email protected]>
Reviewed-By: Trevor Norris <[email protected]>
While the major leak will be fixed in the upcoming 3.1.0, I think there are still a few minor leaks around. Please post test cases if you have any!
Current master (actually https://iojs.org/download/nightly/v3.0.1-nightly201508173645dc62ed/) looks pretty good. The https test is showing a possible very slow leak; it's creeping up really slowly, but I'm not confident calling it a leak. Will leave it going and see what happens, but the main leak(s) appear to be resolved.
Just in case: could you confirm that isn't a mix-up? The 3.0.1 memory usage on your graph looks constant with https-ping (the bottom one) and very slightly growing over time with http-ping (the upper one) to me.
@ChALkeR sorry, you're absolutely correct: it's http-ping that appears to be leaking while https-ping is stable, which is very strange! http-ping stabilises at around 75k and then gradually, but steadily, adds 10k more by the end of the graph. https-ping fluctuates between 71k and 72k for the entire length.
http-ping has reached a steady state in the last 6 hours; it appears to have levelled off. It's probably just memory arena fragmentation, or the new V8 engine performing GC less aggressively in this low-memory test. @rvagg - can you provide a link to your RSS ping test source code?
On a side note, my test case with FormData was just invalidated; the issue seems to have been an unconsumed stream (https://gist.github.com/silverwind/54b3829142f93d1127bc#gistcomment-1552967).
ditto, and see https://github.com/rvagg/node-memtest/tree/master/test directly for the source of the tests; rss is collected using …
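For anyone wanting to reproduce those graphs: the simplest in-process way to sample RSS is process.memoryUsage().rss; whether node-memtest collects it this way or shells out to an external tool is an assumption on my part, not something stated in this thread. A sketch:

    // Sketch: append an RSS sample (in KB) to a CSV file once a minute so it
    // can be graphed later, similar in spirit to the graphs discussed above.
    var fs = require('fs');

    setInterval(function () {
      var rssKB = Math.round(process.memoryUsage().rss / 1024);
      fs.appendFile('rss.csv', Date.now() + ',' + rssKB + '\n', function (err) {
        if (err) console.error('failed to record sample:', err);
      });
    }, 60000);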
I'll close this one per the graph above. We don't have any more indicators of major leaks like the one in 3.0.0.