EventEmitter memory leak detected. 11 exit listeners added #594
Comments
One possible solution is using a child process:
I still wonder if this is the only current solution. |
I've also hit this in Node 7.10, but I'm getting a 'finish' listener warning. My service renders PDFs one at a time; it runs launch and creates a new page for each PDF. This seems to have occurred only after the service had been running for a while.
|
Lighthouse and chrome-launcher handled this in GoogleChrome/lighthouse#2959. |
@selfrefactor your script should be working fine now. |
@aslushnikov Thank you for the fast fix, but it still doesn't work for me. I tested it on local machine and Cloud9 instance and I make sure that the code reads your changes, but the same error appears again. |
@aslushnikov Thank you for the fast fix, but it still doesn't work for me. I tested it on my local machine and on a Cloud9 instance, and I made sure the code picks up your changes, but the same error appears again. |
@selfrefactor ah, indeed, you launch chrome processes in parallel. Every chrome instance adds a listener to the process's "exit" event to clean up properly. 12 chrome instances add 12 listeners, which yields the warning. You can fix this by adding `process.setMaxListeners(Infinity)`:

```js
const puppeteer = require('puppeteer')
const R = require('rambda')

process.setMaxListeners(Infinity); // <== Important line

const resolution = {
  x : 1920,
  y : 1080,
}
// ...
```

Note: you don't need to launch a browser just to create a page. Instead, you can open multiple pages in the same browser:

```js
const puppeteer = require('puppeteer')
const R = require('rambda')

const resolution = {
  x : 1920,
  y : 1080,
}

const args = [
  '--disable-gpu',
  `--window-size=${ resolution.x },${ resolution.y }`,
  '--no-sandbox',
]

let browser;

const work = async () => {
  const page = await browser.newPage()
  const url = 'https://ilearnsmarter.com/learning-meme'
  await page.setViewport({
    width : resolution.x,
    height : resolution.y,
  })
  await page.goto(url, { waitUntil : 'networkidle' })
  await page.close();
}

const fn = async () => {
  browser = await puppeteer.launch({
    headless : true,
    handleSIGINT : false,
    args : args,
  })
  const promised = R.range(0, 12).map(() => work())
  await Promise.all(promised)
  browser.close()
  console.log('DONE')
}

fn()
```
|
```js
const puppeteer = require('puppeteer');

// bad url with no protocol fails fast
const jobs = [...new Array(12)].map(() => 'google.com');

const work = async (url) => {
  try {
    const browser = await puppeteer.launch({ dumpio: true });
    const page = await browser.newPage();
    await page.goto(url);
    browser.close();
  } catch (err) {
    console.log(err.message);
  } finally {
    if (jobs.length) {
      work(jobs.pop());
    } else {
      process.exit();
    }
  }
};

work(jobs.pop());
```

The listeners don't seem to get cleaned up, and if I don't have the `process.exit` it never exits the process. You can also see there are a bunch of event emitter errors like unpipe, drain, and error; these are only shown when using `dumpio`. Setting max event listeners is not an option for me, as this runs as part of a long-running process, so it would leak memory. Output:
|
@domarmstrong the reason your script doesn't work with tip-of-tree puppeteer is that you don't close browsers if an exception is thrown. That is a memory leak. The following script works just fine:

```js
const puppeteer = require('puppeteer');

// bad url with no protocol fails fast
const jobs = [...new Array(12)].map(() => 'google.com');

const work = async (url) => {
  const browser = await puppeteer.launch({dumpio: true});
  try {
    const page = await browser.newPage();
    await page.goto(url);
  } catch (err) {
    console.log(err.message);
  } finally {
    browser.close();
    if (jobs.length) {
      work(jobs.pop());
    } else {
      process.exit();
    }
  }
};

work(jobs.pop());
```

if you pass |
OK, that's a very good point lol. But still you get the exit and SIGINT errors.

```js
const puppeteer = require('puppeteer');

// bad url with no protocol fails fast
const jobs = [...new Array(12)].map(() => 'https://google.com');

const work = async (url) => {
  let browser;
  try {
    browser = await puppeteer.launch({ dumpio: true });
    const page = await browser.newPage();
    await page.goto(url);
  } catch (err) {
    console.log(err.message);
  } finally {
    if (browser) {
      browser.close();
    }
    if (jobs.length) {
      work(jobs.pop());
    } else {
      process.exit();
    }
  }
};

work(jobs.pop());
```
|
@domarmstrong What's your puppeteer version? Can you run the |
Yep, you're right. That's fixed then. 👍 Thanks. |
@aslushnikov Actually I don't need it to run in parallel and the code really works in sequential context. Thank you for the efforts. |
Had the same issue with |
Got a queue of 12 items to process at once. I was creating 12 browsers with a page each and getting the above error. Then I tried one browser for all tabs, and the time to process all 12 items doubled, plus closing pages sometimes broke the others. Then I tried 4 browsers with 3 pages each and still got lots of issues with previously closed pages breaking the others. Not sure what happened, but for me I'm just going to increase the limit to remove the error. |
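A middle ground between the layouts tried above is to cap parallelism rather than raise the listener limit: at most `limit` workers run at once, so at most `limit` Chrome instances (and `exit` listeners) exist at any moment. This is a generic sketch, not part of puppeteer's API; `runWithLimit` is a hypothetical helper, and the multiply-by-ten worker stands in for per-URL browser work.

```javascript
// Run `worker` over all jobs with at most `limit` in flight at once.
// Each "lane" pulls the next job index synchronously before awaiting,
// so no job is processed twice.
async function runWithLimit(jobs, limit, worker) {
  const results = [];
  let next = 0;
  async function lane() {
    while (next < jobs.length) {
      const i = next++;
      results[i] = await worker(jobs[i]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, jobs.length) }, lane)
  );
  return results;
}

// Dummy worker standing in for launch / screenshot / close per URL.
runWithLimit([1, 2, 3, 4], 2, async (n) => n * 10).then((r) =>
  console.log(r) // [ 10, 20, 30, 40 ]
);
```

With a cap of, say, 4, the default listener limit of 10 is never reached and each job still gets a fresh, fully isolated browser.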
(node:25120) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added. Use emitter.setMaxListeners() to increase limit |
right before on your |
For me, I would like to ask: what's the best way to capture screenshots of more than 20 different URLs? My DB table contains more than 50 URLs. After I added the line below, it just kills my server and I have to do a manual reboot for my site to work again. I will appreciate any help rendered. |
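One low-tech way to keep a 50-URL table from overwhelming a small server is to process it in fixed-size batches instead of all at once. The `chunk` helper below is a sketch; the batch size of 3 in the example is an arbitrary assumption to tune against available memory.

```javascript
// Split a long URL list into batches so only a few Chrome instances
// would ever run concurrently; process one batch, await it, continue.
function chunk(list, size) {
  const out = [];
  for (let i = 0; i < list.length; i += size) {
    out.push(list.slice(i, i + size));
  }
  return out;
}

console.log(chunk([1, 2, 3, 4, 5, 6, 7], 3));
// → [ [ 1, 2, 3 ], [ 4, 5, 6 ], [ 7 ] ]
```

Each batch can then be run with `Promise.all` and the browsers closed before the next batch starts, so memory use stays bounded regardless of table size.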
Be aware that there are well known event listener leaks in several base Node modules. For example, net socket has this issue. |
This is the full error message:

```
MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added. Use emitter.setMaxListeners() to increase limit
```

I run Node v8.1.2 with tip-of-tree Puppeteer. The code is written with `Promise.all`, but even if I run `work()` sequentially the same error appears.