Too many open files #280
It may not be the issue, but what are you using to insert the data?
I already ran into that before on 0.4 and I'm pretty sure I solved it. At the moment I'm at ~1500 connections.

Influx server:

App server, sending the inserts:

Both values are pretty stable, as far as I can tell from watching them.
This is probably related to #277, which we'll be releasing a fix for today in rc.2. You'll have to blow away your data directories and start fresh. If the problem is still there please re-open this issue.
Even with RC2, the problem persists:
Are you inserting a bunch of events all in the same time period, or are the timestamps spread out over multiple days? Also, what's your open file limit?
I'm inserting about 160 time series, each with more than 1 million events, in chunks of 1,000 events per insert. The data goes back to Nov/Dec 2012.
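For reference, the chunked-insert pattern described above can be sketched in shell. This is a minimal sketch, not the reporter's actual script: the endpoint URL, database name `mydb`, and the modern line-protocol `/write` API are assumptions (the 0.5-era HTTP API differed).

```shell
# send_batch posts one batch of line-protocol points to the HTTP API.
# URL and database name "mydb" are assumptions for a 1.x-style server.
send_batch() {
  curl -s -XPOST 'http://localhost:8086/write?db=mydb' --data-binary "$1"
}

# write_chunked reads line-protocol points from stdin and sends them
# in batches of $1 lines, rather than one HTTP request per point.
write_chunked() {
  size=$1
  batch=""
  count=0
  while IFS= read -r line; do
    batch="${batch}${line}
"
    count=$((count + 1))
    if [ "$count" -ge "$size" ]; then
      send_batch "$batch"
      batch=""
      count=0
    fi
  done
  # flush any trailing partial batch
  if [ -n "$batch" ]; then
    send_batch "$batch"
  fi
}

# Usage sketch (points.txt holds one line-protocol point per line):
#   write_chunked 1000 < points.txt
```

Batching like this keeps the connection count low, but note that the "too many open files" error in this thread comes from LevelDB file handles on the server side, not from client connections alone.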
That's definitely low for an InfluxDB server, particularly if you're going to have 1500 concurrent connections. Any qualms about setting it to 100k or something like that? I did just notice that in the big refactor the max-open-files option didn't get moved over. I just pushed a commit that fixes that. However, a new LevelDB is created per shard, so if you're writing a ton of data in, that …
OK, I've increased the file limits and will see how it goes overnight. The old limits were the defaults and I wasn't aware this could be an issue.

Yep, that seems to have solved my issue, thank you @pauldix :)
I had to increase my soft limit for maximum number of open files, which was set to 1024. |
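Checking and raising that soft limit can be done as follows (65536 is an arbitrary example value, and the `influxdb` user name in the persistent config is an assumption):

```shell
# Show the current soft and hard limits on open files for this shell.
# Defaults are often as low as 1024, which is easy to exhaust when a
# separate LevelDB is opened per shard.
ulimit -Sn
ulimit -Hn

# Raise the soft limit for this session (must not exceed the hard limit):
#   ulimit -n 65536
#
# To make the change survive logins, add entries to
# /etc/security/limits.conf (user name "influxdb" is an assumption):
#   influxdb  soft  nofile  65536
#   influxdb  hard  nofile  65536
```

Note that `ulimit` only affects the current shell and its children, so the limit has to be raised in whatever environment actually launches the database process.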
I would like to add this comment here for reference, as this issue is still one of the first results on Google:
I'm having a similar issue in 1.2.3: after writing a few hundred rows I get an error message about too many open files. To fix it, should I set my ulimit or kern.maxfiles to something higher, or is there a configuration option in influxdb.conf?
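Before raising limits blindly, it can help to see how many descriptors the daemon actually holds. A small Linux-only sketch (the process name `influxd` is an assumption):

```shell
# fd_count prints how many file descriptors a process currently holds.
# Linux-only: it lists the /proc/<pid>/fd directory.
fd_count() {
  ls "/proc/$1/fd" | wc -l
}

# Example: inspect the running daemon while a batch import is going on
# (process name "influxd" is an assumption):
#   fd_count "$(pgrep -x influxd)"
```

Comparing that count against `ulimit -Hn` for the daemon's user shows how close the process is to the ceiling and whether raising the limit or reducing open shards is the right fix.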
hi,
cu denny |
After upgrading to 0.5 I quickly ran into "too many open files" while batch-inserting lots of events. I didn't have this issue on 0.4 with the same script migrating the data: