Too many open files #731
How do you get the metadata? Do you harvest it from a server?
No, it was on an existing DB. But some of the records may trigger exceptions on indexing; it may be related? Will do more testing to identify the issue. I have observed these errors 7 times this week.
I index 6500 metadata records on the geocat test and integration servers, so your system is somehow different; we need to isolate what is different. One thing we need to do is see which files are open. I suspect they are the schema files, but I would like verification. Can you do the following and quickly check which files are open multiple times? Then I will instrument the places I suspect might be leaking file handles and will commit the instrumented code. Once that is done you can reindex your metadata, and hopefully the instrumentation will lead us to the problem code.

Jesse
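The "check which files are open multiple times" step can be done from the OS. A minimal sketch, assuming Linux with `lsof` installed and a Java process whose command line contains "geonetwork" (the process-matching pattern is an assumption; adjust it to your setup):

```shell
#!/bin/sh
# Assumption: the catalog runs in a JVM whose command line matches "geonetwork".
PID=$(pgrep -f geonetwork | head -n 1)

# List every file the process holds open, count duplicates per file name
# (the last lsof column), and show the 20 most frequently opened files.
lsof -p "$PID" | awk 'NR > 1 {print $NF}' | sort | uniq -c | sort -rn | head -20
```

Files appearing with high counts (e.g. the same index XSLT open dozens of times) point at the code path that is leaking handles.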
Not sure if this can be related, but I probably have something wrong with the Requests table, which triggers that exception.
It looks like the index XSLTs are opened a lot.
So maybe it is related to the OASIS catalog resolver?
I have added 2 methods to IO (newInputStream and newBufferedReader) and changed all calls to Files.newInputStream or Files.newBufferedReader to use those methods. In those methods I wrap the input stream/reader with the debugging equivalents (DebuggingReader and DebuggingInputStream). These classes register an exception with the OpenResourceTracker class on creation and remove the exception on closing. Thus the OpenResourceTracker maintains a list of all the open resources (along with the file name), and when there are 1000 open files it will write to the log all of the open files and the stack trace from the point where each was opened. Now that I think about it, it will kill the system at that point, because each time a resource is opened after 1000, all open files will be logged again, essentially crashing the system. Maybe it should be changed to only print once. If you get a Linux error you can still:
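The tracking idea described above can be sketched as follows. This is a minimal, hypothetical reconstruction (the class and method names match the comment, but the bodies are assumptions, including the print-once guard suggested above and a lowered threshold constant):

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Tracks every open resource; registers an exception (which captures the
// stack trace at open time) and removes it on close.
final class OpenResourceTracker {
    private static final Map<Long, Exception> OPEN = new ConcurrentHashMap<>();
    private static final AtomicLong IDS = new AtomicLong();
    private static final int REPORT_THRESHOLD = 1000;
    private static volatile boolean reported = false;

    static long opened(String resource) {
        long id = IDS.incrementAndGet();
        OPEN.put(id, new Exception("opened: " + resource));
        if (OPEN.size() >= REPORT_THRESHOLD && !reported) {
            reported = true; // print once, to avoid flooding the log
            OPEN.values().forEach(Throwable::printStackTrace);
        }
        return id;
    }

    static void closed(long id) { OPEN.remove(id); }

    static int openCount() { return OPEN.size(); }
}

// Wraps a real stream; registers on creation, deregisters on close.
final class DebuggingInputStream extends FilterInputStream {
    private final long id;

    DebuggingInputStream(InputStream in, String resource) {
        super(in);
        this.id = OpenResourceTracker.opened(resource);
    }

    @Override public void close() throws IOException {
        OpenResourceTracker.closed(id);
        super.close();
    }
}

public class TrackerDemo {
    public static void main(String[] args) throws IOException {
        InputStream in = new DebuggingInputStream(
            new ByteArrayInputStream(new byte[]{1, 2, 3}), "test.xml");
        System.out.println(OpenResourceTracker.openCount()); // prints 1
        in.close();
        System.out.println(OpenResourceTracker.openCount()); // prints 0
    }
}
```

A leaked stream is then visible as an entry that never leaves the tracker, together with the stack trace of the code that opened it.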
Just committed a fix for this issue. Please verify that it fixes the issue. The fix is 7257438. It turned out that the parsing framework was calling both getInputStream and getReader on PathStreamSource, which resulted in 2 streams being opened but only 1 being used, and thus only 1 was closed. I made a test for this and verified that the test now passes. I also made changes to the open resource tracking: it now tracks resources in development mode but not in production mode. In addition, it will print an error when 1000, 2000, 3000 and 4000 resources are open. The error will consist of the first 100 open files and their stack traces.
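The leak pattern and its fix can be illustrated with a small sketch. This is not the actual PathStreamSource code, just a hypothetical source class showing the idea: if both accessors are called, only one underlying stream should ever be created, so closing the source closes everything that was opened.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;

// Hypothetical source: opens its stream lazily and makes getReader() reuse
// the same stream, so calling both accessors cannot leak a second handle.
final class LazyStreamSource {
    private final byte[] data; // stands in for a file on disk
    private InputStream stream;
    private Reader reader;

    LazyStreamSource(byte[] data) { this.data = data; }

    InputStream getInputStream() {
        if (stream == null) stream = new ByteArrayInputStream(data);
        return stream;
    }

    Reader getReader() {
        // Wraps the shared stream instead of opening a second one.
        if (reader == null) {
            reader = new InputStreamReader(getInputStream(), StandardCharsets.UTF_8);
        }
        return reader;
    }

    // Closing the reader also closes the stream it wraps.
    void close() throws IOException {
        if (reader != null) reader.close();
        else if (stream != null) stream.close();
    }
}

public class LeakFixDemo {
    public static void main(String[] args) throws IOException {
        LazyStreamSource source =
            new LazyStreamSource("abc".getBytes(StandardCharsets.UTF_8));
        // The parsing framework called both accessors; here that is harmless.
        InputStream in = source.getInputStream();
        Reader reader = source.getReader();
        System.out.println(source.getInputStream() == in); // prints true
        System.out.println((char) reader.read());          // prints a
        source.close();
    }
}
```

With the eager version, the constructor-opened stream had no owner once the caller switched to the reader, which is exactly the "2 opened, 1 closed" behaviour described in the fix.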
Reopening so that @fxprunayre can verify.
Tested and it does not happen anymore. Thanks Jesse. Closing.
I quite often get "too many open files" exceptions,
e.g. while indexing a catalog with 700 records.
It also happens when using the catalog, viewing records.