HTTP 500 (max execution time exceeded) when searching for not limited events #2426
Comments
First of all, make sure you upgrade your MISP; there have been a lot of improvements since 2.4.71 that deal with performance, export memory usage, etc. Not setting any filters means that you are dealing with a massive amount of data being prepared and returned to you (as well as being looped through for all the ACL checks, export modifications, etc.). Make sure that you have enough memory available for PHP and that your MySQL is optimised. Also make sure that you increase the execution time if it's too low. Could you share the settings that you are currently using? Also, most importantly, make sure you update first! :)
max_execution_time for PHP, and also the web server timeout if you didn't change the setup.
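For reference, the PHP limits mentioned above live in php.ini; the path and values below are assumptions and vary by distribution and PHP version:

```ini
; php.ini for the PHP-FPM pool serving MISP
; (path is an assumption, e.g. /etc/php/7.0/fpm/php.ini)
memory_limit = 2048M
max_execution_time = 300
```

The web server has its own timeout on top of this (for example, fastcgi_read_timeout in an nginx + PHP-FPM setup), so a long-running export can still be cut off there even after raising the PHP limits.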
@iglocska As you can see, memory for PHP is set to 2G, which is way more than needed. PHP-FPM consumes no more than 5% of memory, but 100% CPU instead. Which settings are you asking about? I don’t want to set any additional filters, just all unpublished events. @cristianbell
Your database call is only fetching the event though ;) The API call will fetch far more, all of it filtered based on your ACL, which is built using your user account, organisation, sharing groups, etc., with export modifications applied to the data. And then we haven't even talked about the samples yet. It's really not something you can compare ;) 2G for PHP is not too much; the data set that you are exporting in raw JSON is probably larger than 2G alone if you don't set any filters. As for the CPU usage, that is probably due to disk reads blocking on IO, which will show up as high CPU usage, or alternatively an untuned MySQL that runs out of memory.
Dawid, maybe we should set up a conf call and cover some basics, if you're up for it sometime ;)
Also, could you send us the exact query that caused the issue? I tried it on our instance and fetching all unpublished events took me around 580 ms, so something fishy is going on. Are you sure the parameter was set correctly? If MISP finds no valid filters it will just return everything unfiltered, which could be massive.
@dawid-czarnecki an idea to try is to run the iostat tool to get an idea of the wait time for IO requests (r_await and w_await). Some details are available at: https://www.computerhope.com/unix/iostat.htm
Pagination has been implemented in the API (2.4.96 and later). You can now do such a query via restSearch (at the events and/or attributes level):
The documentation is the following (available via MISP):
Could you test whether this solves your issue, and reopen it if not? Thanks a lot.
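A minimal sketch of what such a paginated query looks like, assuming the MISP restSearch endpoint and its JSON body format (page and limit were added in 2.4.96); the concrete values here are illustrative:

```python
# Build one restSearch request body per page. Each body would be POSTed
# to <base URL>/events/restSearch with the Authorization header set to
# the user's API key (URL and key are placeholders, not from the thread).
def restsearch_bodies(limit, pages):
    """Yield one JSON body per page for POST /events/restSearch."""
    for page in range(1, pages + 1):
        yield {
            "returnFormat": "json",
            "published": 0,   # unpublished events only
            "limit": limit,   # events per page
            "page": page,     # 1-based page index
        }

bodies = list(restsearch_bodies(limit=100, pages=3))
```

In practice you would loop, incrementing page until a response comes back with fewer than limit events, so the server only ever has to prepare a bounded chunk per request instead of the whole unfiltered data set.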
Just fixed a problem with the same symptoms (status 500 in response to a restSearch with a large result) by using pagination. It also resulted in a roughly 4x speed-up.
HTTP 500 (max execution time exceeded) when searching for not limited events
Hi,
I've got a problem. I am trying to download unpublished events through the API/PyMISP. Unfortunately the script terminates with HTTP 500 (max execution time exceeded). With curl and the REST API the result is the same as with PyMISP.
I noticed the same happens whenever the search is not limited to some time window, for example 1d. If last=1d is set, MISP processes for a while but returns results.
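The difference described above can be sketched as two restSearch bodies; the field names follow the MISP REST API, and the values are illustrative, not taken from the reporter's script:

```python
# Unbounded: every unpublished event, which can be a massive result set
# and is what times out with HTTP 500 here.
unbounded = {"returnFormat": "json", "published": 0}

# Bounded: the same query restricted to the last day via the "last"
# filter, which is why MISP answers it within the execution time limit.
bounded = {**unbounded, "last": "1d"}
```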
Does anyone know why this might happen?
Here are some details:
Directly in the database, results come back immediately (X is more than 0):
Work environment
Expected behavior
List of unpublished events in json.
Actual behavior
Steps to reproduce the behavior
script.py:
python script.py
Logs, screenshots, configuration dump, ...
misp.local_access.log:
app/tmp/logs/error.log: