Internal Server Error if visualization json too large #447
Comments
Please find the log of the error (in case it can be of any help) that pops up when importing the data to ElasticSearch:
Is it the same as this one? (It looks like it, but it is not the same metric):
The other problem is that it seems to interrupt the import of the other projects/metrics (but we already observed that, @valeriocos).
As you said, not really the same bug since the endpoint is different, but the cause (huge data exposed) is probably the same. In case the error cannot be fixed, I can provide a workaround to keep importing the remaining data.
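The workaround mentioned above could look like the following minimal sketch: wrap each metric import so that a failing visualization (e.g. the Internal Server Error on a huge json) is recorded and skipped instead of aborting the whole import. The helper names `fetch_visualization` and `import_to_elasticsearch` are hypothetical placeholders, not part of any confirmed API.

```python
# Hypothetical sketch of a "keep importing the rest" workaround.
# fetch_visualization and import_to_elasticsearch are assumed callables,
# injected here so the loop stays independent of any real client code.

def import_all(metrics, fetch_visualization, import_to_elasticsearch):
    """Import each metric; collect failures instead of stopping on the first one."""
    failed = []
    for metric in metrics:
        try:
            payload = fetch_visualization(metric)  # may raise on an HTTP 500
            import_to_elasticsearch(metric, payload)
        except Exception as exc:  # e.g. Internal Server Error on a huge json
            failed.append((metric, str(exc)))
    return failed  # caller can log or retry the failed metrics later
```

This does not fix the underlying bug, but it isolates the oversized visualization so the remaining projects/metrics still reach ElasticSearch.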
I don't know who would need to fix the issue. The main thing is that we would need to implement a pagination system, and that might completely change how things are read.
Are you saying that... it is the same issue? :D Yes @creat89, I think it's not the time to implement pagination. Anyway, as @valeriocos said, what we are seeing is the cause, not the bug, e.g. why the MP - even if it's "huge" - isn't able to expose the json data.
The platform returns an Internal Server Error when the requested visualization file is very large. Currently, we have found that the historic metrics regarding topics can generate json files that are too large; for example, files of 9MB after analyzing just a few months. While we could reduce the amount of data presented in the visualization file, this would affect the data available in the dashboard. Thus, implementing a pagination system would be preferable.
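The pagination system suggested above could be sketched roughly as follows: instead of exposing one multi-megabyte json document, the endpoint would serve fixed-size pages with metadata so the dashboard can fetch the rest on demand. This is only an illustrative shape, not the project's actual API; the field names are assumptions.

```python
def paginate(items, page, page_size=100):
    """Return one page of a large result list plus paging metadata.

    Pages are 1-indexed; an out-of-range page yields an empty item list.
    """
    start = (page - 1) * page_size
    total = len(items)
    return {
        "page": page,
        "page_size": page_size,
        "total_items": total,
        "total_pages": -(-total // page_size),  # ceiling division
        "items": items[start:start + page_size],
    }
```

With 100-item pages, a client would loop from page 1 to `total_pages`, keeping each response small regardless of how many months of topic metrics have accumulated.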