This repository has been archived by the owner on Dec 24, 2019. It is now read-only.

Internal Server Error if visualization json too large #447

Open
creat89 opened this issue Dec 10, 2019 · 5 comments

Comments

@creat89
Contributor

creat89 commented Dec 10, 2019

The platform returns an Internal Server Error when the requested visualization file is very large. We have found that the historic metrics regarding topics can generate JSON files that are too large; for example, they can reach 9 MB after analyzing just a few months. While we could reduce the amount of data presented in the visualization file, this would affect the data available in the dashboard. Thus, implementing a pagination system would be preferable.
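
A minimal sketch (not part of the current platform) of how a paginated metrics endpoint could be consumed on the importer side, so that no single response has to carry several megabytes of JSON. The page/size query parameters and the response shape are assumptions for illustration only:

import requests

def fetch_metric_paginated(base_url, project, metric, page_size=500):
    """Yield metric items page by page instead of one huge JSON document."""
    page = 0
    while True:
        # Hypothetical paginated endpoint: the real API currently returns everything at once.
        url = f"{base_url}/projects/p/{project}/m/{metric}"
        response = requests.get(url, params={"page": page, "size": page_size})
        response.raise_for_status()
        items = response.json()
        if not items:  # an empty page signals the end of the data
            break
        yield from items
        page += 1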

@creat89 creat89 added the bug label Dec 10, 2019
@valeriocos
Member

Please find the log of the error (in case this is of any help) that pops up when importing the data into Elasticsearch:

2019-12-06 12:43:14,742 Looking for 'gnomelisttopics' project metrics at url 'http://oss-app:8182'
Traceback (most recent call last):
  File "./scava2es.py", line 1035, in <module>
    for item in scava_data:
  File "./scava2es.py", line 843, in fetch_scava
    for enriched_metric in enrich_metrics(scavaProject.fetch(CATEGORY_METRIC), meta):
  File "./scava2es.py", line 386, in enrich_metrics
    for scava_metric in scava_metrics:
  File "/usr/local/lib/python3.7/site-packages/perceval/backend.py", line 127, in fetch
    for item in self.fetch_items(category, **kwargs):
  File "/usr/local/lib/python3.7/site-packages/perceval/backends/scava/scava.py", line 160, in fetch_items
    for raw_items in self.client.get_items(category, project):
  File "/usr/local/lib/python3.7/site-packages/perceval/backends/scava/scava.py", line 352, in get_items
    project_metric = self.fetch(api)
  File "/usr/local/lib/python3.7/site-packages/perceval/backends/scava/scava.py", line 539, in fetch
    response = super().fetch(url, payload)
  File "/usr/local/lib/python3.7/site-packages/perceval/client.py", line 132, in fetch
    response = self._fetch_from_remote(url, payload, headers, method, stream, verify)
  File "/usr/local/lib/python3.7/site-packages/perceval/client.py", line 173, in _fetch_from_remote
    raise e
  File "/usr/local/lib/python3.7/site-packages/perceval/client.py", line 168, in _fetch_from_remote
    response.raise_for_status()
  File "/usr/local/lib/python3.7/site-packages/requests/models.py", line 940, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Server Error for url: http://oss-app:8182/projects/p/gnomelisttopics/m/newsgroups.topics.articles

@mhow2

mhow2 commented Dec 17, 2019

Is it the same as this one? (It looks like it, but it is not the same metric.)

dashb-importer_1  | 2019-12-17 13:11:13,085 Looking for 'bonitaenginedoc' project metrics at url 'http://oss-app:8182'
dashb-importer_1  | Traceback (most recent call last):
dashb-importer_1  |   File "./scava2es.py", line 1035, in <module>
dashb-importer_1  |     for item in scava_data:
dashb-importer_1  |   File "./scava2es.py", line 843, in fetch_scava
dashb-importer_1  |     for enriched_metric in enrich_metrics(scavaProject.fetch(CATEGORY_METRIC), meta):
dashb-importer_1  |   File "./scava2es.py", line 386, in enrich_metrics
dashb-importer_1  |     for scava_metric in scava_metrics:
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/backend.py", line 127, in fetch
dashb-importer_1  |     for item in self.fetch_items(category, **kwargs):
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/backends/scava/scava.py", line 160, in fetch_items
dashb-importer_1  |     for raw_items in self.client.get_items(category, project):
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/backends/scava/scava.py", line 352, in get_items
dashb-importer_1  |     project_metric = self.fetch(api)
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/backends/scava/scava.py", line 539, in fetch
dashb-importer_1  |     response = super().fetch(url, payload)
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/client.py", line 132, in fetch
dashb-importer_1  |     response = self._fetch_from_remote(url, payload, headers, method, stream, verify)
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/client.py", line 173, in _fetch_from_remote
dashb-importer_1  |     raise e
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/perceval/client.py", line 168, in _fetch_from_remote
dashb-importer_1  |     response.raise_for_status()
dashb-importer_1  |   File "/usr/local/lib/python3.7/site-packages/requests/models.py", line 940, in raise_for_status
dashb-importer_1  |     raise HTTPError(http_error_msg, response=self)
dashb-importer_1  | requests.exceptions.HTTPError: 500 Server Error: Server Error for url: http://oss-app:8182/projects/p/bonitaenginedoc/m/documentation.sentiment.entries

The other problem is that it seems to interrupt the import of the other projects/metrics (but we have already observed that, @valeriocos).

@valeriocos
Member

As you said, not really the same bug since the endpoint is different, but the cause (huge data exposed) is probably the same.

In case the error cannot be fixed, I can provide a workaround to keep importing the remaining data.
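
As a rough illustration of that workaround idea, the per-metric fetch could be wrapped so that an HTTP 500 on one endpoint is logged and skipped instead of aborting the whole scava2es.py run. The names below (import_metrics, metric_urls) are placeholders, not the actual importer code:

import logging
import requests

def import_metrics(metric_urls):
    """Fetch each metric URL, skipping the ones that fail with an HTTP error."""
    for url in metric_urls:
        try:
            response = requests.get(url)
            response.raise_for_status()
            yield response.json()
        except requests.exceptions.HTTPError as error:
            # Log and continue with the next metric instead of stopping the import.
            logging.warning("Skipping metric %s: %s", url, error)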

@creat89
Contributor Author

creat89 commented Dec 17, 2019

I don't know who would need to fix the issue. The main thing is that we would need to implement a pagination system, and that might completely change how things are read.

@mhow2

mhow2 commented Dec 17, 2019

> As you said, not really the same bug since the endpoint is different, but the cause (huge data exposed) is probably the same.

Are you saying that... it is the same issue? :D
I have just deleted the project in question for now...

Yes @creat89, I think it's not the time to implement pagination. Anyway, as @valeriocos said, what we are seeing is the cause, not the bug itself, i.e. why the MP, even if the data is "huge", isn't able to expose the JSON data.
