
GUI freezes when querying prometheus instance with large set of exporters. #4883

Closed
raypettersen opened this Issue Nov 19, 2018 · 6 comments


raypettersen commented Nov 19, 2018

Bug Report

What did you do?
Upgraded Prometheus in environment A; queries work as expected with ~50 exporters.
Upgraded Prometheus in environment B; typing a single letter in the GUI freezes the browser. The only difference is the larger number of exporters.

What did you expect to see?
Expected to see query work even with a large number of exporters.

Environment
Apache proxy in front of Prometheus. Identical vhosts on QA and PROD.

    <Location />
        ProxyPass http://<ip>:9090/
        ProxyPassReverse http://<ip>:9090/
    </Location>
  • System information:

Linux 3.10.0-693.17.1.el7.x86_64 x86_64

  • Prometheus version:

2.4.3

  • Logs:
    Nothing of interest in the logs; the browser just freezes. Nothing in the Apache or Prometheus logs. Everything worked on version 1.8.3.
simonpasquier commented Nov 19, 2018

Which page is failing for you exactly?

raypettersen commented Nov 19, 2018

/graph. Just typing a single letter sends Chrome and Firefox into some sort of loop in the environment with plenty of exporters. In our other environment it works fine. Also, the Grafana lookup succeeds with autocomplete, so we're quite puzzled.


raypettersen commented Nov 19, 2018

I could also mention that the Grafana query works like a charm. It's only the Prometheus frontend's /graph page that kills the browser.


One of our team members pointed out this change from the Prometheus 2.2.0 changelog (2018-03-08):

[ENHANCEMENT] Improve typeahead on /graph page.

Perhaps related? We upgraded from 1.8 to 2.4.3.


dabear commented Nov 19, 2018

I'm working with @raypettersen. It seems the number of metrics loaded into typeahead is the problem. I changed /js/graph/index.js in-flight in the browser as a test, and now it's smooth again (though obviously this isn't valid even as a workaround):

[screenshot: the modified /js/graph/index.js]
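
The screenshot of the exact edit isn't preserved. A minimal sketch of the kind of in-flight change described, assuming the test simply stopped handing the full metric set to typeahead (the names mirror the snippet shown below; this is a hypothetical reconstruction, not the actual edit):

    updateTypeaheadMetricSet: function(metricSet) {
      pageConfig.graphs.forEach(function(graph) {
        if (graph.expr.data('typeahead')) {
          // Hypothetical reconstruction: hand typeahead an empty source instead
          // of the full metric set, so autocomplete no longer locks the browser.
          graph.expr.data('typeahead').source = [];
        }
      });
    },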

I also did this console.log change before the change mentioned above:

    updateTypeaheadMetricSet: function(metricSet) {
      // Debug: log how many metric names get pushed into each graph's typeahead.
      console.log("metricset updated: " + metricSet.length);
      pageConfig.graphs.forEach(function(graph) {
        if (graph.expr.data('typeahead')) {
          graph.expr.data('typeahead').source = metricSet;
        }
      });
    },

Toggling history on/off now produces these debug statements:

metricset updated: 13261
metricset updated: 13257
dabear commented Nov 20, 2018

We've implemented a fix (workaround) in our systems.

We assume the Prometheus endpoint /api/v1/label/__name__/values returns more values in Prometheus 2 than it did in Prometheus 1.8, which is why we only discovered the following problem now:

During debugging we found that one of our apps had been incorrectly writing unlabeled metrics with dynamic names. This led the allMetrics array variable in static/js/graph/index.js to contain roughly ten thousand unnecessary values. Prometheus feeds the typeahead plugin with these metric names, and the /graph endpoint seems to freeze consistently in all browsers when there are ten thousand or more such entries.
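
A quick way to check whether an instance is affected is to count the values that endpoint returns. A small sketch, assuming Prometheus listens on localhost:9090 (adjust the host/port for your proxy setup):

    // Assumed diagnostic, not part of our fix: count the metric names that
    // /graph will load into typeahead.
    fetch('http://localhost:9090/api/v1/label/__name__/values')
      .then(function(resp) { return resp.json(); })
      .then(function(body) {
        // body.data is the array of metric names; ~10k or more triggers the freeze.
        console.log('metric names: ' + body.data.length);
      });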

brian-brazil commented Nov 20, 2018

Dupe of #2119. Looks like a 10k limit is what we'll want.
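
Not the actual patch, but a minimal sketch of such a limit, assuming typeahead's source is simply left empty past the threshold (TYPEAHEAD_METRIC_LIMIT and typeaheadSource are hypothetical names):

    // Hypothetical guard: disable autocomplete for pathological metric sets
    // instead of freezing the browser. 10000 is the limit suggested above.
    var TYPEAHEAD_METRIC_LIMIT = 10000;

    function typeaheadSource(metricSet) {
      return metricSet.length > TYPEAHEAD_METRIC_LIMIT ? [] : metricSet;
    }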
