regex-match not working in 2.1.0 #3815
Comments
dharrigan changed the title from "regex-matches not working in 2.1.0" to "regex-match not working in 2.1.0" on Feb 8, 2018
From what I understand, it is selecting the right series but returning the wrong data.
Regexp matches are fully anchored. Have you tried …?
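(An aside for readers, not part of the thread.) "Fully anchored" means the pattern must match the entire label value, not just a substring, as if it were wrapped in `^(?:…)$`. A minimal Go sketch of that semantics; the `anchored` helper is illustrative, not Prometheus's actual code:

```go
package main

import (
	"fmt"
	"regexp"
)

// anchored compiles a pattern the way a fully anchored matcher treats it:
// the whole label value must match, not just a substring of it.
func anchored(pattern string) *regexp.Regexp {
	return regexp.MustCompile("^(?:" + pattern + ")$")
}

func main() {
	// topic=~"users" does NOT select topic="foo_users" under full anchoring
	fmt.Println(anchored("users").MatchString("foo_users")) // false
	// topic=~".*users.*" does
	fmt.Println(anchored(".*users.*").MatchString("foo_users")) // true
}
```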
Hi, I will try. I would note that this query was working prior to the upgrade to 2.1.0, i.e., in the prior versions 2.0 and 1.8.2. -=david=-
@gouthamve it returns the wrong data: since it can't regexp-match on the topic name, it returns a default of 0.
I'm not sure I follow; as @simonpasquier … The original query as given should work too.
@gouthamve that's actually the result of a copy-and-paste from the Prometheus UI on 9090. I can't report on how the UI displays the data, or why it chooses to do so; all I can report is that prior to 2.1.0 the same query worked without modification, and now it doesn't. Only by changing the query's topic name to be an exact match for the Kafka topic name is the result returned, i.e., …
@simonpasquier your suggestion works (it's a bit too expansive for my use-case, but it works):
…
whereas this does not:
…
Thank you for your suggestion. Is this still a bug, then, or has this functionality changed from 2.0 to 2.1.0 (becoming fully anchored)? -=david=-
This is still a bug. As the UI reports the series, I'm pretty sure it's getting selected, but I'm not sure why the value is … @brian-brazil, your thoughts on this?
gouthamve added the kind/bug, priority/P2, and component/local storage labels on Feb 8, 2018
There are no defaults in PromQL; this sounds like incorrect regex matching down in local storage.
brian-brazil added the priority/P0 label and removed the priority/P2 label on Feb 8, 2018
If you require any more information, I would be very happy to help out; I'm just glad that I can work around the issue for now :) Top marks to @simonpasquier
What does … return?
Hi,
With this query (original, working in 2.0 and 1.8.2):
…
then, with the above workaround (to work in 2.1.0):
…
then, as an exact match:
…
Hope that helps! -=david=-
Uhm, that's not the way regexes work in Prometheus at all in the original example.
And I cannot reproduce in 2.0.0. I'm suspecting this may be an issue with this particular Prometheus rather than a bug. If you bring up another Prometheus with an empty database, does it have the same issue?
Hi,
It'll take a little while for the database to be populated with non-zero values (i.e., users over a day, or even a few minutes), so let me get back to you.
-=david=-
For this problem the values shouldn't matter, only the time series which are present.
Hi, Yes, I spun up a new instance of Prometheus and ran the same queries as above (#3815 (comment)); only the anchored query … returned results. Helpful?
Okay, it sounds like just that Prometheus is broken then, probably bad data in its storage. @gouthamve, do you think there's a potential bug in storage here?
To be clear, nothing should be returned. Nothing is not the same as a single time series with no labels and the value 0.
brian-brazil added the priority/P2 label and removed the priority/P0 label on Feb 8, 2018
Hi, I'm a bit confused. With a brand-new Prometheus install and a brand-new database, I get the same problem. How can it be a problem with my Prometheus having a broken database, when that database has been destroyed and recreated (in fact, I deleted the entire Prometheus directory and extracted the tarball fresh)? I pointed it at the metric on the remote Kafka server, gave it a bit of time to scrape a few data points, and I observe the same behaviour with the regexp operator … Perhaps I'm missing something in my understanding of Prometheus, and I would welcome being corrected. -=david=-
@dharrigan this indeed sounds like a bug and no fault in your storage. A few hours after startup you should see directories with random IDs appearing in your data directory. If you could send one of those my way, that would be hugely helpful to debug this.
Are you still seeing this with the 2.2.0 RCs?
Is this still happening with 2.2.1?
mattsdni commented on Apr 17, 2018
@brian-brazil I believe this may still be an issue in 2.2.1.
With this query:
…
I get a metric with this label:
…
If I update the query to this:
…
I get no results back.
That's as expected; you're using an equality matcher.
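(An aside, not part of the thread.) The distinction the comment makes can be sketched in Go; `matchEq` and `matchRe` are hypothetical helper names modeling the `=` and `=~` matchers, not Prometheus's actual API:

```go
package main

import (
	"fmt"
	"regexp"
)

// matchEq models the `=` matcher: a literal, whole-string comparison.
// Regex metacharacters in the pattern are NOT interpreted.
func matchEq(value, pattern string) bool {
	return value == pattern
}

// matchRe models the `=~` matcher: a fully anchored regex match.
func matchRe(value, pattern string) bool {
	return regexp.MustCompile("^(?:" + pattern + ")$").MatchString(value)
}

func main() {
	// With `=`, the pattern ".*users.*" is just a literal string: no match.
	fmt.Println(matchEq("foo_users", ".*users.*")) // false
	// With `=~`, the same pattern is a regex: match.
	fmt.Println(matchRe("foo_users", ".*users.*")) // true
}
```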
mattsdni commented on Apr 17, 2018
Oh, my bad. Changed to …
I'm going to presume this is resolved then.
brian-brazil closed this on Apr 17, 2018
lock bot commented on Mar 22, 2019: This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
dharrigan commented on Feb 8, 2018

What did you do?
Installed Prometheus 2.1.0 and executed this query:
sum by (topic)(rate(kafka_server_brokertopicmetrics_messagesin_total{topic=~'users'}[1d]))

What did you expect to see?
Filtered metrics based upon a Prometheus query, with a non-zero value.

What did you see instead? Under which circumstances?
Wrong data:
{topic="foo_users"} | 0

Environment
Linux 4.13.0-32-generic x86_64

Other information:
Here is the result of the raw metric from Kafka:
kafka_server_brokertopicmetrics_messagesin_total{topic="foo_users",} 733.0

If I rerun the query with the full topic name, it works:
sum by (topic)(rate(kafka_server_brokertopicmetrics_messagesin_total{topic=~'foo_users'}[1d]))
{topic="foo_users"} | 0.0009844983934935792

So it appears that the regexp match =~ doesn't work correctly in 2.1.0, as it only works if it's an exact match against the topic name.
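To make the report concrete, here is a small Go sketch (my own illustration, not from the issue) of how an anchored label matcher should filter a set of topic values: with only topic="foo_users" present, topic=~'users' should select nothing at all, while topic=~'foo_users' selects the series. The `selectTopics` helper is hypothetical:

```go
package main

import (
	"fmt"
	"regexp"
)

// selectTopics returns the topics whose value fully matches the pattern,
// mirroring Prometheus's anchored matcher semantics (^(?:pattern)$).
func selectTopics(topics []string, pattern string) []string {
	re := regexp.MustCompile("^(?:" + pattern + ")$")
	var out []string
	for _, t := range topics {
		if re.MatchString(t) {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	// The only topic exported by the Kafka exporter in the report.
	topics := []string{"foo_users"}

	// Correct behaviour: nothing selected, NOT a series with value 0.
	fmt.Println(selectTopics(topics, "users")) // []

	// The exact (or equivalently, fully matching) pattern selects the topic.
	fmt.Println(selectTopics(topics, "foo_users")) // [foo_users]
}
```

This is also the distinction brian-brazil draws in the thread: an empty result set is not the same thing as a single series with no labels and the value 0.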