
Not able to parse "dd/MMM/yyyy:HH:mm:ss Z" timestamp in logs type metrics #126

Closed
hardikbajaj opened this issue Apr 5, 2023 · 9 comments
Labels
datasource/OpenSearch type/bug Something isn't working

Comments

@hardikbajaj

What happened:
In Logs (and raw data) metrics, the timestamp is passed as dd/MMM/yyyy:HH:mm:ss Z and Grafana is not able to parse it.
We do get time series data when the metric is set to count, max, etc., because time is then passed as epoch_millis.
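For context, dd/MMM/yyyy:HH:mm:ss Z is the Apache access-log time layout. A quick Python sketch (illustrative only, not part of the plugin) of the two representations described above:

```python
from datetime import datetime, timezone

# The logs/raw-data response carries the raw Apache-style string:
raw = "17/Apr/2023:07:00:02 +0000"
# "dd/MMM/yyyy:HH:mm:ss Z" corresponds to this strptime pattern:
parsed = datetime.strptime(raw, "%d/%b/%Y:%H:%M:%S %z")

# Aggregated metrics (count, max, ...) return epoch_millis instead,
# which any consumer can convert unambiguously:
from_millis = datetime.fromtimestamp(1681714802000 / 1000, tz=timezone.utc)

print(parsed == from_millis)  # → True
```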

What you expected to happen:
The time field to be parsed properly and the logs displayed as a time series data frame.

How to reproduce it (as minimally and precisely as possible):
View the logs of an index whose time field uses the format dd/MMM/yyyy:HH:mm:ss Z.

Anything else we need to know?:
Screenshots:

Timestamp parsed correctly with the count metric:

[screenshot]

Timestamp not parsed correctly in logs:

[screenshot]

Environment:

  • Grafana version: 9.4.1
  • Plugin version: 2.2.0
hardikbajaj added the datasource/OpenSearch and type/bug labels on Apr 5, 2023
@sarahzinger
Member

sarahzinger commented Apr 7, 2023

Hi @hardikbajaj thanks for reporting this issue!

To make sure I follow your question: you're able to query OpenSearch with Grafana to find the data you need, but the Logs visualization is not able to parse the date format dd/MMM/yyyy:HH:mm:ss Z. Is that correct?

I'm checking in with our team that works on the Logs visualization to see if this is a known limitation, or if there's something we need to update in the OpenSearch plugin to support this.

@hardikbajaj
Author

Thanks @sarahzinger! Yeah, that's the bug!
Also, is there any forum where I can discuss general bugs/questions with other people using the OpenSearch Grafana plugin?

@sarahzinger
Member

We have a few ways for folks in the Grafana community to interact:

That said, if you see bugs you're always welcome to open an issue, even if you're not sure it's a bug!

@sarahzinger
Member

Would you mind opening the query inspector, going to the JSON tab, choosing the Dataframe JSON option, and sending that to us after you've removed any sensitive data?

@hardikbajaj
Author

Hey! These are just fake Apache logs; I'm adding the response here.

[
  {
    "schema": {
      "refId": "A",
      "meta": {
        "preferredVisualisationType": "logs"
      },
      "fields": [
        {
          "config": {
            "filterable": true
          },
          "name": "timestamp",
          "type": "time"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "@timestamp",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "_id",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "_index",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "_source",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "_type",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "auth",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "bytes",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "clientip",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "date",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "httpversion",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "ident",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "log",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "request",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "response",
          "type": "string"
        },
        {
          "config": {
            "filterable": true
          },
          "name": "verb",
          "type": "string"
        }
      ]
    },
    "data": {
      "values": [
        [
          "17/Apr/2023:07:00:02 +0000",
          "17/Apr/2023:07:00:01 +0000",
          "17/Apr/2023:06:59:59 +0000",
          "17/Apr/2023:06:59:58 +0000",
          "17/Apr/2023:06:59:56 +0000",
          "17/Apr/2023:06:59:55 +0000",
          "17/Apr/2023:06:59:53 +0000",
          "17/Apr/2023:06:59:52 +0000",
          "17/Apr/2023:06:59:50 +0000"
        ],
        [
          "2023-04-17T07:00:06.051Z",
          "2023-04-17T07:00:04.051Z",
          "2023-04-17T07:00:03.048Z",
          "2023-04-17T07:00:01.054Z",
          "2023-04-17T07:00:00.049Z",
          "2023-04-17T06:59:58.050Z",
          "2023-04-17T06:59:57.055Z",
          "2023-04-17T06:59:55.051Z",
          "2023-04-17T06:59:54.051Z"
        ],
        [
          "MlcDjocBW7B6N2RffS4r",
          "MVcDjocBW7B6N2RffS4r",
          "MFcDjocBW7B6N2RffS4r",
          "L1cDjocBW7B6N2RfaS6p",
          "LlcDjocBW7B6N2RfaS6p",
          "LVcDjocBW7B6N2RfaS6p",
          "LFcDjocBW7B6N2RfWi4K",
          "K1cDjocBW7B6N2RfWi4K",
          "KlcDjocBW7B6N2RfWi4K"
        ],
        [
          "apache_logs",
          "apache_logs",
          "apache_logs",
          "apache_logs",
          "apache_logs",
          "apache_logs",
          "apache_logs",
          "apache_logs",
          "apache_logs"
        ],
....

@fridgepoet
Member

fridgepoet commented Apr 19, 2023

Hi @hardikbajaj, on the data source config editor page, what if you put @timestamp in the "Time field name" field?
(Thanks Gábor!)

Talking about this part here (but not necessarily the same settings):
[Screenshot of the data source configuration editor showing the "Time field name" setting]

@hardikbajaj
Author

Yeah, that's the time field I added. It's a general time field that Filebeat adds, recording the ingestion time of the log, so it's actually very different from timestamp. I'm just doing a local test with this, which is how I found the bug. The @timestamp field works fine, but there's a problem parsing timestamp.

@idastambuk
Copy link
Contributor

idastambuk commented Apr 26, 2023

Hi @hardikbajaj, it does seem like Grafana's Log volume panel doesn't support the dd/MMM/yyyy:HH:mm:ss Z format. Is there a way you can configure your log source to store the timestamp field in an ISO string format or similar?

Edit: Some additional info: the best date format to return would be a UTC millisecond timestamp, since that is more compatible with our visualization plugins and requires fewer transformations, but an ISO string (like the one in @timestamp) will work too.
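If reshaping the field at the source is an option, the conversion is mechanical. A hypothetical Python sketch (function name is illustrative) of turning the Apache-style string into the two formats suggested above:

```python
from datetime import datetime

def convert(apache_ts: str) -> tuple[str, int]:
    """Convert a 'dd/MMM/yyyy:HH:mm:ss Z' timestamp into the two
    formats the visualizations handle well: an ISO 8601 string
    and a UTC epoch-millisecond integer."""
    dt = datetime.strptime(apache_ts, "%d/%b/%Y:%H:%M:%S %z")
    return dt.isoformat(), int(dt.timestamp() * 1000)

iso, millis = convert("17/Apr/2023:06:59:50 +0000")
print(iso)     # → 2023-04-17T06:59:50+00:00
print(millis)  # → 1681714790000
```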

@idastambuk
Contributor

Hi @hardikbajaj, we added a new task on our end to handle unusual time formats from OpenSearch.
However, the suggestion above is probably a good solution until we do so. Closing this, but you can track progress in the linked issue!
