
Strange behavior of the dashboard. #103

Closed
zzhao2010 opened this issue Feb 13, 2023 · 17 comments

Comments

@zzhao2010

Firstly, thanks for sharing these great dashboards for visualization. They look awesome.
On the other hand, I saw some strange behavior while testing the dashboards with my test cases, and I have a question about data accuracy, as the numbers reported on the dashboards don't seem to align with the test results on the command line.

Take the first metric, "Request Made", on the "Test Result" dashboard as an example. Two different values were reported, which is quite confusing, and neither of them reflected the actual number of requests generated during the test.

[screenshot: Request Made panel]

And if you look at the P95 Response Time metric on the dashboard, it was 3x faster than the p95 response time reported in the end-of-test summary on the command line.

[screenshot: P95 Response Time panel]

@jwcastillo
Contributor

Have you tried the latest version of the dashboard?

@lagunkov

lagunkov commented Feb 27, 2023

It is better with the update, but there is still an issue.

With the new dashboard:

[screenshot: new dashboard]

But when I decrease the time range, I get a different total request count:

[screenshot: shorter time range]

The total request count changes depending on the range, but never equals the actual HTTP request count:

[screenshot: actual request count]

Also, the p95 response time looks different.

@jwcastillo
Contributor

@lagunkov Is there a test you can share so I can replicate the issue? I'm in the k6 Slack, like @wen.

@jwcastillo
Contributor

@zzhao2010 Do you still have the same problem?

@lagunkov

lagunkov commented Feb 27, 2023

Sorry, I can't share the test that produced the screenshot above because it contains private info.

I tried to make a reduced test case using the example from https://test.k6.io/ with the following options:

export let options = {
  scenarios: {
    sample: {
      executor: 'ramping-vus',
      startVUs: 1,
      stages: [
        { target: 20, duration: "1m" },
        { target: 20, duration: "3m" },
        { target: 0, duration: "1m" }
      ],
    }
  },
  tags: {
    testid: 'test grafana 0.1'
  }
};

It shows the same total request count and p95 response time issue when the time range is "Last 3 hours":

[screenshot: Last 3 hours range]

And when I choose a shorter time range, it changes to:

[screenshot: shorter range]

Hope this will help.
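
For anyone trying to reproduce this, a complete script built around the options above might look like the following sketch. The request logic is an assumption, since only the options block was shared, using the test.k6.io demo target mentioned above:

import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
  scenarios: {
    sample: {
      executor: 'ramping-vus',
      startVUs: 1,
      stages: [
        { target: 20, duration: "1m" },
        { target: 20, duration: "3m" },
        { target: 0, duration: "1m" }
      ],
    }
  },
  tags: {
    testid: 'test grafana 0.1'
  }
};

// Assumed request body: hit the public k6 demo site once per iteration,
// then pause briefly, as in the standard test.k6.io example.
export default function () {
  http.get('https://test.k6.io/');
  sleep(1);
}

Running this against the Prometheus remote-write output (k6 run -o experimental-prometheus-rw script.js on recent k6 versions) lets you compare the dashboard's Request Made panel directly with the http_reqs line in the end-of-test summary.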

@jwcastillo
Contributor

🆗, let me check.

@zzhao2010
Author

@jwcastillo Looks like the issue was fixed with the latest version.
@lagunkov By the way, the issue you described above happens on my end as well. It looks like the values get messed up if the timeframe changes. I always use the link from the test list dashboard to the test result dashboard; that way the reported data is accurate.

@soolch

soolch commented Mar 1, 2023

Hi, I have this issue as well. I found that if the test duration is short, the Request Made metric is correct, but when the test duration gets longer, the Request Made metric reports less than the exact number of requests made. I did a comparison with k6 Cloud.
Here you can see that Request Made, Peak RPS, and P95 Response Time all have different values.

[screenshot: Grafana dashboard]
[screenshot: k6 Cloud comparison]

@soolch

soolch commented Mar 1, 2023

Hi @zzhao2010, may I know how you solved your issue? I am also using v0.2.0 but still see the same behavior.

@codebien
Contributor

codebien commented Mar 1, 2023

@soolch are you sure you're using the latest version? Did you pull the latest commit from the main branch, or the latest tag? If yes, can you post an anonymized script that allows us to reproduce your issue, please? There is an example a few comments above in this thread using test.k6.io.

@soolch

soolch commented Mar 20, 2023

Hi @codebien, I have tried it once again with the latest k6 binary, following the k6 documentation, which has been updated to list this as the official dashboard. But the issue still happens.
I tried using the following options:

export const options = {
  scenarios: {
    'scenario-vehicle-content': {
      executor: 'ramping-arrival-rate',
      startRate: 50,
      timeUnit: '1m',
      preAllocatedVUs: 2,
      maxVUs: 50,
      stages: [
        { target: 50, duration: '1m' },
        { target: 100, duration: '1m' },
        { target: 100, duration: '1m' },
        { target: 200, duration: '1m' },
        { target: 200, duration: '1m' },
        { target: 300, duration: '1m' },
        { target: 300, duration: '1m' },
        { target: 400, duration: '1m' },
        { target: 400, duration: '1m' },
      ],
    },
  },
};

[screenshot]
[screenshot]
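
As a sanity check, the expected request count for a ramping-arrival-rate scenario can be computed by integrating the arrival rate over the stages; k6 ramps the rate linearly between targets. A small sketch, assuming one request per iteration and that preAllocatedVUs/maxVUs never become a bottleneck:

// Expected iterations for the config above: each 1m stage contributes
// the average of its start and end rates (rates are per '1m' timeUnit).
const startRate = 50;
const stageTargets = [50, 100, 100, 200, 200, 300, 300, 400, 400];

let prev = startRate;
let total = 0;
for (const target of stageTargets) {
  total += (prev + target) / 2; // trapezoid over one 1-minute stage
  prev = target;
}
console.log(total); // 1875 iterations expected over the 9m test

So the Request Made panel should read on the order of 1,875 for the full run, which gives a concrete number to compare the different time ranges against.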

But if I reduce my total test duration to 5m, then the result shows correctly.
With the following stage configuration, the result is correct:

stages: [
  { target: 50, duration: '1m' },
  { target: 100, duration: '1m' },
  { target: 100, duration: '1m' },
  { target: 200, duration: '1m' },
  { target: 200, duration: '1m' }
],

[screenshot]
[screenshot]

@codebien
Contributor

@jwcastillo can you take a look into it, please?

@jwcastillo
Contributor

Yes, I'll take this.

@soolch

soolch commented Apr 27, 2023

Hi @jwcastillo, may I know whether you were able to reproduce the same result on your side?

@soolch

soolch commented May 2, 2023

@codebien
Contributor

codebien commented May 2, 2023

Hi @soolch,
do you use the dashboard with the Stale marker option enabled?

@soolch

soolch commented May 2, 2023

Hi @codebien, I didn't. But I have read about this stale option, which also mentions 5 minutes. And when I try to search for the results in Prometheus, series older than 5 minutes disappear, which causes the Grafana results to be incorrect.
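
For readers hitting the same thing: the 5-minute boundary matches Prometheus's default query lookback delta. An instant vector selector only returns series whose newest sample is within the last 5 minutes, so once a test has been finished for longer than that, its series silently drop out of instant-query sums. A hedged sketch of the difference as PromQL panel queries, assuming the requests counter is exported as k6_http_reqs_total (the exact metric name depends on the output/extension version):

# Instant query, evaluated at the dashboard's "To" time: any series whose
# last sample is older than the lookback delta (5m by default) is treated
# as stale and contributes nothing, so the total shrinks or vanishes.
sum(k6_http_reqs_total{testid="test grafana 0.1"})

# Range-aware alternative: take each series' most recent sample anywhere
# inside the selected dashboard range, so finished tests still count.
sum(last_over_time(k6_http_reqs_total{testid="test grafana 0.1"}[$__range]))

This would also explain why the totals change with the selected time range, and why the stale-marker option matters: with stale markers enabled, k6 explicitly ends its series when the run stops, which changes exactly when they disappear from instant queries.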
