
use same timestamp in csv results as in json #2839

Closed
fenchu opened this issue Jan 5, 2023 · 11 comments · Fixed by #2906

Comments

@fenchu

fenchu commented Jan 5, 2023

Feature Description

--out csv=results.csv has timestamp resolution in seconds:

>>> df = pd.read_csv("results.csv")             
>>> print(df[['metric_name','timestamp']][:5])  
                metric_name   timestamp
0                 http_reqs  1672766933
1         http_req_duration  1672766933
2          http_req_blocked  1672766933
3       http_req_connecting  1672766933
4  http_req_tls_handshaking  1672766933

while --out json=result.json has a microsecond timestamp and zone data:

:
{"metric":"http_req_failed","type":"Point","data":{"time":"2023-01-05T09:45:41.088631+01:00",........
:

I currently convert the output from json to csv to keep the micro-timestamp, but it takes minutes on long scenarios with >1GB of data,
and the json is much larger: 1GB of json is 200MB of csv.
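The conversion stage described above can be sketched as a streaming pass over k6's line-delimited JSON output, so a multi-GB file never has to fit in memory. This is a hypothetical minimal version, not the reporter's actual script; the column selection is illustrative and does not reproduce the full k6 csv schema.

```python
# Minimal sketch of converting k6 --out json output to csv while keeping
# the full RFC3339 timestamp (microseconds + zone offset) intact.
import csv
import json

def json_to_csv(json_path, csv_path):
    with open(json_path) as src, open(csv_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["metric_name", "timestamp", "metric_value"])
        for line in src:
            record = json.loads(line)
            if record.get("type") != "Point":
                continue  # skip Metric definition lines, keep only samples
            data = record["data"]
            # write the timestamp string through unchanged
            writer.writerow([record["metric"], data["time"], data["value"]])
```

Because it processes one line at a time, runtime is linear in file size and memory use stays flat, but it is still an extra pass that native sub-second csv output would make unnecessary.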

Suggested Solution (optional)

same timestamp in csv as in json output

Already existing or connected issues / PRs (optional)

No response

@na--
Member

na-- commented Jan 5, 2023

Thanks for opening this issue! 🙇 This was supposed to be somewhat fixed by #2274, which added a timeFormat (K6_CSV_TIME_FORMAT) option to the csv output, with the possible values of unix (the default) and rfc3339. However, now that I test K6_CSV_TIME_FORMAT=rfc3339, that also has 1-second resolution, which wasn't the intended goal from what I can remember 😕

Skimming the discussion in the PR, I requested the addition of a second time granularity/precision option, but it seems like it was dropped somewhere along the line 😞 So either that extra option can now be added...

Or, thinking about it some more, it may be better if we simply expanded the timeFormat option with more possible values like unix-nano, rfc3339-nano, etc. 🤔 This is what Go basically does, in the constants of the time package - it has time.RFC3339, which we currently use, but also time.RFC3339Nano 🤔 And it will be simpler to implement, configure and reason about, so I am leaning towards this approach.
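To make the resolution difference concrete, here is an illustration (in Python rather than Go, purely for brevity) of the four kinds of values under discussion. The names unix / unix-nano / rfc3339 / rfc3339-nano follow the suggestion above and Go's time package constants; they are a proposal here, not a confirmed k6 API.

```python
# One instant rendered in the four candidate timeFormat styles.
from datetime import datetime, timedelta, timezone

ts = datetime(2023, 1, 5, 9, 45, 41, 88631,
              tzinfo=timezone(timedelta(hours=1)))

formats = {
    "unix": str(int(ts.timestamp())),                # 1-second resolution
    "unix-nano": str(int(ts.timestamp()) * 10**9     # nanosecond epoch,
                     + ts.microsecond * 1000),       # built without float loss
    "rfc3339": ts.strftime("%Y-%m-%dT%H:%M:%S%z"),   # whole seconds only
    "rfc3339-nano": ts.isoformat(),                  # keeps the fractional part
}
for name, value in formats.items():
    print(f"{name:>12}: {value}")
```

The first and third variants both discard the sub-second part, which is exactly the problem reported here; only the `*-nano` variants round-trip the instant losslessly.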

@NeelParihar

Hello, I am new to this project and would love to start working on it by starting with this issue.

@na--
Member

na-- commented Jan 12, 2023

Go for it 👍 Just make sure to read CONTRIBUTING.md first.

@Azanul
Contributor

Azanul commented Feb 4, 2023

Is this issue available to be worked on?

@Azanul
Contributor

Azanul commented Feb 7, 2023

@na-- Which should be the default one? unix in both or rfc3339 in both?

@na--
Member

na-- commented Feb 8, 2023

Is this issue available to be worked on?

We haven't heard anything from @NeelParihar for almost a month, so probably yes?

@na-- Which should be the default one? unix in both or rfc3339 in both?

Not sure what you mean by both, this issue concerns only the csv output. And the default should stay whatever the current value is, otherwise it would be a breaking change.

@Azanul
Contributor

Azanul commented Feb 8, 2023

Is this issue available to be worked on?

We haven't heard anything from @NeelParihar for almost a month, so probably yes?

I'll raise a PR

@na-- Which should be the default one? unix in both or rfc3339 in both?

Not sure what you mean by both, this issue concerns only the csv output. And the default should stay whatever the current value is, otherwise it would be a breaking change.

What I understood from this issue is that the outputs in the two file formats use different datetime formats. By default, csv uses unix and json uses rfc3339 with microseconds. Adding support for more formats does resolve the issue, but shouldn't the defaults also be consistent across output formats?

Azanul mentioned this issue Feb 8, 2023
na-- added this to the v0.44.0 milestone Feb 22, 2023
@fenchu
Author

fenchu commented May 16, 2023

Hi,
I tested with v0.44.1 and I still do not get a timestamp with milli/micro/nanosecond resolution:

k6 version
k6 v0.44.1 (2023-05-08T12:36:56+0000/v0.44.1-0-gae04c40a, go1.20.4, windows/amd64)

running:
k6 --out csv=results.csv run --vus 10 --duration 3m  -e TESTENV=qa04 -e TENANT=buypass ./multitenant-token-login.js

the output csv file:

(cat -Head 10 .\results.csv).Substring(0,50)
metric_name,timestamp,metric_value,check,error,err
http_reqs,1684234777,1.000000,,,,true,,POST,https:
http_req_duration,1684234777,28.975600,,,,true,,PO
http_req_blocked,1684234777,41.947300,,,,true,,POS
http_req_connecting,1684234777,11.689100,,,,true,,
http_req_tls_handshaking,1684234777,16.171200,,,,t
http_req_sending,1684234777,0.000000,,,,true,,POST
http_req_waiting,1684234777,27.889400,,,,true,,POS
http_req_receiving,1684234777,1.086200,,,,true,,PO
http_req_failed,1684234777,0.000000,,,,true,,POST,

Example graph enclosed; you can see the vertical lines of data:

[Image: scim_api_uuid-http_req_duration-1]

@codebien
Contributor

Hey @fenchu, can you open a new bug report so we can properly manage it, please?

@mstoykov
Contributor

@fenchu while this was implemented, the default for timeFormat is still unix, which "only" has a resolution of seconds.

You can see the other possible values in the doc.

If that still doesn't work for you - you can open a new issue.
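For anyone landing here later, the fix amounts to overriding the default via the K6_CSV_TIME_FORMAT environment variable mentioned earlier in this thread. The variable itself is confirmed above; the specific value name shown below is an assumption based on na--'s proposal, so check the k6 csv-output docs for the exact list of accepted sub-second values.

```shell
# Hypothetical invocation: select a nanosecond RFC3339 timestamp for the
# csv output instead of the default whole-second unix value.
# (Value name "rfc3339_nano" is assumed; verify against the k6 docs.)
K6_CSV_TIME_FORMAT=rfc3339_nano k6 --out csv=results.csv run ./multitenant-token-login.js
```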

@fenchu
Author

fenchu commented May 16, 2023

Excellent, it works. I was just a bit unsure. Great. It is much faster and saves a stage of converting 1GB of json to csv.
