
BUG: Series/DataFrame to_json not returning milliseconds #4362

Closed
mariusvniekerk opened this issue Jul 25, 2013 · 6 comments · Fixed by #4498

Comments

@mariusvniekerk commented Jul 25, 2013

According to the documentation for pandas 0.12

date_format : type of date conversion (epoch = epoch milliseconds, iso = ISO8601)
default is epoch

The JSON export function seems to be returning epoch nanoseconds rather than the milliseconds stated in the documentation.

s = pd.Series(1, index=pd.date_range('2013-07-25', periods=2))
s.index.astype('int64')
>> array([1374710400000000000, 1374796800000000000])

s.to_json(orient='split', date_format='epoch')
>> '{"name":null,"index":[1374710400000000000,1374796800000000000],"data":[1,1]}'
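A minimal workaround sketch (not part of the original report): the nanosecond epoch values can be scaled down to milliseconds by hand before exporting.

```python
import pandas as pd

s = pd.Series(1, index=pd.date_range('2013-07-25', periods=2))

# Integer division takes the nanosecond epoch values down to
# milliseconds, the unit the documentation describes.
ms = s.index.astype('int64') // 10**6
print(ms.tolist())  # [1374710400000, 1374796800000]
```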
@jreback (Contributor) commented Jul 25, 2013

cc @hayd
cc @Komnomnomnom

I suppose this is incompatible with exportable JSON (e.g. you are writing a frame and using it somewhere else).
It does work internally (e.g. writing a frame, then reading it back).
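A minimal sketch of that internal round-trip (not from the thread; the StringIO wrapping reflects newer pandas versions, which expect file-like JSON input):

```python
from io import StringIO

import pandas as pd

# Write a frame with a datetime index, then read it straight back.
df = pd.DataFrame({'a': [1, 1]},
                  index=pd.date_range('2013-07-25', periods=2))
json_str = df.to_json(orient='split')

# pandas' own reader understands its own epoch encoding, so the
# round-trip recovers an equivalent frame even though an external
# consumer expecting milliseconds would misread the raw values.
df2 = pd.read_json(StringIO(json_str), orient='split')
```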

@Komnomnomnom (Contributor) commented Jul 25, 2013

Yeah, the expectation would be milliseconds, as that seems to be the standard timestamp format in JSON (and JavaScript). I think it should be JSONifying in millis (or ISO) by default.

Although, AFAIK the JSON spec says nothing about dates or timestamps, so this is more of a convention; in JavaScript, though, it would be milliseconds.

In JavaScript (Chrome):

> new Date(1374710400000000000)
Invalid Date
> new Date(1374710400000)
Thu Jul 25 2013 10:00:00 GMT+1000 (EST)
> d = new Date(1374710400000)
Thu Jul 25 2013 10:00:00 GMT+1000 (EST)
> d.toJSON()
"2013-07-25T00:00:00.000Z"
> JSON.stringify(d)
""2013-07-25T00:00:00.000Z""
> JSON.stringify(+d)
"1374710400000"
@jreback (Contributor) commented Jul 26, 2013

FYI, these can actually be fractional msecs (e.g. floats).

I can do this conversion before I pass to you (e.g. I'll just pass you an int or float column); it's easy enough to reconvert as well (pd.to_datetime(values, unit='ms') does this type of conversion).

Or do you want to do it natively?

The only other issue is that we are going to have to handle back-compat now (though it's sort of easy to figure out, as the two representations differ by orders of magnitude).
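The reconversion mentioned above can be sketched as follows (the millisecond values are hypothetical, matching the earlier examples; pd.to_datetime with unit='ms' is the call named in the comment):

```python
import pandas as pd

# Hypothetical millisecond epoch values, as a JavaScript producer
# would emit them.
ms_values = [1374710400000, 1374796800000]

# pd.to_datetime with unit='ms' interprets integers as millisecond
# offsets from the Unix epoch, recovering the original timestamps.
idx = pd.to_datetime(ms_values, unit='ms')
```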

@Komnomnomnom (Contributor) commented Jul 26, 2013

I'd like to have a go at doing it natively (I think it'd be more flexible that way). Right now it uses PyArray_CastToType (I think), which might make things difficult, but if it doesn't work out we can look at converting before hitting the JSON serializer, as you suggested.

I should have some time to look at it this evening (it's morning here).

@jreback (Contributor) commented Jul 26, 2013

Fine by me! Let me know.

@jreback (Contributor) commented Aug 15, 2013

Closed via #4498.
