Why is a serialized object deserialized into an array of length 1? #21

Closed
diasdavid opened this Issue Aug 28, 2015 · 13 comments

Comments

@diasdavid

When a single JS object is serialized (using cbor.encode), it comes back as an array when deserialized (using cbor.decode). What is the reason behind this behaviour?
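
A minimal sketch of the behaviour being described, assuming the API at the time of this issue, where cbor.decode returns every item found in the buffer:

```js
const cbor = require('cbor');
const assert = require('assert');

const original = { hello: 'world' };
const buf = cbor.encode(original);      // Buffer holding one CBOR map
const decoded = cbor.decode(buf);       // -> [ { hello: 'world' } ]

assert(Array.isArray(decoded));         // an array of length 1...
assert.deepEqual(decoded[0], original); // ...wrapping the original object
```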

@hildjj

hildjj (Owner) commented Aug 28, 2015

Because there might have been multiple CBOR objects in your buffer. We could perhaps add an option that only returns the first one, but it would probably need to return the number of bytes eaten as well.
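
A sketch of the case that return shape is designed for: two CBOR items concatenated in a single buffer. The "bytes eaten" point applies if only the first item were returned:

```js
const cbor = require('cbor');

// Two independent CBOR items back to back in one buffer.
const buf = Buffer.concat([cbor.encode({ a: 1 }), cbor.encode({ b: 2 })]);

const items = cbor.decode(buf);   // -> [ { a: 1 }, { b: 2 } ]
console.log(items.length);        // 2

// An option that returned only { a: 1 } would also have to report how many
// bytes it consumed, so the caller could resume decoding at the second item.
```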

@diasdavid

diasdavid commented Aug 28, 2015

It would be good for the decoder to recognise that only one JSON object was serialized, and that it was not serialized inside an array, so that decode(encode(a)) === a holds; otherwise it can look as though the serialized data was changed.

@hildjj

hildjj (Owner) commented Aug 28, 2015

I can't stand functions that return different types based on their inputs. What I would be willing to do is add a decodeOne method that returns the first thing. Ideally, I'd change decode to be decodeOne and make the current decode be decodeAll, but that would be a breaking API change that I'd prefer not to make.
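
A hypothetical sketch of what such a decodeOne could look like, layered on top of the existing decode-everything behaviour (decodeOne is not part of the library's API at this point in the thread):

```js
const cbor = require('cbor');

function decodeOne(buf) {
  const items = cbor.decode(buf);   // assumed to return an array of all items
  if (items.length === 0) {
    throw new Error('no CBOR item found in buffer');
  }
  return items[0];                  // always one shape: the first decoded item
}
```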

@diasdavid

diasdavid commented Sep 3, 2015

> I can't stand functions that return different types based on their inputs.

I think your point is valid, and that is exactly why I am strongly in favor of returning the same kind of object that was encoded.

> but that would be a breaking API change that I'd prefer not to make.

Agreed that it is a PITA, but then again, is it better for decoding to return a different type than the one that was encoded? Would love to have more opinions on this.

@jbenet

jbenet commented Sep 3, 2015

Having (too much) experience with tons of serialization systems + codecs, I would expect

x === decode(encode(x))

to hold.

  • decodeOne seems like a good option.
  • Understood about not breaking the API.
  • But I probably would.
  • If not breaking the API, clearly document it and provide another entry point (like require('cbor/one')).
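
A note on that round-trip expectation: for objects, strict === cannot hold after a trip through a buffer, because decoding produces a new allocation; the testable form is deep equality. A minimal sketch, again assuming the decode-to-array behaviour of the time:

```js
const cbor = require('cbor');
const assert = require('assert');

const x = { name: 'cbor', nested: { n: 42 } };
const y = cbor.decode(cbor.encode(x))[0];

// y is a freshly built object, so x === y is false; what can (and should)
// hold is structural equality.
assert.notStrictEqual(x, y);
assert.deepEqual(x, y);
```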

@hildjj

hildjj (Owner) commented Sep 16, 2015

The more I think about this, the more I want to redo the whole interface for Node 4, using Maps instead of objects, and not dealing with streams. It would be a lot faster, use less memory, be simpler with less code, and deal with integer keys correctly. It would also likely fix the recursion-depth issue we have by getting rid of recursion.
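
A sketch of the integer-key problem that motivates Maps: plain JS objects coerce every key to a string, so the distinct CBOR keys 1 and "1" collide, while a Map keeps them apart:

```js
const obj = {};
obj[1] = 'number key';
obj['1'] = 'string key';        // overwrites the previous entry
console.log(Object.keys(obj));  // [ '1' ]  -- only one key survives

const map = new Map();
map.set(1, 'number key');
map.set('1', 'string key');
console.log(map.size);          // 2 -- both keys survive a decode into a Map
```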

@jbenet

jbenet commented Sep 16, 2015

What do you mean by not dealing with streams?

(Our use case is duplex streams of CBOR objects.)
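
For context, a hypothetical sketch of that duplex-streaming use case; the cbor.Encoder / cbor.Decoder Transform-stream names used here are an assumption, not something confirmed in this thread:

```js
const net = require('net');
const cbor = require('cbor');

const socket = net.connect(4000, 'localhost');

// Outgoing half: write JS objects, send encoded CBOR bytes down the socket.
const encoder = new cbor.Encoder();
encoder.pipe(socket);
encoder.write({ type: 'hello', from: 'client' });

// Incoming half: turn CBOR bytes back into JS objects as they arrive.
socket.pipe(new cbor.Decoder()).on('data', (obj) => {
  console.log('received', obj);
});
```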

@hildjj

hildjj (Owner) commented Sep 16, 2015

Yeah, the approach I'm thinking of would break your use case, @jbenet. Let me keep thinking.

@hildjj

hildjj (Owner) commented Sep 17, 2015

I found a way that I think is going to work nicely for streaming. It would get rid of the SAX-style evented parser, and might be a little faster. It requires Node 4 (or at least --harmony). See: https://github.com/hildjj/node-cbor/blob/node4/src/stream.coffee

@jbenet

jbenet commented Sep 17, 2015

(FWIW, Node 4 is fine for us. We're pushing for it too (it brings balance to the force).)

@hildjj

hildjj (Owner) commented Oct 3, 2015

OK, I have a completed node4 branch that I would appreciate feedback on before I commit it. It fixes #15 as well. Note that the API is now decodeFirst or decodeAll.
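
A hypothetical usage sketch of the renamed API; only the names decodeFirst and decodeAll come from this comment, and the callback signatures shown here are an assumption:

```js
const cbor = require('cbor');

const buf = Buffer.concat([cbor.encode({ a: 1 }), cbor.encode({ b: 2 })]);

cbor.decodeFirst(buf, (err, obj) => {
  if (err) throw err;
  console.log(obj);      // { a: 1 } -- just the first item in the buffer
});

cbor.decodeAll(buf, (err, items) => {
  if (err) throw err;
  console.log(items);    // [ { a: 1 }, { b: 2 } ] -- every item in the buffer
});
```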

@hildjj

hildjj (Owner) commented Oct 9, 2015

Fixed in #22

@hildjj hildjj closed this Oct 9, 2015

@diasdavid

diasdavid commented Oct 10, 2015

Thank you @hildjj, and apologies for the delayed feedback, but this does look good!
