Encode to JSON? #139

Closed

jshier opened this issue Jun 2, 2015 · 31 comments
@jshier
Contributor

jshier commented Jun 2, 2015

Is there any planned support for reversing the decode process and encoding back to JSON? I'm thinking this could be a good way to also automatically support serialization to disk.

@nvh
Contributor

nvh commented Jun 2, 2015

I've done something to that end in #47, but haven't had the time yet to turn it into its own framework.

@tonyd256
Contributor

tonyd256 commented Jun 2, 2015

@jshier There is no planned support for this. Argo is focused on easily and safely decoding your JSON into native models. Going from your models back to JSON is more straightforward.

Check out #10, #42, and #47. I'd be curious to see how you would expect to use this functionality. I'm closing this for now, but please continue to use this thread for discussion.

tonyd256 closed this as completed Jun 2, 2015
@jshier
Contributor Author

jshier commented Jun 2, 2015

@tonyd256 The intended usage is exactly what I mentioned. Encoding back to JSON lets me, say, update backend data with current state. It would also easily enable archiving to disk and back, letting me cache or otherwise persist data. Perhaps an intermediate dictionary representation would do that, but it's all part of the same process. Basically, I'm looking for a native Swift Mantle replacement. I like Argo's JSON decoding ability more than say, ObjectMapper, but I'm looking for more functionality going the other way. I'll look at the topics suggested and see what works.

@tonyd256
Contributor

tonyd256 commented Jun 2, 2015

@jshier I get that, but I'm more interested in how you would implement it or expect to use it in code. The way I see making this is implementing an Encodable to match the Decodable. Then objects that conform to Encodable have to implement func encode() -> AnyObject. The implementation of this function would just be converting the model object into a dictionary and returning the dictionary.

Doing things this way would mean that all Argo is missing is that Encodable protocol, but I don't think that's interesting enough (as in it would be easy for anyone to do themselves) and it doesn't help with type safety at all.

I'm curious if you have ideas on an implementation that would help with type safety.
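
For reference, a minimal sketch of the shape described above, with a hypothetical User model (the struct and its properties are invented for illustration; only the Encodable protocol itself comes from the discussion):

import Foundation

protocol Encodable {
    func encode() -> AnyObject
}

// Hypothetical model; encode() hand-builds a bridgeable dictionary.
struct User {
    let id: Int
    let name: String
}

extension User: Encodable {
    func encode() -> AnyObject {
        return ["id": id, "name": name]
    }
}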

@paulyoung
Contributor

I was wondering if anyone's thoughts around this had changed with Swift 2.

Some points of reference:

@tonyd256
Contributor

Hmm, I don't think anything from Swift 2 helps us here, or at least I didn't make the connection.

I think the meta-programming approach is interesting, but it assumes the property names of the Swift model are the same as the key names in the JSON, which is commonly not the case. This could be corrected with a function on the protocol like keyForProperty(String) -> String or something, but it still may be limiting for those who don't use a 1:1 JSON-to-model structure.

In general, our thinking is that if we could provide type safety when going from model to JSON, it would be worth it; however, I feel that a simple toJSON() -> AnyObject function on the model would suffice. In this function, the user would just construct a Dictionary or Array from their model and return it. If we wanted to add something to Argo to help here, we would probably just add an Encodable protocol with that one function. That one addition doesn't really seem that helpful, though.
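
A rough sketch of what that hook could look like combined with the meta-programming (reflection) approach; the protocol name and toJSON shape here are hypothetical, not Argo API, and it assumes a flat, one-level model:

// Hypothetical protocol: reflect over stored properties and rename keys as needed.
protocol AutoEncodable {
    func keyForProperty(property: String) -> String
}

extension AutoEncodable {
    // Default: assume property names and JSON keys match 1:1.
    func keyForProperty(property: String) -> String {
        return property
    }

    // Build a dictionary from the mirror of self, applying the key mapping.
    // Returns [String: Any] for simplicity; bridging to AnyObject would need casts.
    func toJSON() -> [String: Any] {
        var json: [String: Any] = [:]
        for child in Mirror(reflecting: self).children {
            guard let label = child.label else { continue }
            json[keyForProperty(label)] = child.value
        }
        return json
    }
}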

@jshier
Contributor Author

jshier commented Jun 15, 2015

Aren't the mirroring APIs used in the first example new to Swift 2? If toJSON() can be generated automatically, isn't that a very useful addition?

@tonyd256
Contributor

Looks like it, but I still don't think it helps for the reasons mentioned above.

@chriseidhof

I think the mirroring APIs I wrote about can be fun for prototyping, but they're hardly useful beyond that... I wouldn't use them for anything serious, because they keep your data model from being able to change.

@gfontenot
Collaborator

I think these new approaches are super interesting, but generally out of scope for what Argo is trying to do (functional JSON decoding). I'd probably recommend building/using an additional lib to handle the encoding phase if that's what you want.

@jshier
Contributor Author

jshier commented Jun 30, 2015

@tonyd256 So I was thinking about combining the approach described at the end of #10, where we turn the Argo intermediate JSON representation back into a dictionary, with a protocol implementation and extension in Swift 2. However, I'm still pretty unfamiliar with the functional aspects of Argo's decode() and the special operators, so I can't figure out how to map the JSON coming into decode() into a property of my struct. My thinking is that, by keeping it around, turning it back into something that can be serialized to disk or then turned into JSON would be pretty easy. Is it possible to preserve the JSON representation in decode()?

Here's what I was thinking for the Encodable protocol.

import Foundation

protocol Encodable {
    static func encode(json: JSON) -> AnyObject
}

extension Encodable {
    static func encode(json: JSON) -> AnyObject {
        switch json {
        case .Null: return NSNull()
        case let .String(v): return v
        case let .Number(v): return v
        case let .Array(a): return a.map(encode)
        case let .Object(v):
            var object: [String: AnyObject] = [:]
            for key in v.keys {
                if let value: JSON = v[key] {
                    object[key] = encode(value)
                } else {
                    object[key] = NSNull()
                }
            }
            return object
        }
    }
}

@tonyd256
Contributor

tonyd256 commented Jul 1, 2015

so I can't figure out how to map the JSON coming into decode() into a property of my struct.

This is the basis of Argo. I would hope that the basic examples in the README and the tests would help with understanding, but if you'd like me to explain anything specifically, please let me know.

turning it back into something that can be serialized to disk or then turned into JSON would be pretty easy

I can see how serialization like this would be useful for caching responses, but why not just cache the AnyObject returned from NSJSONSerialization or the NSData returned from the network request?
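
For example, a minimal sketch of caching the raw response bytes (the function and path handling are illustrative, not Argo API):

import Foundation

// Write the raw response body to the temporary directory so it can be
// re-parsed with NSJSONSerialization and re-decoded on the next launch.
func cacheResponseData(data: NSData, filename: String) -> Bool {
    let path = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent(filename)
    return data.writeToFile(path, atomically: true)
}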

Is it possible to preserve the JSON representation in decode()?

I'm not sure what you mean here. What are you trying to accomplish?

@jshier
Contributor Author

jshier commented Jul 1, 2015

@tonyd256 Sorry, I guess I wasn't clear. I can map values from the JSON to my model just fine. What I'm looking for is a way to preserve the Argo JSON itself by keeping it around in a property. decode() seems like the place to do that but I can't figure out a way to map the JSON to a property of my struct. Basically, this:

struct Thing {
    let property: String
    let json: JSON
}

extension Thing : Decodable {
    static func decode(j: JSON) -> Decoded<Thing> {
        return curry(Thing.init)
        <^> j <| "property"
        // preserve JSON?
    }
}

@tonyd256
Contributor

tonyd256 commented Jul 1, 2015

Oh interesting... did you try <*> pure(j) in place of // preserve JSON?
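
Putting that into the earlier example, the completed decode would look roughly like this (same Thing struct as above):

extension Thing : Decodable {
    static func decode(j: JSON) -> Decoded<Thing> {
        return curry(Thing.init)
        <^> j <| "property"
        <*> pure(j) // keeps the raw JSON around in the json property
    }
}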

@jshier
Contributor Author

jshier commented Jul 1, 2015

Dammit, totally my fault. I tried that yesterday, but it didn't like the fact that I was trying to map the JSON captured by decode as json to a property named json. I really wish Swift's compiler would warn about shadowing like that rather than just breaking everything. Sorry, dumb mistake. pure(j) worked just fine, thanks.

@tonyd256
Contributor

tonyd256 commented Jul 1, 2015

Great! Glad that worked.

@jshier
Contributor Author

jshier commented Jul 8, 2015

So I've bundled everything up into a protocol. I'll post it here mainly for posterity, but comments are welcome too. It does a bit much in that it actually performs the disk write, but we're able to throw that work onto a background queue, so it's not an issue.

protocol Serializable {
    var json: JSON { get }
}

extension Serializable {

    func encodeToDisk(filename filename: String) {
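        // NSFileManager.cachesPath and logDebug are helpers defined elsewhere in the
        // author's project; they are not Argo or Foundation APIs.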
        NSKeyedArchiver.archiveRootObject(encode(json), toFile: NSFileManager.cachesPath.stringByAppendingPathComponent(filename))
        logDebug("*** \(filename) successfully encoded to disk. ***")
    }

    func encode(json: JSON) -> AnyObject {
        switch json {
        case .Null: return NSNull()
        case let .String(v): return v
        case let .Number(v): return v
        case let .Array(a): return a.map(encode)
        case let .Object(v):
            var object: [String: AnyObject] = [:]
            for key in v.keys {
                if let value: JSON = v[key] {
                    object[key] = encode(value)
                } else {
                    object[key] = NSNull()
                }
            }
            return object
        }
    }
}

extension Serializable where Self: Decodable {
    static func decodeFromDisk<T: Decodable where T == T.DecodedType>(filename filename: String) -> Decoded<T> {
        if let unarchivedObject = NSKeyedUnarchiver.unarchiveObjectWithFile(NSFileManager.cachesPath.stringByAppendingPathComponent(filename)) {
            logDebug("*** \(filename) restoration succeeded. ***")
            return T.decode(JSON.parse(unarchivedObject))
        }
        else {
            return .TypeMismatch("Failed to decode object of type \(T.self) from storage.")
        }
    }
}

As you can see, I use NSKeyedArchiver to archive the reconstructed AnyObject to disk, then reverse the process with a generic decode to bring it back. Like I said, this is mainly for anyone who wants to encode Argo objects to disk (or, easily enough, back to JSON), but I'd appreciate any comments or improvements you want to make.

@stefangeir

I really need this, and @jshier, I'm trying your solution out. It looks like a good one. I'd like something similar in Argo.

@tonyd256
Contributor

@stefangeir I'm curious: what are you doing that needs this, and how are you using it?

@stefangeir

I have a generic user model object that I'm currently saving to disk and retrieving by using a class and implementing NSCoding. I need to be able to persist that model to disk.

I would save the user to disk when they log in to the app, and on startup I would load the user object from disk. I have no problem with saving the AnyObject or NSData returned from the network to disk, but then, for example, changing the user's name would mean diving into that saved JSON, finding the name field, and changing it.

Obviously I haven't thought about this as much as you have, so please tell me what I'm missing. I would love for this to be simple.

Currently I am refactoring a project so that all models are structs (they were classes) and implement Decodable (we used ObjectMapper previously).

@tonyd256
Contributor

To save the user to disk, could you have some sort of encode function that creates a [String: AnyObject] out of the user and then save that? When you load it back from disk you can use JSON.parse and User.decode to bring it back into the model form.
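
A compact sketch of that round trip (the helper names and path are hypothetical; encode() is the hand-written function shown below, and User is assumed to conform to Decodable):

import Argo
import Foundation

// Save: archive the hand-built dictionary to disk.
func saveUser(user: User, toPath path: String) -> Bool {
    return NSKeyedArchiver.archiveRootObject(user.encode(), toFile: path)
}

// Load: unarchive the dictionary and run it back through Argo.
func loadUser(fromPath path: String) -> Decoded<User>? {
    guard let stored = NSKeyedUnarchiver.unarchiveObjectWithFile(path) else { return nil }
    return User.decode(JSON.parse(stored))
}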

@stefangeir

Thanks, exactly what we're doing!

    public func encode() -> AnyObject {
        var placesArray = [[String: AnyObject]]()
        for place in places {
            placesArray.append(["name": place.name])
        }

        let nestedDictionary: [String: AnyObject] = ["gender": gender, "birthday": birthday, "places": placesArray]
        let encoded: AnyObject = ["id": id, "fullName": fullName, "profilePicture": profilePicture,
            "nested": nestedDictionary]
        return encoded
    }

@tonyd256
Contributor

Oh great! So then what makes you want to hold onto the JSON object and use that for encoding rather than this?

@stefangeir

Oh no, I don't prefer holding onto the JSON object; it would just be nice if Argo knew how to encode my Decodable object to something I can save to disk, since it already knows what the object looks like in JSON format.

@tonyd256
Contributor

Ah ok. I thought you were referring to the Serializable protocol above. You could probably make your encode function a bit easier.

extension Place {
  func encode() -> AnyObject {
    return ["name": name]
  }
}

extension User {
  func encode() -> AnyObject {
    return [
      "id": id,
      "fullName": fullName,
      "profilePicture": profilePicture,
      "nested": [
        "gender": gender,
        "birthday": birthday,
        "places": places.map { $0.encode() }
      ]
    ]
  }
}

We probably won't be adding something that does this to Argo, for a few reasons, but one is that going back to AnyObject really is that simple (see above). Even if Argo provided some Encodable protocol, it wouldn't provide much that a user couldn't just implement themselves:

protocol Encodable {
  func encode() -> AnyObject
}

Argo heavily relies on the type system to safely decode JSON into model objects. When going the other way, there's no static typing to lean on, so there really is no "safety" to be had.

It sounds like the main thing you're looking for is a mapping from JSON to model properties that Argo could use for both decoding and encoding. Argo, however, is dumb and holds no state; it has a lot of power and flexibility because of that. I don't think it's strange, though, to have separate functions for those things. Even NSCoding makes you implement two functions for them.

Anyway, I hope this helps.

@edwardaux
Contributor

I totally understand that encoding is not something that Argo wants to do, and it's fairly simple to do anyway. Just in case anyone is interested, I ended up writing a very simple Encodable protocol along similar lines to @stefangeir's, except that the protocol looks like:

public protocol Encodable {
    func encode() -> JSON
}

The nice thing about this approach (returning an Argo JSON object) is that it forces the implementor to make sure their encode function returns things that absolutely can be marshalled into JSON.
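
For example, a hypothetical Place model (borrowing the shape from the earlier comments) could conform by building up Argo's JSON cases directly:

struct Place {
    let name: String
}

extension Place: Encodable {
    // Everything returned here is already a JSON value, so it is guaranteed
    // to be representable as JSON.
    func encode() -> JSON {
        return .Object(["name": .String(name)])
    }
}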

See https://github.com/edwardaux/Ogra/blob/master/Ogra/Encodable.swift for my implementation. Hope that helps others (like me) that do happen to need re-encoding back to JSON.

Thanks again for an awesome Argo library... one of my absolute favourites!

@stefangeir

Thanks a lot, @tonyd256. I changed my encode function to your version. Your reasoning for not implementing this in Argo makes perfect sense.

@muescha

muescha commented Jan 10, 2017

I'm also looking for some kind of encoder. The best thing would be if I could define a mapper once and have it used for both encoding and decoding.

@tonyd256
Contributor

Does Ogra work? https://github.com/edwardaux/Ogra

@muescha

muescha commented Jan 10, 2017

Thanks for pointing that out, but there I'd need to do the mapping twice.

PS: I expected something like the mapping in ObjectMapper.

@lmsmartins

ObjectMapper doesn't maintain the order of the attributes when encoding to JSON, which could be a big hassle when creating hashes. Perhaps this could be taken into account when developing the encoder for this library.
