Encode to JSON? #139
I've done something to that end in #47, but haven't had the time yet to create its own framework for it.
@jshier There is no planned support for this. Argo is focused on easily and safely decoding your JSON to native models. Going from your models to JSON is more straightforward. Check out #10, #42, and #47. I'd be curious to see how you would expect to use this functionality. I'm closing this for now, but please continue to use this thread for discussion.
@tonyd256 The intended usage is exactly what I mentioned. Encoding back to JSON lets me, say, update backend data with current state. It would also easily enable archiving to disk and back, letting me cache or otherwise persist data. Perhaps an intermediate dictionary representation would do that, but it's all part of the same process. Basically, I'm looking for a native Swift Mantle replacement. I like Argo's JSON decoding ability more than, say, ObjectMapper, but I'm looking for more functionality going the other way. I'll look at the topics suggested and see what works.
@jshier I get that, but I'm more interested in how you would implement it or expect to use it in code. The way I see making this is implementing an encoding protocol. Doing things this way would mean that's all Argo is missing. I'm curious if you have ideas on an implementation that would help with type safety.
I was wondering if anyone's thoughts around this had changed with Swift 2. Some points of reference:
Hmm, I don't think anything from Swift 2 helps us here, or at least I didn't make the connection. I think the metaprogramming approach is interesting, but it assumes the property names of the Swift model are the same as the key names in the JSON, which is commonly not the case. This could be corrected with a function on the protocol that maps property names to keys. In general, our thoughts on the matter are that if we could provide type safety when going from model to JSON then it would be worth it; however, I feel that a simple encoding function may be all that's needed.
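As a rough illustration of that kind of correction (entirely hypothetical; `keyMap` and `encodeProperties` are not Argo API), a reflection-based encoder could consult a per-type map from Swift property names to JSON keys, falling back to the property name when no mapping exists:

```swift
// Hypothetical sketch, not Argo API: bridge Swift property names to JSON keys
// using reflection plus an explicit per-type key map.
protocol JSONKeyMapped {
    // Maps Swift property name -> JSON key; unmapped names pass through as-is.
    static var keyMap: [String: String] { get }
}

func encodeProperties<T: JSONKeyMapped>(_ value: T) -> [String: Any] {
    var out: [String: Any] = [:]
    for child in Mirror(reflecting: value).children {
        guard let propertyName = child.label else { continue }
        let key = T.keyMap[propertyName] ?? propertyName
        out[key] = child.value
    }
    return out
}

struct User: JSONKeyMapped {
    static let keyMap = ["fullName": "full_name"]
    let id: Int
    let fullName: String
}

let dict = encodeProperties(User(id: 1, fullName: "Ada Lovelace"))
// dict contains "id" and "full_name" keys
```

The usual caveat with `Mirror` applies, as noted later in this thread: it couples the serialized format to the model's stored-property layout.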
Aren't the mirroring APIs used in the first example new to Swift 2?
Looks like it, but I still don't think it helps for the reasons mentioned above.
I think the mirroring APIs I wrote about can be fun for prototyping but are hardly ever useful beyond that... I wouldn't use them for anything serious, because they prevent your data model from changing.
I think these new approaches are super interesting, but generally out of scope for what Argo is trying to do (functional JSON decoding). I'd probably recommend building/using an additional lib to handle the encoding phase if that's what you want.
@tonyd256 So I was thinking about combining the approach described at the end of #10, where we turn the Argo intermediate JSON representation back into a dictionary, with a protocol implementation and extension in Swift 2. However, I'm still pretty unfamiliar with the functional aspects of Argo's implementation. Here's what I was thinking for the protocol:

```swift
import Foundation

protocol Encodable {
    static func encode(json: JSON) -> AnyObject
}

extension Encodable {
    static func encode(json: JSON) -> AnyObject {
        switch json {
        case .Null: return NSNull()
        case let .String(v): return v
        case let .Number(v): return v
        case let .Array(a): return a.map(encode)
        case let .Object(v):
            var object: [String: AnyObject] = [:]
            for key in v.keys {
                if let value: JSON = v[key] {
                    object[key] = encode(value)
                } else {
                    object[key] = NSNull()
                }
            }
            return object
        }
    }
}
```
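For comparison, a self-contained version of the same walk in current Swift syntax; the `JSON` enum below is a simplified stand-in for Argo's actual type, and `Any` replaces Swift 2's `AnyObject`:

```swift
import Foundation

// Simplified stand-in for Argo's JSON enum (not the real type).
indirect enum JSON {
    case null
    case string(String)
    case number(Double)
    case array([JSON])
    case object([String: JSON])
}

// Recursively converts the JSON tree into Foundation-compatible values.
func encode(_ json: JSON) -> Any {
    switch json {
    case .null: return NSNull()
    case let .string(s): return s
    case let .number(n): return n
    case let .array(a): return a.map(encode)
    case let .object(o):
        var dict: [String: Any] = [:]
        for (key, value) in o { dict[key] = encode(value) }
        return dict
    }
}

let user: JSON = .object(["name": .string("Ada"), "tags": .array([.string("admin")])])
let encoded = encode(user) as! [String: Any]
```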
This is the basis of Argo. I would hope that the basic examples in the README and the tests would help with understanding but if you'd like me to explain anything specifically, please let me know.
I can see how serialization like this would be useful for caching responses, but why not just cache the
I'm not sure what you mean here. What are you trying to accomplish?
@tonyd256 Sorry, I guess I wasn't clear. I can map values from the JSON to my model just fine. What I'm looking for is a way to preserve the Argo `JSON` value itself:

```swift
struct Thing {
    let property: String
    let json: JSON
}

extension Thing: Decodable {
    static func decode(j: JSON) -> Decoded<Thing> {
        return curry(Thing.init)
            <^> j <| "property"
            // preserve JSON?
    }
}
```
Oh interesting... did you try
Dammit, totally my fault. I tried that yesterday, but it didn't like the fact that I was trying to map the JSON captured by decode as json to a property named json. I really wish Swift's compiler would warn about shadowing like that rather than just breaking everything. Sorry, dumb mistake.
Great! Glad that worked.
So I've bundled everything up into a protocol. I'll post it here mainly for posterity, but comments are welcome too. It does do a bit much in that it actually does the disk write, but we're able to then throw the command off into the background, so it's not an issue.

```swift
protocol Serializable {
    var json: JSON { get }
}

extension Serializable {
    func encodeToDisk(filename filename: String) {
        NSKeyedArchiver.archiveRootObject(encode(json), toFile: NSFileManager.cachesPath.stringByAppendingPathComponent(filename))
        logDebug("*** \(filename) successfully encoded to disk. ***")
    }

    func encode(json: JSON) -> AnyObject {
        switch json {
        case .Null: return NSNull()
        case let .String(v): return v
        case let .Number(v): return v
        case let .Array(a): return a.map(encode)
        case let .Object(v):
            var object: [String: AnyObject] = [:]
            for key in v.keys {
                if let value: JSON = v[key] {
                    object[key] = encode(value)
                } else {
                    object[key] = NSNull()
                }
            }
            return object
        }
    }
}

extension Serializable where Self: Decodable {
    static func decodeFromDisk<T: Decodable where T == T.DecodedType>(filename filename: String) -> Decoded<T> {
        if let unarchivedObject = NSKeyedUnarchiver.unarchiveObjectWithFile(NSFileManager.cachesPath.stringByAppendingPathComponent(filename)) {
            logDebug("*** \(filename) restoration succeeded. ***")
            return T.decode(JSON.parse(unarchivedObject))
        } else {
            return .TypeMismatch("Failed to decode object of type \(T.self) from storage.")
        }
    }
}
```

As you can see, I use the
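As a point of comparison, here's a minimal modern-Swift sketch of the same save/load round trip using `JSONSerialization` and the temp directory in place of the custom `cachesPath` and `logDebug` helpers above (the function names here are illustrative only, not from the thread's code):

```swift
import Foundation

// Sketch only: persist a JSON-compatible dictionary and read it back.
func saveToDisk(_ object: [String: Any], filename: String) throws -> URL {
    let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(filename)
    let data = try JSONSerialization.data(withJSONObject: object)
    try data.write(to: url)
    return url
}

func loadFromDisk(_ url: URL) throws -> [String: Any]? {
    let data = try Data(contentsOf: url)
    return try JSONSerialization.jsonObject(with: data) as? [String: Any]
}

let url = try! saveToDisk(["id": 1, "name": "Ada"], filename: "user.json")
let restored = try! loadFromDisk(url)
```

Like `encodeToDisk` above, the write could be pushed onto a background queue since nothing here touches shared state.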
Really need this, and @jshier, I'm trying your solution out. Looks like a good one. Would like something similar in Argo.
@stefangeir I'm curious about what you're doing that needs this and how you're using it?
I have a generic user model object that I'm currently saving to disk and retrieving by using a class and implementing NSCoding. I need to be able to persist that model to disk: I would save the user to disk when he logs in to the app, and on startup I would load the user object from disk. I have no problem with saving the

Obviously I haven't thought about this as much as you; please tell me what I'm missing. I would love for this to be simple. Currently I am refactoring a project so that all models are structs (they were classes) and implement
To save the user to disk, could you have some sort of
Thanks, exactly what we're doing!

```swift
public func encode() -> AnyObject {
    var placesArray = [[String: AnyObject]]()
    for place in places {
        placesArray.append(["name": place.name])
    }
    let nestedDictionary: [String: AnyObject] = ["gender": gender, "birthday": birthday, "places": placesArray]
    let encoded: AnyObject = ["id": id, "fullName": fullName, "profilePicture": profilePicture,
                              "nested": nestedDictionary]
    return encoded
}
```
Oh great! So then what makes you want to hold onto the `JSON` object?
Oh no, I don't prefer holding onto the JSON object. It would just be nice if Argo knew how to encode my Decodable object to something I can save to disk, since it already knows what it looks like in JSON format.
Ah ok, I thought you were referring to the `JSON` value itself. You could do it like this:

```swift
extension Place {
    func encode() -> AnyObject {
        return ["name": name]
    }
}

extension User {
    func encode() -> AnyObject {
        return [
            "id": id,
            "fullName": fullName,
            "profilePicture": profilePicture,
            "nested": [
                "gender": gender,
                "birthday": birthday,
                "places": places.map { $0.encode() }
            ]
        ]
    }
}
```

We probably won't be adding something that does this to Argo for a few reasons, but I think one is that going back to JSON needs nothing more than a simple protocol:

```swift
protocol Encodable {
    func encode() -> AnyObject
}
```

Argo heavily relies on the type system to safely decode JSON into model objects. When going the other way there is no static type system, so there really is no "safety" to be had. It sounds like the main thing you're looking for is a mapping from JSON to model properties so Argo can use that to decode and encode. Argo is dumb, however, and holds no state; it has a lot of power and flexibility because of this. I don't think it's very strange, though, to have separate functions for those things. Anyways, I hope this helps.
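In current Swift syntax that protocol might look like the following sketch; it's renamed `JSONEncodable` here because Swift now ships its own `Encodable`, and `Any` replaces Swift 2's `AnyObject` (the model properties are borrowed from the example above):

```swift
import Foundation

// Modern-syntax sketch of the protocol discussed above; JSONEncodable is a
// renamed stand-in to avoid colliding with Swift's built-in Encodable.
protocol JSONEncodable {
    func encodeJSON() -> Any
}

struct Place: JSONEncodable {
    let name: String
    func encodeJSON() -> Any { ["name": name] }
}

struct User: JSONEncodable {
    let id: Int
    let fullName: String
    let places: [Place]
    func encodeJSON() -> Any {
        [
            "id": id,
            "fullName": fullName,
            "places": places.map { $0.encodeJSON() },
        ]
    }
}

// The result is directly serializable by Foundation.
let user = User(id: 1, fullName: "Ada", places: [Place(name: "London")])
let data = try! JSONSerialization.data(withJSONObject: user.encodeJSON())
```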
I totally understand that encoding is not something that Argo wants to do, and it's fairly simple to do anyway. Just in case anyone is interested, I ended up writing a very simple `Encodable` protocol of my own.
The nice thing about this approach is that it returns an Argo `JSON` value. See https://github.com/edwardaux/Ogra/blob/master/Ogra/Encodable.swift for my implementation. Hope that helps others (like me) that do happen to need re-encoding back to JSON. Thanks again for an awesome Argo library... one of my absolute favourites!
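To make that shape concrete, here's a minimal sketch in current Swift syntax; the `JSON` enum is a simplified stand-in for Argo's type, and `JSONRepresentable` is a made-up name rather than Ogra's actual API:

```swift
// Simplified stand-in for Argo's JSON enum (not the real type).
indirect enum JSON: Equatable {
    case null
    case string(String)
    case number(Double)
    case array([JSON])
    case object([String: JSON])
}

// Hypothetical protocol in the spirit of Ogra's Encodable: models re-encode
// into the JSON enum rather than a loosely typed Any.
protocol JSONRepresentable {
    func encode() -> JSON
}

struct Place: JSONRepresentable {
    let name: String
    func encode() -> JSON { .object(["name": .string(name)]) }
}

struct User: JSONRepresentable {
    let fullName: String
    let places: [Place]
    func encode() -> JSON {
        .object([
            "fullName": .string(fullName),
            "places": .array(places.map { $0.encode() }),
        ])
    }
}

let json = User(fullName: "Ada", places: [Place(name: "London")]).encode()
```

Because the intermediate value is the same enum Argo decodes from, the re-encoded value can round-trip through the existing decode path.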
Thanks a lot, @tonyd256. I changed my encode function to your version. Your reasoning for not implementing this for Argo makes perfect sense.
I'm also looking for an encoder. The best thing would be being able to define a mapper once and then have that mapper used for both encoding and decoding.
Does Ogra work? https://github.com/edwardaux/Ogra
Thanks for pointing that out. PS: I expected something like the mapping in ObjectMapper.
ObjectMapper doesn't maintain the order of the attributes when encoding to JSON, which can be a big hassle when creating hashes. Perhaps this could be taken into account when developing an encoder for this library.
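For what it's worth, Foundation's `JSONSerialization` has a `.sortedKeys` writing option that produces deterministic key order, which helps when hashing serialized output:

```swift
import Foundation

// Deterministic key order via .sortedKeys, useful when hashing the output.
let object: [String: Any] = ["beta": 2, "alpha": 1]
let data = try! JSONSerialization.data(withJSONObject: object, options: [.sortedKeys])
let text = String(data: data, encoding: .utf8)!
// "alpha" is emitted before "beta" regardless of dictionary iteration order
```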
Is there any planned support for reversing the decode process and encoding back to JSON? I'm thinking this could be a good way to also automatically support serialization to disk.