
Interpretation of Day 3 Resolutions #73

Closed
OR13 opened this issue Apr 11, 2023 · 29 comments

@OR13
Contributor

OR13 commented Apr 11, 2023

I expect that this section runs afoul of a presumption behind a resolution made at the most recent VCWG F2F meeting, namely that "Transformation rules MUST be defined". Specifically, I expect that a number of the individuals in the WG presumed the intent was to have normative language, defined somewhere, that was testable. The normative language doesn't have to be the only way; "other transformation mechanisms MAY be used to translate to the core data model" (understanding that language like that introduces interoperability challenges). Fundamentally, the language in this section isn't normative and therefore won't be tested by the WG.

It is imperative that, if the WG is defining transformation rules, those rules be testable and demonstrated to provide interoperability at some layer. Not providing that assurance (demonstrable interoperability) for a specification being produced by the WG would be a failure of our remit.

https://www.w3.org/2017/vc/WG/Meetings/Minutes/2023-02-16-vcwg#resolution1

This section needs to be something that has normative language, is tested during CR, and results in measurable demonstrations of multi-vendor interoperability.

@OR13
Contributor Author

OR13 commented Apr 11, 2023

@selfissued @msporny @jandrieu we seem to have pretty different interpretations of "how" we should be implementing the day 3 resolutions.

I would like to get on the same page regarding the "mapping" versions of vc-jwt.

cc also @Sakurann @brentzundel regarding interpretation / actions for the working group from the F2F.

@OR13
Contributor Author

OR13 commented Apr 11, 2023

Here is my opinion.

  1. Providing an example mapping is sufficient; we don't want or need normative requirements to do anything with the mapping.
  2. The mapping does not need to be tested, since it will have no normative requirements.
  3. The "MUST" was directed at the working group, not a suggestion for "specific spec text", and it is therefore satisfied by the fact that we have "shown" a mapping is possible, not that "the WG requires the mapping in normative text".

As an aside, the mappings in version 1.1 are a source of great pain, and we should learn from the lessons there and not encourage complicated and lossy transformations, even if we are required to acknowledge that they exist.

@iherman
Member

iherman commented Apr 12, 2023

@OR13 I am a bit afraid that, if we go down the path you propose, we will face a Formal Objection as we did with DID. If we do not have at least one normative mapping (i.e., a "MUST"), then how would we prove interoperability between vc-jwt and the VCDM? If we cannot prove it, then how exactly do these standards fit together?

@OR13
Contributor Author

OR13 commented Apr 12, 2023

@iherman The vc data model has no security / interoperability tests; its interoperability is established by JSON data model assertions... You test conformance by reviewing JSON.

JWT and Data Integrity Proofs have additional security and interoperability tests to consider.

The mapping is not a consideration for those security tests, but the output of the mapping IS relevant to the core data model JSON-LD tests.

You don't need to "do a mapping" to verify a JWT, in an interoperable way.

After verifying a JWT, if you wish, you may apply one or more mappings to produce "vc+ld+json".

If you don't apply the mapping, you are not holding a "vc+ld+json"... and therefore, you don't need to worry about "interoperability testing" with "vc+ld+json".

Does that make sense?
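To make that flow concrete, here is a minimal TypeScript sketch. The jose library and its jwtVerify / importJWK functions are real; mapClaimsToVcLdJson is a hypothetical placeholder, since which mapping (if any) to apply is exactly what this issue is debating:

```ts
import { jwtVerify, importJWK, type JWK, type JWTPayload } from 'jose';

// Hypothetical placeholder -- how claims become "vc+ld+json" is the open
// question here, so this stands in for whichever mapping a consumer picks.
function mapClaimsToVcLdJson(claims: JWTPayload): Record<string, unknown> {
  return { '@context': ['https://www.w3.org/ns/credentials/v2'], ...claims };
}

async function consume(jwt: string, publicJwk: JWK, needsLinkedData: boolean) {
  const key = await importJWK(publicJwk, 'ES256');
  // Step 1: ordinary JWT verification -- no mapping involved.
  const { payload } = await jwtVerify(jwt, key);
  // Step 2 (optional): map only if a "vc+ld+json" document is needed,
  // e.g. to run it through the core data model tests; otherwise stop here.
  return needsLinkedData ? mapClaimsToVcLdJson(payload) : payload;
}
```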

@iherman
Member

iherman commented Apr 13, 2023

> @iherman The vc data model has no security / interoperability tests; its interoperability is established by JSON data model assertions... You test conformance by reviewing JSON.

... and that may become a problem. We should prove that a VC issued through one implementation can be consumed by others. That should be part of the CR tests.

But that is a different discussion.

> JWT and Data Integrity Proofs have additional security and interoperability tests to consider.
>
> The mapping is not a consideration for those security tests, but the output of the mapping IS relevant to the core data model JSON-LD tests.
>
> You don't need to "do a mapping" to verify a JWT, in an interoperable way.

You need to do a mapping to prove that this is a Verifiable Credential. You also need to prove that a VC expressed as a JWT can be consumed by a non-JWT based implementation. That requires a normative mapping. Otherwise the JWT world becomes its own silo. And that may raise lots of eyebrows down the line outside this WG.

(I have the feeling I am repeating old arguments here, and that worries me.)

Let me put it another way: why is it a problem if that mapping is normative in the spec? Which applications are harmed by having that mapping normatively defined? I really do not understand why this became an issue...

> After verifying a JWT, if you wish, you may apply one or more mappings to produce "vc+ld+json".
>
> If you don't apply the mapping, you are not holding a "vc+ld+json"... and therefore, you don't need to worry about "interoperability testing" with "vc+ld+json".
>
> Does that make sense?

See above.

@Sakurann
Contributor

The point of the test suites in W3C is to prove that the features defined in each of the specifications are independently implementable. It is not to guarantee interoperability between any of the VC implementations -- that was not the case in VCDM v1.1, and it is not for VCDM v2.0.

> We should prove that a VC issued through one implementation can be consumed by others.
>
> You also need to prove that a VC expressed as a JWT can be consumed by a non-JWT based implementation.

It is unrealistic to mandate or expect that any implementer using only data integrity can also consume a VC issued by an implementer using only JWT/JWS.

We were able to reach the Miami resolution because we agreed there could be multiple mappings to the base data model. Hence the mapping defined in vc-jwt is A mapping, not THE mapping, and there is no need to test it.

@iherman
Member

iherman commented Apr 13, 2023

> We were able to reach the Miami resolution because we agreed there could be multiple mappings to the base data model. Hence the mapping defined in vc-jwt is A mapping, not THE mapping, and there is no need to test it.

How will we prove that this mapping is indeed correct? Because if we cannot prove it somehow, it is not useful...

@msporny
Member

msporny commented Apr 13, 2023

@OR13 wrote:

> 1. Providing an example mapping is sufficient; we don't want or need normative requirements to do anything with the mapping.

I'd change the above to say "Providing a normative mapping that can be tested, which is one option among many, is sufficient. We need normative requirements in order to test the mapping."

Waving our hands and saying "There are many ways to do the mapping." is inviting non-interoperability in the ecosystem. What that is basically saying is: "There is no way that the WG suggests that JWTs should be mapped to VCs; you can do anything you want." -- at which point, the non-normative mapping is useless.

> 2. The mapping does not need to be tested, since it will have no normative requirements.

I'd change this to: "The mapping needs to be tested."

If we don't test the algorithm, we're inviting Formal Objections on the specification. The argument will be: "You have not demonstrated interoperability on the only algorithm you've provided for transformation."

It is a logical certainty that you can arbitrarily map one data structure to another one in a uni-directional way. We don't need a specification to say things that are just logically true. IOW, outlining one algorithm that we know exists (in the logical sense) and then saying "others may exist", is a fairly useless thing to do in a global standard.

The purpose of standards is to define (ideally) a single mechanism that will be widely used and adopted in industry, and that will lead to interoperability. That is one of the fundamental reasons that we write standards.

  1. The "MUST" was directed to the working group, not a suggestion for "specific spec text", and it is therefore satisfied by the fact that we have "shown" a mapping is possible, not that "the wg requires the mapping in normative text".

It's clear that one of the points of contention is this text: "Transformation rules MUST be defined"

My interpretation of that text is: "The WG (or someone else) MUST define transformation rules" -- that is, they MUST exist (in a way that has some normative weight), otherwise, it's just prose that no one needs to pay attention to or implement (which is largely useless for an algorithm that is fairly critical to interoperability).

We can all convey what we thought that statement meant, but it's clear that a number of us interpreted that statement to mean something different. Debating "what was meant" is probably not going to get us through this next stage, though, because the concern at this point is Formal Objections based on the fact that the WG has defined zero normative rules on how to map a JWT to a VC. As I said above, saying that something is possible, but you can do anything, is a recipe for non-interoperability.

> As an aside, the mappings in version 1.1 are a source of great pain, and we should learn from the lessons there and not encourage complicated and lossy transformations, even if we are required to acknowledge that they exist.

Yes, we should learn from the lessons there and not encourage complicated and lossy transformations. The way to do that is to define a normative transformation that is the way the VC-JWT folks recommend that it happens, and state that other transformations are possible, but really, use THIS one (to achieve broader interop).

The VC-JWT folks have been given another chance at getting the transformation algorithm right this time... but defining it in a way where no one has to follow it is a recipe for non-interop.

What I'd suggest is that specifications defining application/vc+* media types:

  1. MUST define one normative transformation algorithm.
  2. MUST note whether the algorithm(s) are unidirectional (lossy) or bidirectional (loss-less).
  3. MUST demonstrate interoperability for the algorithm via a test suite.
  4. MAY note that other interoperable algorithms are possible.

One way this could be accomplished is noting that each of these specs defines at least a Convert To Verifiable Credential algorithm, and possibly a Convert From Verifiable Credential algorithm.
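Sketched as a TypeScript interface (all names here are illustrative assumptions, not text from any spec), that suggestion could look like:

```ts
// Minimal shape for the core data model output; illustrative only.
interface VerifiableCredential {
  '@context': string[];
  type: string[];
  [property: string]: unknown;
}

interface VcTransformation<Serialized> {
  // Point 1: the one normative, testable algorithm.
  convertToVerifiableCredential(input: Serialized): VerifiableCredential;
  // Point 2: present only when the algorithm is bidirectional (loss-less).
  convertFromVerifiableCredential?(vc: VerifiableCredential): Serialized;
}
```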

@OR13
Contributor Author

OR13 commented Apr 13, 2023

@iherman

> How will we prove that this mapping is indeed correct? Because if we cannot prove it somehow, it is not useful...

I think you are missing the part that is useful.

JWT is an RFC, and has wide deployment.

Many different "linked data views" of JWTs exist today; I have created a few myself. I don't expect any of them to get consensus as "the only way to convert from JSON to a JSON-LD VC".

The problem with CBOR-LD is similar: there are many ways that CBOR-LD can be mapped to JSON-LD, and in fact we expect some of those mappings will probably change as CBOR-LD proceeds through standardization.

This working group does not need to define how CBOR-LD maps to "vc+ld+json"; it is sufficient to test a "mapped representation" against the "core data model tests", and if that mapping passes the core data model tests, we can say CBOR-LD is a representation of a "W3C Verifiable Credential".

If this argument does not have consensus, then I suggest we pause the conversations regarding "mappings", and not comment on them in the working group.

@msporny

> If we don't test the algorithm, we're inviting Formal Objections on the specification. The argument will be: "You have not demonstrated interoperability on the only algorithm you've provided for transformation."

The working group is only required to test normative statements.

We are not required to test optional and non-normative transformations; this applies to ACDCs, CBOR-LD, and JSON-formatted "W3C Verifiable Credentials".

We can't define normative mappings for work that happens outside the working group.

The resolution says the mappings are not required to be done in this working group:

> Transformation rules MUST be defined, but not necessarily by this WG.

"MUST be defined" does not imply "normatively and with tests", nor does it imply "uniqueness".

It would be better to do the mapping outside of the working group, and then "crown the one mapping to rule them all" in the working group.

This remains a good option for us to consider.

> My interpretation of that text is: "The WG (or someone else) MUST define transformation rules" -- that is, they MUST exist (in a way that has some normative weight), otherwise, it's just prose that no one needs to pay attention to or implement (which is largely useless for an algorithm that is fairly critical to interoperability).

I agree with this interpretation, but not the "uniqueness" part of it.

JSON-LD is an open-world data model; there are lots of valuable ways to produce a valid "vc+ld+json", and all of them are "valid" as long as they are defined in a spec and the resulting object passes the core data model conformance tests.

I will note that not all those mappings are useful, but the same is true of the core data model's JSON-LD term definitions, or arbitrary credential examples in general.

> The way to do that is to define a normative transformation that is the way the VC-JWT folks recommend that it happens, and state that other transformations are possible, but really, use THIS one (to achieve broader interop).

I might agree with this; my preferred "mapping" is the "identity mapping" that is used when cty: vc+ld+json is present.

We could see if marking that mapping as RECOMMENDED might get consensus; it is already in the current document.
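For illustration, a sketch of that identity mapping (the DecodedJws shape is an assumption for the example):

```ts
// When cty is vc+ld+json, the JWT payload already *is* the credential,
// so the "mapping" is a no-op apart from checking the content type.
interface DecodedJws {
  protectedHeader: { alg: string; cty?: string };
  payload: Record<string, unknown>;
}

function identityMapping(jws: DecodedJws): Record<string, unknown> {
  if (jws.protectedHeader.cty !== 'vc+ld+json') {
    throw new Error('identity mapping applies only when cty is vc+ld+json');
  }
  return jws.payload; // already a "vc+ld+json" document
}
```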

> The VC-JWT folks have been given another chance at getting the transformation algorithm right this time... but defining it in a way where no one has to follow it is a recipe for non-interop.

I think you mean the VC WG members... The tone of this statement is a bit triggering for me as an editor.

It is also too vague a statement to respond to... what do you mean by "non-interop"?

I consider JWT interop in the following dimensions:

  1. sign and verify function consistently for JWTs/JWSs across implementations.
  2. required claims in the specific typ are processed consistently across implementations.
  3. the claimset after mapping to "vc+ld+json" passes the core data model interoperability tests.

If all 3 of these things are true, there is interop...

I don't need to do (3) unless I am intending to process the credential as RDF....

There might be very good reasons why some implementers want to produce different JSON-LD prior to importing into a graph database: consider the cases where terms were defined improperly, where language translations needed to be added as annotations, or where any other JSON-LD enhancements needed to be added to make the graph representation nicer.

If you make the mapping normative, you prevent implementers from injecting useful/useless JSON-LD into the "vc+ld+json" as part of the transformation... this destroys the value of the transformation.

> What I'd suggest is that specifications defining application/vc+* media types:
>
> 1. MUST define one normative transformation algorithm.
> 2. MUST note whether the algorithm(s) are unidirectional (lossy) or bidirectional (loss-less).
> 3. MUST demonstrate interoperability for the algorithm via a test suite.
> 4. MAY note that other interoperable algorithms are possible.
>
> One way this could be accomplished is noting that each of these specs defines at least a Convert To Verifiable Credential algorithm, and possibly a Convert From Verifiable Credential algorithm.

^ I suggest we move this to a separate issue in the "vc-data-model", since it is guidance that does not apply exclusively to vc-jwt.

It's relevant to this open PR: w3c/vc-specs-dir#14

And this open issue: w3c/vc-data-model#1048

@mprorock
Contributor

> I consider JWT interop in the following dimensions:
>
> 1. sign and verify function consistently for JWTs/JWSs across implementations.
> 2. required claims in the specific typ are processed consistently across implementations.
> 3. the claimset after mapping to "vc+ld+json" passes the core data model interoperability tests.
>
> If all 3 of these things are true, there is interop...
>
> I don't need to do (3) unless I am intending to process the credential as RDF....

this is 100% my read as well

@iherman
Member

iherman commented Apr 14, 2023

> @iherman
>
> > How will we prove that this mapping is indeed correct? Because if we cannot prove it somehow, it is not useful...
>
> I think you are missing the part that is useful.
>
> JWT is an RFC, and has wide deployment.

I was not questioning that at all; I am of course perfectly aware that there is a wide deployment of JWT.

> Many different "linked data views" of JWTs exist today; I have created a few myself. I don't expect any of them to get consensus as "the only way to convert from JSON to a JSON-LD VC".

And that is fine.

The question I was asking was different; apologies if I was not clear. Let us put aside whether a given transformation is THE transformation.

There will be a W3C Recommendation that contains a claim which says "this is a way to transform this type of vc using jwt into a bona fide vcdm representation". The presence of this transformation in the specification is key, because this is what links the various documents this WG is producing together. I think we are in agreement so far.

My question is: how do you prove that this transformation is correct? What type of testing regime will we have to make the claim in the spec more than just a claim? If we want the W3C Membership to approve that document as a standard, this claim must be proven.

@iherman
Member

iherman commented Apr 14, 2023

> My interpretation of that text is: "The WG (or someone else) MUST define transformation rules" -- that is, they MUST exist (in a way that has some normative weight), otherwise, it's just prose that no one needs to pay attention to or implement (which is largely useless for an algorithm that is fairly critical to interoperability). [from @msporny]
>
> I agree with this interpretation, but not the "uniqueness" part of it. [from @OR13]

I have the feeling that the "uniqueness" issue is misleading the discussion.

I am not seeking a "unique" mapping. What I am looking for is a transformation that has been vetted by the W3C Process, i.e., one that has undergone the same rigorous checks (through, e.g., the CR process of the recommendation-to-be) as any other aspect of the specification. I have absolutely no problem if the text makes it clear that the transformation is not the unique one, or that specific implementations may choose not to implement it because they do not need the Linked Data aspects. But if an implementation needs the Linked Data aspects and, therefore, needs to do a transformation, the specification must provide a vetted, proven way of doing this.

At least as far as I am concerned, that is the reason why I'd prefer the mapping to be normative.

@OR13
Contributor Author

OR13 commented Apr 14, 2023

The day 3 resolution says that a mapping MUST exist and that it does not need to be created by this working group.

It seems not in the spirit of that resolution to assert that the mapping needs to have normative weight that comes from the W3C process, especially because we have been operating under the assumption that mDoc, ACDCs, Gordian, and other systems can define mappings outside the W3C and not be treated as second-class citizens.

We intentionally positioned the current language to be respectful of mappings outside the W3C.

It feels like we would be going back on that commitment if we take your suggestion.

I think there will be objections if we start (continue?) using the W3C process to try to push some formats over others.

I feel this has already happened with:

https://w3id.org/security#proof

@iherman
Member

iherman commented Apr 14, 2023

> It seems not in the spirit of that resolution to assert that the mapping needs to have normative weight that comes from the W3C process, especially because we have been operating under the assumption that mDoc, ACDCs, Gordian, and other systems can define mappings outside the W3C and not be treated as second-class citizens.
>
> We intentionally positioned the current language to be respectful of mappings outside the W3C.

I do not think there is a contradiction. Mappings can indeed be defined outside the W3C. They can be defined by individuals, by outside organizations like IETF, etc. And, obviously, W3C has no say there.

But we are talking, in this case, of a mapping that is part of a document that will be, eventually, voted upon by the W3C membership to be promoted as a W3C Recommendation. The W3C membership has to be convinced that this specification is an integral part of the VC WG's work and the WG's charter and, therefore, has to trust that the connection between the JWT and the linked data approaches is indeed present, and that it is technically correct.

I am not saying that this would be the only mapping. I am also not saying that every conformant implementation MUST provide that mapping. I am saying that a proof of correctness of the mapping should be provided by the CR process, and I do not know any other way than to have (experimental, maybe) implementations (note the plural) that prove that the mapping is properly specified (i.e., implementation can be done) and that what the mapping produces is indeed a bona fide VC in JSON-LD form. That is exactly what the CR process is created for.


(The standard disclaimer applies: this is my personal opinion and not some sort of an official W3C statement.)

@decentralgabe
Collaborator

decentralgabe commented Apr 14, 2023

I agree with your interpretation of the day 3 resolution @OR13 -- and with respect to the purpose of this issue, my statements end there.

Going beyond your initial intent with the issue, and the resolutions made at the F2F, I sympathize with the points @iherman and @msporny are making. I see the benefit of having at least one testable representation for each media type, with language clarifying that it is not to be seen as the one true mapping.

I would amend what @msporny wrote to suggest... (bold are my amendments):

MUST define **at least one non-normative** transformation algorithm.
MUST note whether the algorithm(s) are unidirectional (lossy) or bidirectional (loss-less).
MUST demonstrate interoperability for the algorithm via a test suite.
MAY note that other interoperable algorithms are possible.

@msporny
Member

msporny commented Apr 14, 2023

> MUST define **at least one non-normative** transformation algorithm.

One of the points that both @iherman and I are trying to make is: If it's non-normative, not a single implementation needs to implement the transformation. That is: Every implementer can say: "That's not a requirement of the specification, I don't need to implement it!" OR, they can say: "Yeah, I implemented half of that algorithm, but not the other half, because it's non-normative and I don't have to implement it as stated to be conformant to anything."

That's the interoperability danger that both @iherman and I are attempting to highlight.

Non-normative language === permission to implement whatever you want without any repercussions

Standards have very few tools to drive implementer behavior -- normative language is one of them, and it's one of them because many large governments and organizations enforce the normative language by refusing to purchase solutions that do not implement the normative language.

@Sakurann
Contributor

> How will we prove that this mapping is indeed correct? Because if we cannot prove it somehow, it is not useful...

It is a non-normative mapping, so the WG does not need to prove it. It is useful without the WG "proving" it -- for example, by inspiring other mappings. There are a lot of non-normative statements throughout VC WG documents that the WG does not prove.

@decentralgabe
Collaborator

> If it's non-normative, not a single implementation needs to implement the transformation

That's true, and an implementation would not be compliant if it could not demonstrate a mapping to the core data model.

I go back to @OR13's earlier comment

> The mapping is not a consideration for those security tests, but the output of the mapping IS relevant to the core data model JSON-LD tests.

This is the disconnect. Normative mapping outputs, non-normative mappings. So you can have many non-normative mappings and test that their outputs are normatively compliant.
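A sketch of that split, where checkCoreDataModel is a hypothetical stand-in for whatever the normative core data model test suite actually checks:

```ts
// Mappings stay pluggable and non-normative; only their *outputs* are
// held to the normative core data model tests.
type Mapping = (input: unknown) => Record<string, unknown>;

// Hypothetical stand-in for the normative "vc+ld+json" conformance tests.
function checkCoreDataModel(candidate: Record<string, unknown>): boolean {
  const ctx = candidate['@context'];
  return Array.isArray(ctx) && ctx[0] === 'https://www.w3.org/ns/credentials/v2';
}

// Any number of mappings can coexist; each is judged only by its outputs.
function outputsConform(mapping: Mapping, samples: unknown[]): boolean {
  return samples.every((sample) => checkCoreDataModel(mapping(sample)));
}
```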

@jandrieu
Contributor

I'm sorry, if we have no normative statements about how to map, how do we test it?

If it isn't testable -- with at least two implementations that can self-report satisfying those tests -- then I don't understand how we can publish it as a normative standard.

It's fine to say there are other ways to do it, but without an example of sufficient specification maturity, it seems like a feature at risk. The parallel for me is the extension points: we've long held that we need at least one testable example, defined in an appropriate specification, implemented by multiple implementers. Why would that policy not apply to vc-jwt, which MUST map to a vc+ld+json for it to be in scope?

My take on the resolution was that, while it doesn't matter where the mapping is defined, there must be such a definition, and it must be testable.

If all you're saying is that it's possible to map from XYZ to VCs, but there is no testable normative standard that demonstrates that mapping, then IMO, whatever XYZ is, it is not a standard. It's a free-for-all. I can make a mapping from vc-jwt to vc+ld+json that is wrong. How can implementations know that my implementation is wrong if there isn't at least one concrete example of how to do it right?

@dlongley
Contributor

dlongley commented Apr 14, 2023

We have said that serializations defined outside of this WG can create their own mappings to application/vc+ld+json and a mechanism by which to test them to prove to someone that it works. ACDC is an example of this. The WG isn't going to do the work of testing to make sure that's true and announce that it is. But if it works for someone out there, that's great for them.

However, if this WG is going to define a serialization (such as XYZ, or VC-JWT), then we MUST prove it can be transformed into the base media type. We can't sign off on it as a WG otherwise. There are two ways to do this that I'm aware of:

  1. This WG writes the normative transformation rules and the CR process successfully demonstrates two or more independent implementations.
  2. The WG references another standard (of sufficiently equal weight) that defines the transformation rules.

@msporny
Member

msporny commented Apr 14, 2023

@Sakurann wrote:

> There are a lot of non-normative statements throughout VC WG documents that the WG does not prove.

Yes, and those statements are 1) not algorithms, and 2) not necessary for the level of interoperability that the WG has desired.

At present, there are zero algorithms that the WG is defining that are non-normative.

I take @decentralgabe's point that perhaps we could focus on testing the result of the transformation, and I'd buy that argument if, in the WG, we weren't also specifying a non-application/vc+ld+json media type, with a concrete serialization, and then not telling anyone how they should map that data format to/from the base media type. That is, "What should I do with iss? What should I do with iat? kid?"
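(For context, the v1.1-era answer to those questions was a claim-to-property table: iss -> issuer, nbf -> issuanceDate, exp -> expirationDate, jti -> id, sub -> credentialSubject.id. A rough sketch of that style of decode, reconstructed from memory and not proposed v2.0 text:)

```ts
// Rough sketch of a v1.1-style decode; reconstructed from memory.
function numericDateToIso(seconds: number): string {
  return new Date(seconds * 1000).toISOString();
}

function v11StyleDecode(claims: Record<string, unknown>): Record<string, unknown> {
  // v1.1 carried the credential itself under the "vc" claim.
  const vc = { ...(claims.vc as Record<string, unknown> | undefined) };
  if (typeof claims.iss === 'string') vc.issuer = claims.iss;
  if (typeof claims.nbf === 'number') vc.issuanceDate = numericDateToIso(claims.nbf);
  if (typeof claims.exp === 'number') vc.expirationDate = numericDateToIso(claims.exp);
  if (typeof claims.jti === 'string') vc.id = claims.jti;
  if (typeof claims.sub === 'string') {
    vc.credentialSubject = {
      ...(vc.credentialSubject as Record<string, unknown> | undefined),
      id: claims.sub,
    };
  }
  return vc;
}
```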

A provable, normative, bi-directional transformation to/from VC-JWT is something that has existed in v1.0 and v1.1. What's being proposed at present is removing that entirely and replacing it w/ something that no one has to implement. As problematic as the initial VC-JWT normative guidance was, what's being proposed now, IMHO, is measurably worse for interoperability.

@OR13
Contributor Author

OR13 commented Apr 14, 2023

> not necessary for the level of interoperability that the WG has desired.

I don't agree with this assertion; I have made that clear. Please don't assume consensus like this.

> and I'd buy that argument if, in the WG, we weren't also specifying a non-application/vc+ld+json media type, with a concrete serialization, and then not telling anyone how they should map that data format to/from the base media type. That is, "What should I do with iss? What should I do with iat? kid?"

This is a fair point. As I noted previously, this working group is not required to define mappings, and if the consensus is that the mapping is not useful (or is harmful), it should be removed from the spec.

> A provable, normative, bi-directional transformation to/from VC-JWT is something that has existed in v1.0 and v1.1.

Actually, multiple normative mappings that were not interoperable existed in v1 and v1.1.

We are specifically trying to avoid repeating that mistake.

Having implemented both approaches, I will add that both of them are terrible, because they dangerously blend and map JSON-LD and JSON, and violate assumptions in both directions.

> IMHO, is measurably worse for interoperability.

You keep using the word interoperability without being specific, in a spec dedicated to a security format.

JSON Web Tokens have lots of interoperability.

The new forms of JWT that secure vc+ld+json have more interoperability than what we saw in v1 and v1.1. They are interoperable both with vanilla JWS/JWT processors and with vc+ld+json.

The new forms of JWT that secure "application/json" also have good interoperability; it is just not interoperability with "vc+ld+json", without a mapping.

We can confirm interoperability with "vc+ld+json" with a simple JSON Schema.

Should we publish such a JSON Schema as a normative requirement of the core data model, and then any mapping can simply point at it to meet the normative requirements?

We could add the following text to the conformance section of vc-data-model.

"Any JSON Object validated by the following JSON Schema is a valid representation of a W3C Verifiable Credential regardless of any algorithms used to produce it".

No need to define the algorithm, and 100% confidence that any given data element is conformant.
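A minimal sketch of what that could look like, using Ajv; the schema below is a made-up, drastically simplified stand-in for whatever the WG would actually publish:

```ts
import Ajv from 'ajv';

// Made-up, drastically simplified stand-in for a WG-published schema.
const vcLdJsonSchema = {
  type: 'object',
  required: ['@context', 'type', 'issuer', 'credentialSubject'],
  properties: {
    '@context': { type: 'array', minItems: 1 },
    type: { type: 'array', items: { type: 'string' } },
  },
};

const ajv = new Ajv();
const isConformantVc = ajv.compile(vcLdJsonSchema);

// Conformance is judged on the data element alone, regardless of which
// algorithm (or none) produced it.
export function conforms(candidate: unknown): boolean {
  return isConformantVc(candidate) === true;
}
```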

@dlongley
Contributor

@OR13,

> > A provable, normative, bi-directional transformation to/from VC-JWT is something that has existed in v1.0 and v1.1.
>
> Actually, multiple normative mappings that were not interoperable existed in v1 and v1.1.

If by "actually" you mean there were actually two of those things @msporny mentioned, then I agree. There were at least two mappings in the v1.1 spec ("instead of" and "in addition to" as we've referred to them). But the point here is that those actually existed and worked. Having just one (we had two) is better than stepping that back to none. None is clearly worse for interop with the core data model -- which is a requirement to get to consensus.

> The new forms of JWT that secure vc+ld+json have more interoperability than what we saw in v1 and v1.1. They are interoperable both with vanilla JWS/JWT processors and with vc+ld+json.

I think this is great. This particular serialization / securing method sounds like it fits in with our work in this WG quite well.

> The new forms of JWT that secure "application/json" also have good interoperability; it is just not interoperability with "vc+ld+json", without a mapping.

Without a mapping, that interoperability is out of scope for our work here. If there's a mapping, that's great. It means there are multiple ways to use the data interoperably and it all fits together with our work here. If this WG is going to define the serialization you speak of here, then this WG either needs to define the mapping and prove it through the CR process or point to another standard of equal weight that defines that mapping.

Alternatively, that serialization could be defined outside of this WG similar to how ACDC is being defined.

@selfissued
Collaborator

selfissued commented Apr 14, 2023

To be clear, mappings are unnecessary to use the various native VC representations. They are only needed if you want to process a VC using the VC Data Model representation, and in that case, you might want different mappings depending upon properties of the VCDM representation you want. Therefore, it's fine to define an example mapping, but implementers are free to use others in ways that meet their needs.

@brentzundel
Member

What if the mapping provided is both normative, so that it can be tested, and optional, so that implementations of VC-JWT can still be compliant if they choose to use a different mapping?

@msporny
Member

msporny commented Apr 21, 2023

> What if the mapping provided is both normative, so that it can be tested, and optional, so that implementations of VC-JWT can still be compliant if they choose to use a different mapping?

Yes, I think that's all that's being requested... that there exist at least ONE testable normative mapping that folks need to demonstrate conformance to... OR, they can choose to do another one (as long as it's normative and testable through some other venue). IOW, there can be many mappings (btw: many mappings -> bad interop, but we can't stop that), but for the sake of interoperability, you need to be able to prove that you're compliant to at least one of them... and since this group is working on one of them (for VC-JWT), we need to make sure we can demonstrate interop on at least that one mapping.

@OR13
Contributor Author

OR13 commented Jun 27, 2023

I expect this to close at the same time the open PRs on the core data model close.

@OR13
Contributor Author

OR13 commented Jun 27, 2023

I am marking this as pending close, given the conversation has moved elsewhere.

@OR13
Contributor Author

OR13 commented Jul 19, 2023

Marked pending close over 1 week ago, closing.

@OR13 OR13 closed this as completed Jul 19, 2023