
Vocabulary normative, context isn't? #1103

Closed
iherman opened this issue Apr 27, 2023 · 18 comments
Labels: discuss, pending close (Close if no objection within 7 days)

@iherman (Member) commented Apr 27, 2023

I was not present at the special call on 2023-04-25 but, by reading the minutes, I have the impression that one of the controversies comes from the status of the JSON-LD @context file, namely, whether it is normative or not. That file being normative seems to clash with the desire to provide a context file that developers can use without hassle in cases where the JSON-LD behaviour is necessary, ie, if the URDNA canonicalization is used.

So here is a proposal that, though maybe controversial, may make the situation cleaner:

  1. The JSON-LD @context file is not normative. Its goal, among other things, is to make the life of developers easy insofar as all the "usual" terms are there, mapping to the VCDM and the various security vocabulary terms.
  2. The VCDM vocabulary is normative. To be a little bit more precise: the JSON-LD representation of the vocabulary is normative and should be copied (automatically) into the VCDM specification. The rule (which is already the case) is that the vocabulary contains a URL reference for all terms defined by the VCDM, and only those.

That approach may cleanly separate the issues once and for all. The @context file's role is "just" to provide a mapping from the (standard) VCDM terms in JSON to the (standard) Vocabulary URLs, making the Linked Data representation of the VCDM follow the standard. The JSON-LD representation of the vocabulary gives a clean specification for all VCDM terms by providing each term's URL and some of its essential characteristics (type, range, domain). And it does so in a machine-readable way, ensuring that the VCDM also plays by the rules established for Linked Data.

(The same approach should be used for the Security vocabulary, with the particularity that the terms appearing in the security vocabulary may be defined by the Data Integrity spec or one of the cryptosuite specs.)
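To make the proposed division of labor concrete, here is a toy sketch: the context acts purely as a term-to-URL mapping, while the URLs (and their semantics) are owned by the normative vocabulary. This is not the real context file; the terms shown are illustrative picks, and `expand_term` is a deliberately naive stand-in for real JSON-LD expansion.

```python
# Toy illustration: the context only maps JSON terms to vocabulary URLs.
# The URLs themselves are defined by the (normative) VCDM vocabulary.
CONTEXT_FRAGMENT = {
    "credentialSubject": "https://www.w3.org/2018/credentials#credentialSubject",
    "evidence": "https://www.w3.org/2018/credentials#evidence",
}

def expand_term(term: str) -> str:
    """Naive stand-in for JSON-LD expansion: look the term up in the
    context; unknown terms pass through unchanged."""
    return CONTEXT_FRAGMENT.get(term, term)

print(expand_term("evidence"))  # https://www.w3.org/2018/credentials#evidence
```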

@msporny @dlongley @OR13 @decentralgabe

@iherman self-assigned this Apr 27, 2023
@decentralgabe (Contributor):
I like the proposal, @iherman. It makes sense to me and seems like a reasonable compromise.
@msporny had mentioned on the call it's best not to make the context normative because it could be tougher to make changes if bugs are found. Would this be a problem if we make the vocabulary normative?

@iherman (Member, Author) commented Apr 28, 2023

@msporny had mentioned on the call it's best not to make the context normative because it could be tougher to make changes if bugs are found. Would this be a problem if we make the vocabulary normative?

In a way yes, but the situation is different. I presume @msporny was considering changes to the context file that adapt it to new vocabularies popping up, using some extra quirk in the JSON-LD context mechanism, etc. However, none of this affects the vocabulary proper.

Put another way: because the vocabulary lists only the terms and specifications defined in the VCDM spec, any change (except for minor bugs like spelling mistakes, which can be handled easily) means, in fact, a change in the VCDM core spec as well, because the two are strongly linked.

@OR13 (Contributor) commented May 4, 2023

I think the @context value for https://www.w3.org/ns/credentials/v2 and the vocabulary it references, https://www.w3.org/2018/credentials, need to be normative.

The reason is that the base media type application/vc+ld+json only supports compact JSON-LD, and yet the term definitions are not present in this representation, so a consumer is NOT required to understand them:

(Term URLs produced by the context object)

Unless we make these resources normative.

This also helps address confusion over "mappings", that might exploit alternative context values, such as the following:

https://www.w3.org/ns/credentials/v2 ->

{ "@context": { "@vocab": "https://www.w3.org/2018/credentials#" } }

If we say a "verifier" needs to understand these terms, we probably need to make them normative, and explain how verifiers get them.

@iherman (Member, Author) commented May 5, 2023

I think the @context value for https://www.w3.org/ns/credentials/v2 and the vocabulary it references, https://www.w3.org/2018/credentials, need to be normative.

We seem to agree on one aspect, namely that the vocabulary, ie, https://www.w3.org/2018/credentials (or the JSON-LD representation thereof) should be normative.

I am not sure about the context file, though.

The reason is that the base media type application/vc+ld+json only supports compact JSON-LD, and yet the term definitions are not present in this representation, so a consumer is NOT required to understand them:

I understand what you say, but we have to be pragmatic.

What that requirement means is that any @context that is used MUST map the terms of this specification to their https://www.w3.org/2018/credentials equivalents. But that does not mean that all the mappings in a @context MUST map to https://www.w3.org/2018/credentials, ie, it is perfectly acceptable if, beyond the VCDM terms, the context maps other terms to other vocabularies for reasons of usability, pragmatism, and the like (and, I believe, that was the reason for our disagreements in a number of PRs).

If we simply say that https://www.w3.org/ns/credentials/v2 is normative, and that this means that every bit in it is normative, then we are driven into a corner that creates more difficulties than it solves. Among other things, users would have to use at least two context files (one for the VCDM and one for DI terms), which is something we want to avoid.

(I am not sure how to put that into the spec editorially. What we are looking at is to declare some part of https://www.w3.org/ns/credentials/v2 normative and other parts non-normative...)

@iherman (Member, Author) commented May 5, 2023

If we simply say that https://www.w3.org/ns/credentials/v2 is normative, and that this means that every bit in it is normative, then we are driven into a corner that creates more difficulties than it solves. Among other things, users would have to use at least two context files (one for the VCDM and one for DI terms), which is something we want to avoid.

I have just hit a different example in #1074: the current context file includes mappings to schema.org terms. And that is fine, we should not reinvent the wheel. If we followed the approach whereby the context should only include references to terms that are defined in the VCDM, these would have to be taken out, ie, the application would have to include a separate context reference to schema.org...

@melvincarvalho commented May 8, 2023

Agree with @OR13

While the proposal of making the JSON-LD @context file non-normative might seem like a cleaner solution, there are important reasons for making it normative, especially due to the presence of @protected terms.

The @protected keyword in a JSON-LD context ensures that certain terms cannot be overwritten or aliased in JSON-LD documents, and that their meaning remains consistent across all conforming documents. If the JSON-LD @context file is non-normative, there is a possibility of inconsistencies arising from different implementations, which might lead to a loss of meaning, security vulnerabilities, or interoperability issues.

Making the JSON-LD @context file normative ensures that all implementations follow the same set of rules and guidelines, guaranteeing consistent behavior and interpretation of the @protected terms. This way, developers and users can rely on a uniform and stable specification.

Moreover, the JSON-LD context file is essential in providing a mapping from the standard VCDM terms to their corresponding vocabulary URLs, which makes the Linked Data representation of the VCDM consistent with the standard. This mapping is crucial for achieving proper semantic interoperability between different implementations.

In conclusion, the JSON-LD @context file should remain normative to ensure consistency in the use of @protected terms and maintain the semantic interoperability necessary for a robust ecosystem. The VCDM vocabulary, as well as the Security vocabulary, can continue to be normative, offering a comprehensive and reliable set of rules for all relevant terms. This approach will help in creating a consistent and secure environment for developers and users alike.

@TallTed's comment was marked as outdated.

@OR13 (Contributor) commented May 9, 2023

@iherman I think it is ok to make a context file normative that points to random websites on the internet; that is basically what w3id.org is ... and that is already impossible to change in the current context, because the Data Integrity Proof vocabulary is defined by w3id.org, not W3C.

The problem remains that if the context is not normative, you cannot assert that any JSON-LD object will actually use any of the URLs we are reserving... For example, I frequently replace:

https://www.w3.org/ns/credentials/v2

with

{ "@context": { "@vocab": "https://www.w3.org/2018/credentials#" } }

Which is still 100% spec compliant, and saves me the trouble of @protected errors, and other JSON-LD parsing details that fail in different libraries.

If the object behind the URL is not normative, it does not need to be understood, and then it can be whatever people find useful... I find it useful for it to not cause errors, which is why I use the value above instead of the one hosted by W3C.

@iherman (Member, Author) commented May 12, 2023

I have the impression that a single, binary choice on "normative"/"informative" may not be appropriate for a context file. I try, below, to characterize what I feel we should say in the specification regarding the context; maybe that would help us to move forward.

  1. Any conformant JSON(-LD) representation of VCDM MUST contain a @context, whose first entry MUST be https://www.w3.org/ns/credentials/v2 (hereafter "vcdm context")

    Note: This is already in the spec. Btw, the alternative in #1103 (comment) is not spec-compliant per that statement (although it is valid JSON-LD).

  2. The "vcdm context" MUST NOT "define" any mapping, constraint, etc. I.e., it should not include any features regarding VCDM terms beyond what the normative part of the VCDM specification and the VCDM Vocabulary define. In case of discrepancy, i.e., when there is a difference between the "vcdm context" and the VCDM specification, the specification wins.

    Note: while I find the comments of @melvincarvalho regarding @protected compelling, if there are features that put a constraint on a term through the usage of @protected, then this is a bug in the specification. Such constraints MUST be specified in the specification and not in the "vcdm context".

  3. The "vcdm context" MAY include further term mappings (and @vocab setting) that are not part of the VCDM specification. These are done in order to make VCDM applications simpler, and avoid the necessity to add additional generic @context statements to other vocabularies like the Data Integrity, schema.org, DCMI, etc.

  4. The overall content of the "vcdm context" is specified by the Working Group and should be stable for a specific version of the VCDM.

So far I side-stepped whether the vcdm context is indeed "normative" or not. Per (2) and (3) it is not, because it does not, and should not, define any feature. Per (4) it may be, because that (plus (1)) ensures the type of stability that @melvincarvalho asked for (and that I agree with) in #1103 (comment).

My feeling is that the "normative" terminology can be misunderstood, hence my hesitation to declare it normative. Note that the current spec (in §B.1) uses the SHA-256 digest of the "vcdm context" as a security measure for the stability described in (4). One alternative is to turn §B.1 into a normative appendix (it is informative now), ie, we use the SHA digest to normatively "implement" point (4) above without getting into unnecessary discussions about what it means for a context file to be normative.
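The two checks implied by points (1) and (4) can be sketched in a few lines. This is a minimal illustration, not an implementation from any spec: the function names are mine, and the published digest value is assumed to be supplied from §B.1.

```python
import hashlib

VCDM_CONTEXT_URL = "https://www.w3.org/ns/credentials/v2"

def first_context_is_vcdm(credential: dict) -> bool:
    """Point (1): the first @context entry MUST be the vcdm context URL."""
    ctx = credential.get("@context")
    if isinstance(ctx, str):
        return ctx == VCDM_CONTEXT_URL
    return isinstance(ctx, list) and bool(ctx) and ctx[0] == VCDM_CONTEXT_URL

def cached_context_is_stable(context_bytes: bytes, published_digest: str) -> bool:
    """Point (4): a locally cached copy of the context file still matches
    the SHA-256 digest published in the spec (value assumed supplied)."""
    return hashlib.sha256(context_bytes).hexdigest() == published_digest
```

An implementer would run `first_context_is_vcdm` on incoming credentials and `cached_context_is_stable` once when caching the context, never re-fetching it afterwards.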

@melvincarvalho:
I appreciate the nuances you've highlighted, especially regarding the @protected terms, experimental @vocab, and referenced vocabs. They indeed present normative implications that require thoughtful consideration. Reflecting on our experience in the Social Web group, where we faced persistent developer confusion around context, vocab, and spec text, it's clear that these issues can entail years of ongoing clarification. Given this, I find myself aligning with @OR13's suggestion to establish stricter guidelines, which might aid in ensuring semantic interoperability and reduce long-term confusion.

@iherman (Member, Author) commented Jun 13, 2023

The issue was discussed in a meeting on 2023-06-13

List of resolutions:

  1. The v2 context URL will remain normative (https://www.w3.org/ns/credentials/v2), its value will be made normative through the use of a hash digest.
  2. Add issue markers saying that the value of the hash digest for the v2 context may change before PR and that's expected.

Transcript:

1. Vocabulary normative, context isn't? (issue vc-data-model#1103)

See github issue vc-data-model#1103.

Ivan Herman: introduces topic - vocab vs context, starting with vocab being normative or not.
… in the case of linked data and the ld point of view, urls and terms must be assigned, in addition relationships, etc.
… the current vcdm document describes the terms and semantics, and the vocab describes these as well as additional.

Orie Steele: +1 ivan, vocabulary needs to be normative, if implementers are required to understand it.

Orie Steele: if implementers don't understand it, they don't use those terms, then they don't get interop... this proves the document needs to be normative.

Ivan Herman: what i think is that the vcdm is obviously normative, and the vocab should also be normative, though in practice that is not always the case.
… the html generated from the vocab need not be normative.
… the same principle should be used for the security vocab.

Orie Steele: +1 ivan, we need to discuss vocabulary for all TR track items.

Ivan Herman: with the small diff that some terms in the security vocab may be defined in another spec, but it should still be normative.
… the other question is the context - this is more complicated, see my current standpoint on that:.

Ivan Herman: #1103 (comment).

Ivan Herman: from a purely theoretical point of view the context is just a transformation tool and does not define anything other than a mapping between urls and terms.
… continue to hold the opinion that it should not define anything.
… and that anything in the context should be present in the vcdm or the vocab.

Orie Steele: context has normative statements associated... https://w3c.github.io/vc-data-model/#contexts ... meaning the context is what connects "compact json-ld" to the normative vocabulary.

Ivan Herman: the context contains mapping between definitions of terms defined both in the wg and on the web at large in other well known vocabs.
… there is discussion on the other hand that the context should be normative.

Orie Steele: +1 to making context normative, especially given the trend to include status list and other normative term definitions in it.

Ivan Herman: the ld world does not require the context, but it is helpful on the pure json level.
… there is a statement in the vcdm that points to the context normatively.

Orie Steele: +1 ivan, there are normative statements associated with @context, which means we are assuming that the underlying value does not change... but we know that is possible, unless we make the value normative.

Ivan Herman: the spec might actually point in the informative section.
… but does refer to the url and the hash of the context so that it is clear which version is included.
… i would be happy if that statement were normative.
… bc it makes it stronger / more clear as to use of the context and which context.

Orie Steele: iirc, the hash part is currently not normative, but I agree with the comment that it might make things clearer... it seems like making the value normative is a more direct solution though.

Dave Longley: +1 to make a JCS-canonized hash value of the context normative and allow for changes to the context during CR to address concerns around minor tweaks that may be needed to be responsive to implementations.

Manu Sporny: the general question for the group is that if we make either or both normative, what changes on the implementation side.
… the concern is that the stuff in the context might change, and we don't give directions around that.

Dave Longley: https://w3c.github.io/vc-data-model/#base-context.

Dave Longley: https://w3c.github.io/vc-data-model/#contexts.

Manu Sporny: one option is to lock everything in with a normative statement and a hash.

Dave Longley: +1 to manu.

Manu Sporny: as far as vocab being normative we are not sure what will change there, and a lot of tests to validate that.

Orie Steele: sounds like making a hash normative is just a shortcut for writing more tests.

Manu Sporny: want clarity on what is normative - the static representation, the tests, etc.
… not aware of other working groups stating this in the way that we are discussing.

Orie Steele: +1 to issue markers.

Dave Longley: +1 to manu and issue markers.

Manu Sporny: there is a chance that if we need any changes, we will need to note that things will break.
… during cr.

Michael Prorock: https://github.com/w3c/vc-data-model/pull/1140/files.

Michael Prorock: I see the multiple sides to this issue. I wanted to highlight something. I opened a PR on how to hash a context, expanded to something it wasn't intended to. If we are going to define how this is done, we should take this into account.
… Do we make this normative or not, if we are insistent on well-formed JSON-LD data model, if we can, we should try to make assets that go along with that as close to normative as possible, if we can make it normative (even in a simpler way), aspects from normative vocab, that might be a good path.
… I do want to understand what the implementation concerns are... this is the 2.0 WG, I don't mind if we have to change implementations to match up with what WG decides.

Orie Steele: want to comment on how impl might change.
… in did core we had lots of context changes - typically via an at-vocab.
… but certainly if there is not a set context, and hash, etc then we should expect manipulation or changes.
… addition of terms by the developer can be a feature or a bug depending on your perspective.
… hash seems worthy of exploring.

Dave Longley: +1 to the hash approach -- and we should consider whether we think using JCS on the content prior to hashing is necessary or helpful.

Manu Sporny: want to agree with a focus on hashing and statement as to url and hash.

Orie Steele: making the hash normative, makes w3id.org and schema.org term definitions normative, by proxy.

Orie Steele: and for the record the hash being normative does not do anything regarding the URLs and their governance model, that are contained in it.

Manu Sporny: think that this makes things easier, and lets us test stuff cleanly, while also preventing dns poisoning, domain takeover, etc.
… could add a statement that first context must be a link with such and such hash.

Dave Longley: +1 to explicitly tell implementers they should not load the context from the Web once they have their own copy as it will not change.

Manu Sporny: can do the same with vocab.
… not sure that that would change things for implementation.
… believe that that would address concerns there - are there other concerns.

Orie Steele: schema.org, w3c and w3id.org can still be bought / sold / transferred or compromised, regardless of if the hash does not change.

Dave Longley: something we may want to consider is jcs prior to hash.
… this would prevent possible issues with whitespace, etc.
… otherwise be careful that files don't change.
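The JCS idea floated here can be approximated in a few lines. This is only a rough sketch: real JCS (RFC 8785) also pins down number and string serialization, which `json.dumps` does not fully match.

```python
import hashlib
import json

def canonical_hash(doc: dict) -> str:
    """Hash a JSON document after a rough JCS-style canonicalization:
    sorted keys, no insignificant whitespace."""
    canonical = json.dumps(doc, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Two byte-wise different but semantically identical files hash the same:
a = json.loads('{"@protected": true, "@version": 1.1}')
b = json.loads('{\n  "@version": 1.1,\n  "@protected": true\n}')
assert canonical_hash(a) == canonical_hash(b)
```

This is exactly the trade-off discussed: canonicalizing before hashing tolerates whitespace and key-order edits, while hashing raw bytes (Orie's preference below) treats any byte change as a breaking change.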

Orie Steele: -1 to JCS.

Orie Steele: just publish the document at a w3c origin, and publish its hash along side it.

Orie Steele: no need for JCS.

Dave Longley: Orie_: i'm happy if that's true for all time :).

Orie Steele: if we don't trust W3C to not tamper with documents, we should not use them to publish standards.

Brent Zundel: normative approach to provide a hash and link to the context.

Michael Prorock: I'm happy to let Ivan go first.

Dave Longley: Orie_: W3C changes documents all the time in non-normative ways.

Ivan Herman: fine with context and hash - keep to opinion that vocab should be normative.
… the level of tests would not be burdensome if we made vocab normative.
… let alone that the way things would be set on the vocab would point back to normative spec.
… changing vocab would require wg consensus.
… cannot just fiddle at will.
… which is not necessarily the case for the context.

Michael Prorock: Appreciate Ivan adding clarification to have vocab point back to core data model spec, helpful in general, good exploitation of LD in general.
… I know JCS was brought up, DNS poisoning, domain take overs, we should try to keep things inline w/ subresource integrity... base64url vs base64, dealing w/ exchange over the wire, pros/cons to both... something to be aware of, actual hash representation should be aligned w/ SRI specification at W3C.
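The SRI-style hash representation Michael mentions would look roughly like this (a sketch only; per the SRI spec the digest is encoded with standard base64, not base64url):

```python
import base64
import hashlib

def sri_integrity(resource_bytes: bytes) -> str:
    """Format a SHA-256 digest the way Subresource Integrity does:
    'sha256-' followed by standard base64 (not base64url) of the raw digest."""
    digest = hashlib.sha256(resource_bytes).digest()
    return "sha256-" + base64.b64encode(digest).decode("ascii")
```

Multiple hashes, as SRI allows, would simply be several such tokens separated by whitespace, with the strongest supported algorithm winning.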

Orie Steele: dlongley we are assuming the W3C will not break context documents by changing them.

Dave Longley: Orie_: yes, we are (if we don't do JCS).

Orie Steele: No we are not.

Dave Longley: Orie_: No, I'm agreeing with you. "We are assuming...".

Orie Steele: Yes, assuming W3C doesn't mutate published standards is a given i think... if they mutate context values, we can't use them to serve them.

Manu Sporny: concerned re certain items in vocab that might become normative statements like range of domain and similar.
… pointing definitions to vocab are a good idea.
… putting a hash on it gives a concrete item to go check in a simple manner.
… not sure if it is important for impl to go check above and beyond.

Ivan Herman: i think the way to keep things together is that range etc fall outside scope of group.

Orie Steele: +1 ivan.

Ivan Herman: if there are statements in the vocab that fall outside the vcdm then there is an issue since they are not normatively defined by the vcdm.

Michael Prorock: +1 ivan.

Orie Steele: huge +1 to that point.

Orie Steele: +1 to testing normative requirements.

Manu Sporny: concern not around discrepancies, concerns around stuff we are not testing today.

Ivan Herman: if they are in vcdm we should test them.

Manu Sporny: notes that we will have to add tests for coverage, especially data types, ranges, etc.
… not sure how we test that.
… concerned that each impl may have to generate nquads.

Dmitri Zagidulin: +1 to what manu is saying, I don't see how we can test the vocab...

Orie Steele: if we say compact JSON-LD and we don't test that... you can get different nquads... from different implementations.

Orie Steele: +1 ivan, normative statements need to be tested... how they are tested is a different topic.

Ivan Herman: the vcdm does state that there are constraints on values normatively - question is do we test or not - nquads are irrelevant - we can test a multitude of ways.

Michael Prorock: +1 ivan.

Orie Steele: we did resolve something we didn't have last time: the base media type is compact json-ld, which means that unless there are additional normative requirements we don't have to test to the level being suggested.
… completely agree with ivan.
… the normative statements in the current doc are in conflict with how we can test things.
… the hash approach might be a workaround for this, but we can't mix normative statements pointing to urls that are not covered under the core data model.
… we need to solve these normative issues one way or the other.

Dmitri Zagidulin: want to push back that vocab is primary normative artifact.
… json-ld means that the vocab may not need to exist, bad practice of course, but it can work.
… we should define these terms somewhere.
… but if we don't define a vocab things don't break.

Orie Steele: if nothing breaks, the context is normative... and its integrity is normative.

Manu Sporny: +1 to dmitriz.
… one thing that we could do is a data integrity transform and see if sigs match.
… all of this goes back to the context though, and that feels like something we should definitely test, and we should make sure that the context is integrity protected.
… the vocab is for machine reasoning and humans.
… feels like context and hash are an important thing.
… not sure how to test if we make vocab normative.

Dmitri Zagidulin: @Orie - and I agree. I think the context is normative.

Orie Steele: you can test to see if a file's bytes have changed without converting the object to nquads.

Orie Steele: and you can't use data integrity proofs to secure json-ld contexts.

Orie Steele: for all terms in the vocabulary, there must be a human readable definition for the term, the term may be defined by w3id.org, w3c, or schema.org.

Michael Prorock: -1 to add stuff from elsewhere vs. just checking hashes. Have the resource and then the hash to the resource. You do want to detect those changes.
… As for testing vocab, agree with Manu -- ensuring context is normative is important, perhaps hash to context and hashes to other versions that come from that, like schema.org.

Orie Steele: -1 to bundling external contexts by reference... makes development harder.

Michael Prorock: Regarding testing on vocab, there might be approaches there.

Orie Steele: and makes integrity checking harder.

Michael Prorock: If we say something has certain parameters, value of certain shape, we have to test for it. Does it mean more work, it's something we have to do.

Ivan Herman: good to assign a hash < i think, garbled >.
… if the consensus of the WG is that the vocabulary is referred to and is 'secured' via a hash, I will not lie down on the road on that.

Orie Steele: +1 to ivan, if the working group is publishing a TR for a JSON-LD media type, we should do a proper job.

Brent Zundel: concerns around testing - what i have heard is no opposition to a link to vocab / context and hash for each and normative statements that those match.
… sounds like folks are fine with that.
… going beyond that, concerns appear to be how would we test.
… my concern is what would the normative statements be - are those in vocab itself, etc.

Orie Steele: think i heard: there are normative statements that should be testable.

Brent Zundel: that is what I understand.

Orie Steele: think that i also heard that there should be a normative statement that includes the hash.
… question on the vocab side is interesting, since we need to make sure that the statements are testable.
… some suggested language might look like an assertion that terms defined in TR appear in vocab as well, and that some terms may be defined externally.
… if we dont make vocab normative, but make context normative, we make vocab normative by reference.

Michael Prorock: +1 to norm by ref, so lets make it good.

Michael Prorock: Possible language for a first proposal would be that we would normatively define URL for both context and vocab and provide hash that must be included.
… I am also happy to pull #1140, core context hash in there, add additions to there or add separately, that's the key thing.

Manu Sporny: -1 to tying this to 1140.

Michael Prorock: We should follow model set by subresource integrity and use that mechanism if multiple hashes are provided.

Brent Zundel: is there anyone that wants changes or alternates to that proposal.

Proposed resolution: The v2 context URL will remain normative (https://www.w3.org/ns/credentials/v2), its value will be made normative through the use of a hash digest. (Brent Zundel)

Brent Zundel: +1.

Dave Longley: +1.

Michael Prorock: +1.

Andres Uribe: +1.

Orie Steele: +1.

Dmitri Zagidulin: +1.

Shigeya Suzuki: +1.

Joe Andrieu: +1.

Gabe Cohen: +1.

Ted Thibodeau Jr.: +1.

Manu Sporny: +1 (and we should add issue markers saying that the value might change before PR and that's expected).

David Chadwick: +1.

Will Abramson: +1.

Ivan Herman: +1.

Ivan Herman: +1 to manu.

Ted Thibodeau Jr.: +1 w/issue markers.

Resolution #1: The v2 context URL will remain normative (https://www.w3.org/ns/credentials/v2), its value will be made normative through the use of a hash digest.

Shigeya Suzuki: would versioning change hash - major or minor change?

Proposed resolution: Add issue markers saying that the value of the hash digest for the v2 context may change before PR and that's expected. (Brent Zundel)

Michael Prorock: +1.

Dave Longley: +1.

Ivan Herman: +1.

Manu Sporny: +1.

Orie Steele: +1.

David Chadwick: +1.

Andres Uribe: +1.

Shigeya Suzuki: +1.

Ted Thibodeau Jr.: +1.

Will Abramson: +1.

Joe Andrieu: +1.

Brent Zundel: +1.

Resolution #2: Add issue markers saying that the value of the hash digest for the v2 context may change before PR and that's expected.

Brent Zundel: not seeing or hearing objections.
… thanks everyone for being awesome.


@iherman (Member, Author) commented Jun 14, 2023

The question arose yesterday (I think from @msporny) on whether there are any W3C Recs that followed the model I proposed. The answer is yes, the Web Annotation specification. This spec consists of three different Recommendation documents:

  • The Web Annotation Data Model. The structure is very familiar: the model is in JSON-LD but, in this document, it is "disguised" behind a single JSON-LD context reference (look at all the examples). The document includes a higher level "principle" of the model and then specifies the behavior of each JSON term. The testing principles are part of the document as explicit exit criteria.
  • The Web Annotation Vocabulary is the formal specification of the (RDF) vocabulary. Much more detailed than the document we have (and that I have in mind; indeed, it is probably too late to produce such a document). There are some notable points for us in the document.
  • The Web Annotation Protocol is about the exchange of annotations with an annotation server; I only list it here for the sake of completeness, it is probably not relevant for us.

On a personal level (having been in that WG) I do not remember any major discussions about the normativeness of the @context, nor about its usage in the JSON representation of annotations. And yes, we survived the CR phase; see the implementation results for the model and the vocabulary.

I hope this helps.

@iherman (Member, Author) commented Jun 14, 2023

Another question came up on our call yesterday: what do we gain by making the vocabulary normative. My answer is: clarity (and we should not underestimate the importance of that).

The question that needs an answer is: where is it normatively defined that the official URL of the evidence property (as defined in the VCDM spec) is https://w3.org/2018/credentials#evidence?

I have already argued that the @context is not the right place for this. The role of the context is different. If we relied on @context (in this case https://www.w3.org/ns/credentials/v2) as the source of that normative information, then we hit a bunch of additional questions like: what is the status of the mapping of nonce to https://w3id.org/security#nonce vis-à-vis the security specification? How do we account for mappings like name to https://schema.org/name? In essence: which part of the @context is normative and which one is not? Etc.

(This confusion led to proposals to prune the @context file to include VCDM statements only. Which is good for theoretical purity but, because users would have to include a bunch of context files to use other terms like name, would be atrocious for app developers.)

If we do not use @context, then the only place for answering that question is the vocabulary. After all, the very role of a vocabulary is to specify the URL that identifies a term… Hence, my startup proposal in this issue: the vocabulary should be normative.

Note that I do not propose to make the vocabulary description in HTML a normative document (which would follow the model of the aforementioned Web Annotation specification). What I propose is that we include the JSON-LD representation of the vocabulary into a normative appendix of the VCDM spec. (We can also "prune" that normative part not to include the "annotation" style entries, like rdfs:label and/or rdfs:comment. Those are editorial/tooling details.)
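To illustrate what one "pruned" entry of such a normative appendix might look like, here is a hypothetical sketch. The term URL matches the one discussed above; the domain and range values shown are my illustrative assumptions, not copied from the published vocabulary.

```python
# Hypothetical sketch of a single pruned vocabulary entry as it might appear
# in a normative appendix: the term's URL plus essential characteristics,
# with the rdfs:label / rdfs:comment "annotation" entries removed.
# The domain/range values below are illustrative assumptions.
evidence_entry = {
    "@id": "https://www.w3.org/2018/credentials#evidence",
    "@type": "rdf:Property",
    "rdfs:domain": "cred:VerifiableCredential",
    "rdfs:range": "cred:CredentialEvidence",
}
```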

@melvincarvalho:
What I propose is that we include the JSON-LD representation of the vocabulary into a normative appendix of the VCDM spec

@iherman why not put it in a script tag?

@iherman (Member, Author) commented Jun 16, 2023

What I propose is that we include the JSON-LD representation of the vocabulary into a normative appendix of the VCDM spec

@iherman why not put it in a script tag?

That would be invisible to the reader. Besides, the W3C publishing rules, more specifically the statement of whether some parts are normative or not, do not apply to the content of a script tag.

@brentzundel (Member):
since both #1158 and #1159 have been merged, I believe this issue can be closed.

@msporny (Member) commented Jun 30, 2023

+1 to close, I believe #1158 and #1159 addressed this issue.

@iherman added the "pending close" (Close if no objection within 7 days) label Jul 14, 2023
@brentzundel (Member):
This has been addressed, closing
