
# Implementation Report

This directory contains the WoT Architecture 1.1 Implementation Report and its sources. The report itself is `testing/report11.html`. Do not edit this file: it is autogenerated by a script, and any edits will be overwritten the next time it is updated.

Each commit here is synced to the master branch, which exposes the content at http://w3c.github.io/wot-architecture/testing/report11.html.

## Adding Your Results

If you want to update your input for a Member organization, please go to the wot-testing repo and follow the instructions there. Testing results (CSV files) should then be placed in `inputs/results`, testimonials in `inputs/testimonials`, and implementation descriptions in `inputs/implementations`.

## At-Risk Highlighting

If you need to update the at-risk highlighting in the TD specification, you may also need to run `npm run render`. If new assertions have been added to the TD specification, run `npm run render` BEFORE generating the report, then run it again afterwards.
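As a sketch of that ordering (only `npm run render` is named here; the report-generation step stands in for whatever script the repository provides for it):

```
# When new assertions have been added to the TD specification:
npm run render    # 1. update at-risk highlighting first
# 2. generate the report (using the repository's report-generation script)
npm run render    # 3. run render again afterwards
```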

## Suppressing Assertions

The file `suppressed.csv` lists assertions whose test results should be ignored. Suppressed results are also excluded from "roll-up" results, which are computed when child assertions (those with an underscore in their names) are used to break an assertion with multiple options down into simpler, separately testable assertions.
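As an illustrative sketch only (not the report generator's actual code), assuming a parent assertion rolls up as "pass" when every non-suppressed child passes:

```python
def roll_up(parent_id, results, suppressed):
    """Compute a roll-up status for a parent assertion.

    Child assertions are those whose names extend the parent's with an
    underscore (e.g. "td-foo_a" is a child of "td-foo").  Results for
    suppressed assertions are ignored.  The aggregation rule used here
    (all remaining children must pass) is an assumption for illustration.
    """
    child_statuses = [
        status
        for assertion, status in results.items()
        if assertion.startswith(parent_id + "_") and assertion not in suppressed
    ]
    if not child_statuses:
        return None  # nothing left to roll up
    return "pass" if all(s == "pass" for s in child_statuses) else "fail"

results = {"td-foo_a": "pass", "td-foo_b": "fail", "td-foo_c": "pass"}
print(roll_up("td-foo", results, suppressed={"td-foo_b"}))  # -> pass
```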

## Manual Assertions

The file `manual.csv` identifies assertions that need manual testing or declarations, i.e., those for which no automated test is available.

## At-Risk Styling

The file `atrisk.css` provides highlighting for at-risk elements in the report. Do NOT edit it directly, as it is autogenerated. Instead, edit `inputs/atrisk.csv` as necessary.

## Report Template

The HTML template for the Implementation Report is in `inputs/templates/report.html`. If you wish to edit the main explanatory text of the report or update the metadata (date, authors, etc.), do so here.

## Testimonials

To add a new testimonial, use the template in `inputs/templates/testimonial.html` and put a new file in `inputs/testimonials`.

## Implementation Descriptions

Implementations are described in HTML files in `inputs/implementations`, based on the template in `inputs/templates/`. The IDs declared in these descriptions should be unique and descriptive, as they are used elsewhere, for example in the interoperability data files that refer to those implementations.

Implementations should also be entered into the table in `inputs/impl.csv`, with identifiers, titles, and roles consistent with those assigned in the above HTML files.
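Purely for illustration (the actual column names are defined by the existing `inputs/impl.csv`, and this identifier is hypothetical), a row might look like:

```csv
id,title,roles
acme-node,ACME Node Implementation,consumer producer
```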

## Test Specifications

A procedure for testing particular assertions may be given in `inputs/testspec.html`. These may be included as an appendix in the report but are currently suppressed.

Test specifications can be given both for assertions in the specification (see `inputs/template.csv` for an automatically generated list of identifiers) and for any extra assertions in `inputs/extra-asserts.html`.

## Extra Assertions

Assertions used for testing but not (yet) included in the specification may be listed in `inputs/extra-asserts.html`. The intention is that these assertions will eventually, before the final release, be inserted into the specification itself; in the final release this file should be empty.

## At-Risk Assertions

Assertions related to features that are at risk of being deleted from the final CR should be identified in `inputs/atrisk.csv`. The assertion text for these will be given a special color in the report table.

## Categories

Assertions can be assigned to a category in `inputs/categories.csv`.

## Dependencies

Dependencies between assertions can be recorded in `inputs/depends.csv`. The "Parents" column relates detailed assertions to more general assertions. The "Contexts" column indicates assertions that only need to be considered in a particular context (either syntactic, if it points at another vocabulary item, or logical, if it points at another optional assertion). Both "Parents" and "Contexts" may contain multiple items separated by spaces. Entries should be the IDs of other assertions; use "null" if there is no dependency.
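A hedged sketch of how such a field could be interpreted (the column names follow the description above; the parsing code itself is illustrative, not the report generator's):

```python
def parse_dependency_field(value):
    """Split a "Parents" or "Contexts" field into a list of assertion IDs.

    Multiple items are separated by spaces; the literal "null" means
    there is no dependency.
    """
    value = value.strip()
    if not value or value == "null":
        return []
    return value.split()

row = {"ID": "td-foo_bar", "Parents": "td-foo", "Contexts": "null"}
print(parse_dependency_field(row["Parents"]))   # -> ['td-foo']
print(parse_dependency_field(row["Contexts"]))  # -> []
```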

## Result Data

Each implementation should record which features it has implemented and tested under the `inputs/results` directory. All data will be read and merged into the report. Mark each implemented feature with a status of either "pass" (it satisfies the specification) or "fail" (it does not).
If you have not implemented an optional feature, list its status as "not-impl".

Features that are not listed, or whose status is "null", are not included in the sums. This is distinct from "not-impl": omitting a feature is intended to allow different features to be reported in different files.

If you deliberately did not implement a feature, please indicate this explicitly (with "not-impl") rather than omitting it.

Any other status will be ignored (e.g., "null" as used in the template). If you have tested a feature in multiple implementations, check in one file per implementation, using as the filename the ID given in the template for that implementation's description. The filename should also be used as an ID in the description of each implementation.

The `template.csv` file lists all features, each with a "null" status. Do not edit this file; it is autogenerated. It is provided as a reference and as a basis for your own data files.

Files should be in CSV format, including headers as defined in `template.csv`, and will be parsed with the csvtojson Node.js library.
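The pipeline itself uses csvtojson in Node.js; purely as an illustration of the status semantics described above (with hypothetical "ID" and "Status" headers, since the real headers are defined by `template.csv`), a results file could be tallied like this:

```python
import csv
import io

def tally_results(csv_text):
    """Count "pass", "fail", and "not-impl" statuses in a results CSV.

    Rows with a "null" status (or any other unknown status) are ignored,
    matching the behavior described above.  The "ID"/"Status" headers
    are assumptions for illustration; the real headers come from
    template.csv.
    """
    counts = {"pass": 0, "fail": 0, "not-impl": 0}
    for row in csv.DictReader(io.StringIO(csv_text)):
        status = row["Status"].strip()
        if status in counts:
            counts[status] += 1
    return counts

sample = """ID,Status
td-foo,pass
td-bar,fail
td-baz,not-impl
td-qux,null
"""
print(tally_results(sample))  # -> {'pass': 1, 'fail': 1, 'not-impl': 1}
```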