A library and a curated collection of parsers that bridge raw instrument output files and the semantic-schemas knowledge graph pipeline.
```
semantic-transformers/
├── src/semantic_transformers/   # Python library (Transformer, QuickMapper, …)
├── parsers/                     # Machine-specific file parsers
│   └── <domain>/                # Mirrors the semantic-schemas folder structure
│       └── <specialisation>/
│           └── <instrument>/    # One folder per instrument model
│               ├── parser.py    # Language-agnostic parsing logic
│               ├── README.md    # Quick-start, schema compatibility, and known limitations
│               ├── CHANGELOG.md # Schema compatibility history
│               └── <lang>/      # One subfolder per export language (e.g. de/, en/)
│                   └── column_mapping.json  # Maps locale-specific column names to ontology IRIs and units
└── docs/                        # Guides for users and contributors
```
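For orientation, a `column_mapping.json` for a German (`de/`) export might look like the sketch below. The column names, IRIs, and key layout are illustrative assumptions modelled on the QuickMapper mapping format used elsewhere in this README; check an existing parser folder for the authoritative shape.

```python
import json

# Hypothetical column_mapping.json content for a German (de/) export.
# The "iri"/"unit" keys mirror the QuickMapper mapping shown in this README;
# the real files may use a different shape.
column_mapping = {
    "Standardkraft": {  # locale-specific column name
        "iri": "https://w3id.org/pmd/tto/StandardForce",
        "unit": "http://qudt.org/vocab/unit/N",
    },
    "Standardweg": {
        "iri": "https://w3id.org/pmd/tto/Extension",
        "unit": "http://qudt.org/vocab/unit/MilliM",
    },
}

# Emit it as it would appear on disk.
print(json.dumps(column_mapping, indent=2, ensure_ascii=False))
```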
| Class | Role |
|---|---|
| `Parser` | Protocol to implement when adding support for a new instrument |
| `ParseResult` | What every parser returns: simplified JSON + DataFrame |
| `Transformer` | Runs parsing → JSONata transform → RDF graph |
| `TransformResult` | What `Transformer.run()` returns: RDF graph + DataFrame |
| `QuickMapper` | Turns any tabular file into RDF using a simple YAML mapping (no parser needed) |
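To make the `Parser`/`ParseResult` contract concrete, here is a stdlib-only sketch of what implementing a new instrument parser could look like. The method name `parse`, the `ParseResult` fields, and the plain-list stand-in for the DataFrame are all assumptions for illustration, not the library's actual signatures:

```python
from dataclasses import dataclass
from pathlib import Path
from typing import Any, Protocol


@dataclass
class ParseResult:
    """Assumed shape: simplified JSON metadata plus tabular rows."""
    json: dict[str, Any]
    rows: list[dict[str, Any]]  # stand-in for the DataFrame


class Parser(Protocol):
    def parse(self, path: Path) -> ParseResult: ...


class MyInstrumentParser:
    """Hypothetical parser for a fictional semicolon-delimited export."""

    def parse(self, path: Path) -> ParseResult:
        header: list[str] = []
        data: list[dict[str, Any]] = []
        for i, line in enumerate(path.read_text().splitlines()):
            if i == 0:
                header = line.split(";")  # first line holds column names
            else:
                data.append(dict(zip(header, line.split(";"))))
        return ParseResult(json={"source": path.name}, rows=data)
```

A real implementation would return the instrument's metadata block and a pandas DataFrame; the point here is only the shape of the contract.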
Each parser targets a specific instrument model. The folder path mirrors the
`schemas/` tree in semantic-schemas:

| Schema | Instrument | Import path |
|---|---|---|
| `characterization/tensile-test/TTO` | Zwick/Roell (testXpert III) | `semantic_transformers.parsers.characterization.tensile_test.testxpert_iii` |
```bash
# Install the transformers library
pip install semantic-transformers

# Optional extras
pip install "semantic-transformers[excel]"  # Excel file support
pip install "semantic-transformers[dev]"    # development and testing
```

Both repositories are designed to be cloned as siblings under a shared folder:
```bash
mkdir semantic-dataspace && cd semantic-dataspace
git clone https://github.com/Semantic-Dataspace/semantic-schemas
git clone https://github.com/Semantic-Dataspace/semantic-transformers
python3 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -e semantic-transformers/
pip install jupyterlab     # only needed for the interactive notebooks
```

Use a ready-made parser and the matching schema notebook. For a Zwick/Roell tensile test:

```bash
jupyter lab semantic-schemas/schemas/characterization/tensile-test/TTO/docs/2_tensile_test_csv_workflow.ipynb
```

Edit Step 0 (one line, pointing it at your file) and run all cells. Done.
Use QuickMapper. Provide a short mapping (a YAML file or an equivalent Python
dict, as below) that names the columns and points each one at an ontology
class IRI:
```python
from semantic_transformers import QuickMapper

mapping = {
    "label": "my experiment",
    "columns": {
        "Force": {
            "iri": "https://w3id.org/pmd/tto/StandardForce",
            "unit": "http://qudt.org/vocab/unit/N",
        },
        "Extension": {
            "iri": "https://w3id.org/pmd/tto/Extension",
        },
    },
}

result = QuickMapper(mapping).run("my_data.csv")
print(result.graph.serialize(format="turtle"))
print(result.dataframe.head())
```

Supported file formats: CSV, TSV, Excel (.xlsx), Parquet, and JSON. See the QuickMapper notebook for a guided walkthrough.
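For a quick end-to-end try-out, you can generate a tiny `my_data.csv` whose column names match the mapping above; the numeric values here are made up:

```python
import csv

# Column names must match the keys under "columns" in the mapping.
rows = [
    {"Force": 0.0, "Extension": 0.00},
    {"Force": 12.5, "Extension": 0.05},
    {"Force": 25.1, "Extension": 0.10},
]

with open("my_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Force", "Extension"])
    writer.writeheader()
    writer.writerows(rows)
```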
To contribute or run tests locally, see CONTRIBUTING.md for setup and development workflow instructions.
- Getting started: convert your first instrument file
- QuickMapper walkthrough: turn any tabular file into RDF
- Adding a parser: support a new instrument or handle file variants