4 changes: 4 additions & 0 deletions .github/workflows/unittests.yml
@@ -10,6 +10,7 @@ jobs:
    strategy:
      matrix:
        python-version: ["3.11", "3.12"]
        pydantic: [true, false]
        os: [ubuntu-latest]
    steps:
      - uses: actions/checkout@v4
@@ -21,6 +22,9 @@ jobs:
        run: |
          python -m pip install --upgrade pip
          pip install tox
      - name: Install pydantic if requested
        if: matrix.pydantic == true
        run: pip install -r dev_requirements/requirements-pydantic.txt
      - name: Run the Unit Tests via Tox
        run: |
          tox -e tests
48 changes: 48 additions & 0 deletions README.md
@@ -79,6 +79,54 @@ assert {awf.pruefidentifikator for awf in ahb.anwendungsfaelle} == {

The complete examples can be found in the [unittests](unittests).

### Usage with Pydantic
By default, fundamend uses the [dataclasses from the Python standard library](https://docs.python.org/3/library/dataclasses.html).
However, it can also be used directly with [Pydantic](https://docs.pydantic.dev/latest/) and the [Pydantic dataclasses](https://docs.pydantic.dev/2.7/concepts/dataclasses/).
If pydantic is either already installed, or is installed alongside via
```bash
pip install fundamend[pydantic]
```
then the data models returned by `AhbReader` and `MigReader` are automatically pydantic objects.
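
Which flavor of dataclass you got can be checked with pydantic's introspection helper `is_pydantic_dataclass`; a minimal, self-contained sketch (the class names here are stand-ins, not fundamend's real models):

```python
import dataclasses

from pydantic.dataclasses import dataclass as pydantic_dataclass
from pydantic.dataclasses import is_pydantic_dataclass


@dataclasses.dataclass
class PlainSegment:  # stand-in for the stdlib-dataclass variant
    id: str


@pydantic_dataclass
class ValidatedSegment:  # stand-in for the pydantic variant
    id: str


# Both are regular dataclasses, but only one is pydantic-aware:
assert dataclasses.is_dataclass(ValidatedSegment)
assert is_pydantic_dataclass(ValidatedSegment)
assert not is_pydantic_dataclass(PlainSegment)
```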

With Pydantic, the results can also easily be exported, e.g. as JSON:
```python
from pathlib import Path

from pydantic import RootModel
from fundamend import Anwendungshandbuch, AhbReader

ahb = AhbReader(Path("UTILTS_AHB_1.1d_Konsultationsfassung_2024_04_02.xml")).read()
ahb_json = RootModel[Anwendungshandbuch](ahb).model_dump(mode="json")
```

The result then looks like this:
```json
{
  "veroeffentlichungsdatum": "2024-04-02",
  "autor": "BDEW",
  "versionsnummer": "1.1d",
  "anwendungsfaelle": [
    {
      "pruefidentifikator": "25001",
      "beschreibung": "Berechnungsformel",
      "kommunikation_von": "NB an MSB / LF",
      "format": "AWF",
      "segments": [
        {
          "id": "UNH",
          "name": "Nachrichten-Kopfsegment",
          "number": "00001",
          "ahb_status": "Muss",
          "data_elements": [
            {
              "id": "D_0062",
              "name": "Nachrichten-Referenznummer",
              "codes": []
            },
```
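
The `RootModel` wrapper also works in the other direction, turning such a JSON dict back into model instances via `model_validate`; a minimal sketch with a hypothetical stand-in dataclass (not the real `Anwendungshandbuch`):

```python
from pydantic import RootModel
from pydantic.dataclasses import dataclass


@dataclass
class DataElement:  # hypothetical stand-in model
    id: str
    name: str


wrapper = RootModel[DataElement]

# Serialize to a JSON-compatible dict ...
dumped = wrapper(DataElement(id="D_0062", name="Nachrichten-Referenznummer")).model_dump(mode="json")
assert dumped == {"id": "D_0062", "name": "Nachrichten-Referenznummer"}

# ... and parse it back into a dataclass instance:
restored = wrapper.model_validate(dumped).root
assert restored.id == "D_0062"
```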

### JSON Schemas
The fundamend data model is also available as JSON Schema: [`json_schemas`](json_schemas).
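
Such schemas can be derived from pydantic models with `model_json_schema()`; a sketch of the mechanism with a hypothetical stand-in model (not one of fundamend's real classes, and not necessarily how the published schemas were generated):

```python
from pydantic import RootModel
from pydantic.dataclasses import dataclass


@dataclass
class Code:  # hypothetical stand-in class
    value: str
    meaning: str


schema = RootModel[Code].model_json_schema()

# The dataclass shows up as a named definition in the generated schema:
assert "Code" in schema["$defs"]
assert schema["$defs"]["Code"]["type"] == "object"
```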

## Usage and Contributing
The code is MIT-licensed and can therefore be used freely.
1 change: 1 addition & 0 deletions dev_requirements/requirements-pydantic.in
@@ -0,0 +1 @@
pydantic
16 changes: 16 additions & 0 deletions dev_requirements/requirements-pydantic.txt
@@ -0,0 +1,16 @@
#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
# pip-compile requirements-pydantic.in
#
annotated-types==0.7.0
# via pydantic
pydantic==2.7.2
# via -r requirements-pydantic.in
pydantic-core==2.18.3
# via pydantic
typing-extensions==4.12.0
# via
# pydantic
# pydantic-core
2 changes: 1 addition & 1 deletion dev_requirements/requirements-type_check.txt
@@ -9,5 +9,5 @@ mypy==1.10.0
# via -r dev_requirements/requirements-type_check.in
mypy-extensions==1.0.0
# via mypy
typing-extensions==4.10.0
typing-extensions==4.12.0
# via mypy