tests: Use pytest instead of custom test framework #3444

Merged · 1 commit · Mar 9, 2023
36 changes: 32 additions & 4 deletions .github/workflows/integration_tests.yml

@@ -23,8 +23,34 @@ jobs:
         uses: actions/setup-python@v4
         with:
           python-version: "3.9"
-          cache: pip
-      - run: pip install geojson-pydantic requests
+      - name: Install Poetry
+        uses: snok/install-poetry@v1
+        with:
+          virtualenvs-in-project: true
+      - name: Load cached venv
+        uses: actions/cache@v3
+        id: cached-poetry-integration-tests-dependencies
+        with:
+          path: tests/.venv
+          key: cached-poetry-integration-tests-dependencies-${{ hashFiles('tests/poetry.lock') }}
+      - name: Install dependencies
+        if: steps.cached-poetry-integration-tests-dependencies.outputs.cache-hit != 'true'
+        run: |
+          cd tests
+          poetry install --only dev
+
+      - name: Flake8
+        run: |
+          cd tests
+          poetry run pflake8 --config ./pyproject.toml
+      - name: Black
+        run: |
+          cd tests
+          poetry run black . --check
+      - name: Isort
+        run: |
+          cd tests
+          poetry run isort . --check
 
       - name: Setup Node
         uses: actions/setup-node@v3
@@ -57,8 +83,10 @@ jobs:
           DOCKER_BUILDKIT: 1
           COMPOSE_DOCKER_CLI_BUILD: 1
 
-      - name: Run tests
-        run: "python3 tests/run_integration_tests.py"
+      - name: Run pytest
+        run: |
+          cd tests
+          poetry run pytest
 
       - name: Run Playwright tests
         run: yarn --cwd front playwright test
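The `actions/cache` step above keys the cached venv on `hashFiles('tests/poetry.lock')`, so the cache is invalidated whenever the locked dependencies change. The keying idea can be sketched in Python, approximating `hashFiles` with `hashlib` (the key prefix is taken from the workflow; everything else is illustrative):

```python
import hashlib

def cache_key(prefix: str, lockfile_contents: bytes) -> str:
    # Approximates GitHub Actions' hashFiles(): the same lockfile always
    # yields the same key, any change yields a new key and a fresh cache.
    digest = hashlib.sha256(lockfile_contents).hexdigest()
    return f"{prefix}-{digest}"

key = cache_key("cached-poetry-integration-tests-dependencies", b"[[package]] ...")
```

On a key hit the workflow skips `poetry install` entirely (the `cache-hit` guard on the install step), which is what makes the lint and test jobs fast on unchanged lockfiles.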
8 changes: 6 additions & 2 deletions python/api/osrd_infra/management/commands/setup_dummy_db.py

@@ -2,6 +2,7 @@
 from pathlib import Path
 
 from django.core.management.base import BaseCommand
+from django.db import transaction
 from geojson_pydantic import LineString, Point
 from osrd_schemas.infra import (
     ApplicableDirections,
@@ -11,7 +12,7 @@
     TrackSection,
 )
 
-from osrd_infra.models import Infra, RollingStock
+from osrd_infra.models import Infra, RollingStock, RouteModel
 
 
 class Command(BaseCommand):
@@ -69,5 +70,8 @@ def handle(self, *args, **options):
             release_detectors=[],
             switches_directions={},
        )
-        route.into_model(infra).save()
+        with transaction.atomic():
+            # emulate update_or_create: drop any existing route, then insert the new one
+            RouteModel.objects.filter(infra=infra).delete()
+            route.into_model(infra).save()
         print(infra.id)
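The `transaction.atomic()` block replaces the plain `save()` with a delete-then-insert, so re-running `setup_dummy_db` does not accumulate duplicate routes. The same pattern can be sketched with the stdlib `sqlite3` module (table name and columns are illustrative, not the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE route (infra INTEGER, data TEXT)")
conn.execute("INSERT INTO route VALUES (1, 'old')")

def upsert_route(conn: sqlite3.Connection, infra_id: int, data: str) -> None:
    # Delete-then-insert inside one transaction, mirroring transaction.atomic():
    # either both statements take effect or neither does.
    with conn:  # commits on success, rolls back on exception
        conn.execute("DELETE FROM route WHERE infra = ?", (infra_id,))
        conn.execute("INSERT INTO route VALUES (?, ?)", (infra_id, data))

upsert_route(conn, 1, "new")
rows = conn.execute("SELECT data FROM route WHERE infra = 1").fetchall()
# rows == [('new',)]
```

Wrapping both statements in one transaction matters: without it, a crash between the delete and the insert would leave the infra with no route at all.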
13 changes: 9 additions & 4 deletions python/api/osrd_infra/models/rolling_stock.py

@@ -1,3 +1,5 @@
+from __future__ import annotations
+
 from typing import Dict
 
 from django.contrib.postgres.fields import ArrayField
@@ -99,13 +101,16 @@ def to_schema(self):
 
     @staticmethod
     @transaction.atomic
-    def import_railjson(rolling_stock_dict: Dict, force: bool = False):
+    def import_railjson(rolling_stock_dict: Dict, force: bool = False) -> RollingStock:
         # Parse rolling stock payload
         rs_obj: RollingStockSchema = RollingStockSchema.parse_obj(rolling_stock_dict)
 
-        if force:
-            return RollingStock.objects.update_or_create(**rs_obj.dict())
-        return RollingStock.objects.create(**rs_obj.dict())
+        rolling_stock: RollingStock
+        with transaction.atomic():
+            if force:
+                RollingStock.objects.filter(name=rs_obj.name).delete()
+            rolling_stock = RollingStock.objects.create(**rs_obj.dict())
+        return rolling_stock
 
 
 class RollingStockImage(models.Model):
27 changes: 5 additions & 22 deletions tests/README.md

@@ -2,31 +2,14 @@
 
 
 ### Run the tests
-To start the tests, run `python3 run_integration_tests.py` after starting a docker
-(`docker-compose up` at the root of the project).
+To run the tests, run `poetry run pytest` after starting the Docker containers (`docker-compose up` at the root of the project).
 
-To run a list of specific tests, run `python3 run_integration_tests.py test_name_1 test_name_2 ...`.
+To run specific tests, run `poetry run pytest -k "test_name_1 or test_name_2"`.
 
 ### Create new integration tests
 
-To add a test, create a python file in the `tests/` folder with the other tests.
-Inside, create a `run(*args, **kwargs)` function. It will be given some parameters
-in the `kwargs`, for now it contains:
-
-```json
-{
-    "all_scenarios": {
-        "dummy": infra, project, operational study and scenario,
-        "tiny": infra, project, operational study and scenario,
-        "small": infra, project, operational study and scenario
-    },
-    "url": api url
-}
-```
-
-The python file can instead contain a function `list_tests() -> Iterable[Tuple[string, Callable]]`.
-In this case, it will be called to get a list of tests.
-
+To add a test, follow the [pytest documentation](https://docs.pytest.org/).
+Available fixtures are defined in `conftest.py`.
 
 # Fuzzer
 
@@ -37,4 +20,4 @@ Note: you need a docker running locally *with at least one infra imported*.
 It can be a generated infra, or it can be imported from some other DB.
 
 If the test is run on a generated infra, the json containing the error report
-can be copied to `tests/regression_fuzzer_tests/` to integrate it into the test suite.
+can be copied to `tests/regression_tests_data/` to integrate it into the test suite.
87 changes: 87 additions & 0 deletions tests/conftest.py

@@ -0,0 +1,87 @@
+import subprocess
+import sys
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Mapping
+
+import pytest
+import requests
+
+from tests.services import API_URL, EDITOAST_URL
+from tests.utils.timetable import create_op_study, create_project, create_scenario
+
+
+@dataclass(frozen=True)
+class Scenario:
+    project: int
+    op_study: int
+    scenario: int
+    infra: int
+    timetable: int
+
+
+def _load_generated_infra(name: str) -> int:
+    generator = Path(__file__).resolve().parents[1] / "core/examples/generated/generate.py"
+    output = Path("/tmp/osrd-generated-examples")
+    infra = output / f"{name}/infra.json"
+    subprocess.check_call([sys.executable, str(generator), str(output), name])
+    subprocess.check_call(["docker", "cp", str(infra), "osrd-api:/infra.json"])
+    result = subprocess.check_output(
+        [
+            "docker",
+            "exec",
+            "osrd-api",
+            "python",
+            "manage.py",
+            "import_railjson",
+            name,
+            "/infra.json",
+        ],
+    )
+    id = int(result.split()[-1])
+    return id
+
+
+@pytest.fixture(scope="session")
+def scenarios():
+    # Setup project and operational study
+    project_id = create_project(API_URL)
+    op_study_id = create_op_study(API_URL, project_id)
+
+    # Setup dummy infra with scenario
+    result = subprocess.check_output(
+        ["docker", "exec", "osrd-api", "python", "manage.py", "setup_dummy_db"],
+    )
+    infra_id = int(result)
+    scenario_id, timetable_id = create_scenario(API_URL, infra_id, project_id, op_study_id)
+
+    # Setup dummy, small and tiny infra with their scenarios
+    scenarios = {}
+    scenarios["dummy"] = Scenario(project_id, op_study_id, scenario_id, infra_id, timetable_id)
+    for infra in ["small_infra", "tiny_infra"]:
+        infra_id = _load_generated_infra(infra)
+        scenario_id, timetable_id = create_scenario(API_URL, infra_id, project_id, op_study_id)
+        scenarios[infra] = Scenario(project_id, op_study_id, scenario_id, infra_id, timetable_id)
+    yield scenarios
+    project = scenarios["dummy"].project
+    response = requests.delete(API_URL + f"projects/{project}/")
+    for scenario in scenarios.values():
+        infra_id = scenario.infra
+        response = requests.delete(EDITOAST_URL + f"infra/{infra_id}/")
+        if response.status_code // 100 != 2:
+            raise RuntimeError(f"Cleanup failed, code {response.status_code}: {response.content}")
+
+
+@pytest.fixture
+def dummy_scenario(scenarios: Mapping[str, Scenario]) -> Scenario:
+    yield scenarios["dummy"]
+
+
+@pytest.fixture
+def small_scenario(scenarios: Mapping[str, Scenario]) -> Scenario:
+    yield scenarios["small_infra"]
+
+
+@pytest.fixture
+def tiny_infra(scenarios: Mapping[str, Scenario]) -> Scenario:
+    yield scenarios["tiny_infra"]
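The session-scoped `scenarios` fixture above relies on pytest's yield-fixture protocol: everything before `yield` runs once as setup, everything after runs as teardown when the session ends. That control flow can be sketched with a plain generator (no pytest or running services required):

```python
def scenarios_fixture(log: list):
    # Setup: runs before the first test that requests the fixture.
    log.append("setup")
    yield {"dummy": "scenario"}
    # Teardown: runs after the last test of the session, even on failures.
    log.append("teardown")

log = []
gen = scenarios_fixture(log)
value = next(gen)   # pytest advances the generator to obtain the fixture value
# ... tests run with `value` here ...
next(gen, None)     # pytest resumes the generator, triggering teardown
```

This is why the expensive infra imports happen only once per run, while the project and infra deletions at the end are guaranteed to execute after all tests.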