
Verification and validation #18

Description

@amyheather

Verification

Desk checking

  • Systematically read through and check the code.
  • Keep documentation complete and up-to-date.
  • Maintain an environment with all required packages.
  • Lint code.
  • Get code review.

Debugging

  • Write tests - they'll help with spotting bugs.
  • During model development, monitor the model using logs - these also help with spotting bugs (see the logging sketch after this list).
  • Use GitHub issues to record bugs as they arise, so they aren't forgotten and are recorded for future reference.
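A minimal sketch of run-time logging using Python's standard logging module; the logger name, file name and the run_replication function are illustrative assumptions, not the model's actual interface.

```python
import logging

# Illustrative only: the logger name, file name and run_replication signature
# are assumptions, not taken from the real model code.
logging.basicConfig(
    filename="model_run.log",                      # keep a file for later review
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(message)s",
)
logger = logging.getLogger("model")


def run_replication(run_number, arrival_rate):
    """Run one replication, logging key events so bugs are easier to trace."""
    logger.info("Starting run %d (arrival_rate=%.2f)", run_number, arrival_rate)
    patients_seen = 0  # placeholder for the real simulation logic
    logger.debug("Run %d finished: %d patients seen", run_number, patients_seen)
    return patients_seen
```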

Assertion checking

  • Add checks in the model which raise errors if something doesn't look right (see the sketch after this list).
  • Write tests which check that assertions hold true.
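A sketch covering both points, using hypothetical names (record_result, wait_time, n_servers) rather than the model's real interface:

```python
import pytest


def record_result(wait_time, n_servers):
    """Store a result, asserting first that it is physically plausible."""
    assert wait_time >= 0, f"Wait time should be non-negative, got {wait_time}"
    assert n_servers > 0, "The model should have at least one server"
    return {"wait_time": wait_time, "n_servers": n_servers}


def test_rejects_negative_wait():
    """Check the assertion fires when given an impossible result."""
    with pytest.raises(AssertionError):
        record_result(wait_time=-1, n_servers=2)
```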

Special input testing

  • If there are input variables with explicit limits, design boundary value tests to check the behaviour at, just inside, and just outside each boundary. --> Not applicable
  • Write stress tests which simulate worst-case load and ensure the model is robust under heavy demand.
  • Write tests with little or no activity/waits/service (see the sketch after this list).
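For example, an extreme value test with no arrivals could look like the sketch below; run_model and its return format are hypothetical stand-ins for the real model interface.

```python
import random


def run_model(arrival_rate, run_length, seed=0):
    """Toy stand-in for the real model: only counts exponential arrivals."""
    rng = random.Random(seed)
    time, patients_seen = 0.0, 0
    while arrival_rate > 0:
        time += rng.expovariate(arrival_rate)
        if time > run_length:
            break
        patients_seen += 1
    return {"patients_seen": patients_seen}


def test_no_arrivals():
    """Extreme value test: with no arrivals, nothing should be processed."""
    results = run_model(arrival_rate=0, run_length=100)
    assert results["patients_seen"] == 0
```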

Bottom-up testing

  • Write unit tests for each individual component of the model (see the sketch after this list).
  • Once individual parts work correctly, combine them and test how they interact - this can be done via integration testing or functional testing.
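A unit test for a single component might look like this sketch, which assumes a hypothetical Exponential sampling class rather than the model's actual structure:

```python
import random


class Exponential:
    """Hypothetical component: exponential sampling with its own seeded stream."""

    def __init__(self, mean, seed=42):
        self.mean = mean
        self.rng = random.Random(seed)

    def sample(self):
        return self.rng.expovariate(1 / self.mean)


def test_exponential_mean():
    """Unit test: the sample mean should sit close to the specified mean."""
    dist = Exponential(mean=5)
    samples = [dist.sample() for _ in range(100_000)]
    assert abs(sum(samples) / len(samples) - 5) < 0.1
```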

Regression testing

  • Write tests early.
  • Run tests regularly (locally, or automatically via GitHub Actions) - see the sketch after this list.
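A regression test could compare a fixed-seed run against reference values saved from a previous trusted run, as sketched below; the expected values, result keys and run_model stand-in are all illustrative assumptions.

```python
import pytest

# Illustrative reference values, standing in for results saved from a trusted run.
EXPECTED = {"mean_wait": 1.86, "mean_utilisation": 0.75}


def run_model(arrival_rate, run_length, seed=0):
    """Toy stand-in for the real model entry point."""
    return {"mean_wait": 1.86, "mean_utilisation": 0.75}


def test_results_unchanged():
    """Outputs from a fixed-seed run should match the saved reference values."""
    current = run_model(arrival_rate=5, run_length=100, seed=42)
    for key, expected_value in EXPECTED.items():
        assert current[key] == pytest.approx(expected_value)
```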

Mathematical proof of correctness

  • For parts of the model where theoretical results exist (like an M/M/s queue), compare simulation outputs with results from mathematical formulas (see the sketch after this list). --> Too complex
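Although judged too complex to pursue here, such a check would compare the simulated mean queue wait against the analytic value. A sketch using the Erlang C formula for an M/M/s queue, with illustrative parameter values:

```python
from math import factorial


def mms_expected_wait(arrival_rate, service_rate, n_servers):
    """Mean time in queue (Wq) for an M/M/s queue, via the Erlang C formula."""
    offered_load = arrival_rate / service_rate
    rho = offered_load / n_servers                 # utilisation, must be < 1
    erlang_c_top = offered_load**n_servers / (factorial(n_servers) * (1 - rho))
    erlang_c = erlang_c_top / (
        sum(offered_load**k / factorial(k) for k in range(n_servers)) + erlang_c_top
    )
    return erlang_c / (n_servers * service_rate - arrival_rate)


# Illustrative parameters: 4 arrivals/hour, service rate 1.5/hour, 3 servers.
print(mms_expected_wait(arrival_rate=4, service_rate=1.5, n_servers=3))
```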

Validation

N/A: this work replicates an existing model rather than re-applying it to a real system.

Metadata

Labels

enhancement (New feature or request)

Status

Done