
OSP software validation - what and how? #106

Closed
DanielMoj opened this Issue Dec 8, 2017 · 3 comments


DanielMoj commented Dec 8, 2017

Dear all,

At the moment we are internally discussing the installation/performance validation of the OSP Suite software. Does anyone have experience with what exactly has to be validated, and how, in order to comply with regulatory requirements? Additional questions: Where can I find these regulatory requirements, and do they give any instructions on how to comply with them?
Is the Installation Validator Tool that is included in the OSP installation from version 7.1 onwards sufficient?

Many thanks in advance!
Daniel

@msevestre msevestre added the question label Dec 8, 2017

Member

StephanSchaller commented Dec 13, 2017

What do you mean by performance validation?

Stephan

Member

Yuri05 commented Dec 19, 2017

@DanielMoj

  1. The OSP Suite contains a library of standard test models.

  2. For every new/changed modeling feature (e.g. new species; adjustment of anatomical/physiological parameters; new computational models; …) extensive testing is performed (e.g. comparison of simulated time-concentration profiles with published literature data, plausibility checks, etc.)

  3. Once the testing is finished, the library of standard test models is extended to include the newly implemented features for future automated testing.

  4. For every new OSP Suite version the following steps are performed:

    • All standard test models are simulated in different computational environments (different operating systems, different language settings, etc.)

    • All simulated results (e.g. time-concentration profiles in all model compartments) are (pointwise) compared with the results of the previous OSP Suite version.

      Comparison deviations must be:

      a) either below a predefined comparison threshold, or

      b) explainable by the changed modeling features (in this case the results are tested as described above, e.g. by comparison with literature data)

    • Simulated results are included in the OSP Suite Installation Package (Reference results)

  5. The Installation Validator Tool performs the following steps:

    • Simulates all standard test models on the system of the end user
    • Compares (pointwise) all simulated results with the installed Reference results
    • Creates a "single state" report ("VALID" / "INVALID"). The installation is considered valid if and only if all deviations in all models and all output curves are below a predefined comparison threshold

This ensures that all tested features produce the same results on the end user's system as they did when the features were first implemented and tested in the OSP Suite.

With this procedure, the installation validation required e.g. by the EMA guideline "Qualification and reporting of physiologically based pharmacokinetic (PBPK) modelling and simulation" is fulfilled.

[attached screenshot: ema_47]
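The pointwise comparison described in steps 4 and 5 can be sketched roughly as follows. This is an illustrative assumption of how such a check might look, not the actual Installation Validator implementation; the function name, data layout, and threshold value are all hypothetical.

```python
def validate_installation(simulated, reference, rel_threshold=1e-5):
    """Compare each output curve pointwise against the reference results.

    `simulated` and `reference` map a curve name (e.g. a compartment's
    time-concentration profile) to a list of values. The installation is
    "VALID" if and only if every point of every curve deviates from the
    reference by less than the relative threshold.
    """
    for curve, ref_values in reference.items():
        sim_values = simulated.get(curve)
        if sim_values is None or len(sim_values) != len(ref_values):
            return "INVALID"  # missing curve or mismatched time grid
        for sim, ref in zip(sim_values, ref_values):
            # Relative deviation, guarded against division by zero.
            scale = max(abs(ref), 1e-12)
            if abs(sim - ref) / scale >= rel_threshold:
                return "INVALID"
    return "VALID"
```

A single deviating point in a single curve is enough to flag the whole installation, which matches the "if and only if all deviations are below the threshold" rule stated above.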


@Yuri05 Yuri05 added the answer label Dec 19, 2017

DanielMoj commented Dec 19, 2017

@StephanSchaller
What I meant is what Yuri05 describes under point 4: "All simulated results (e.g. time-concentration profiles in all model compartments) are (pointwise) compared with the results of the previous OSP Suite version."

@Yuri05
Perfect, that's all the info I needed - thank you very much!

@msevestre msevestre closed this Apr 6, 2018
