2020-04-01 tc155 M1SS computation engine doesn't generate error when duplicate time values appear with different concentration values #201
Just an FYI, I can't find this test case, so I am unable to close the issue.
I think this issue is still present as of commit f5bfc3a. The CE provides a warning message that the issue has occurred, but it still produces parameters for the SDEID that the warning is about, as shown below:
## Warning in doTryCatch(return(expr), name, parentenv, handler): Removing
## SDEID: '250852597' due to duplicate TIME but different CONC values
## Warning in doTryCatch(return(expr), name, parentenv, handler): Removing
## SDEID: '2004533073' due to duplicate TIME but different CONC values
## Warning in doTryCatch(return(expr), name, parentenv, handler): Removing
## SDEID: '2831342063' due to duplicate TIME but different CONC values
Assessing this with the latest commit cca452d, the issue no longer occurs, so it will be closed. Profiles with duplicate times where the concentrations differ now correctly return NA for all output variables. Example below:
knitr::kable(r[r$SDEID %in% c("250852597", "2004533073", "2831342063"),1:10])
There are two requirements for the data/record duplication scenarios:
1. If TIME is duplicated but CONC is not exactly the same: ERROR condition; emit an error message and fail the profile/run.
2. If TIME is duplicated and CONC is exactly the same: emit a warning message, use one of the records, and continue to successful execution.
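The two rules above amount to a per-profile validation pass. The engine itself runs in R, but a minimal sketch of the required logic is shown below in Python for illustration; the function name `check_duplicates` and the tuple layout are hypothetical, while the TIME/CONC column names follow the issue.

```python
import warnings

def check_duplicates(records):
    """records: list of (TIME, CONC) tuples for one profile (SDEID).

    Raises ValueError when duplicate TIME values carry different CONC
    values (requirement 1); warns and deduplicates when TIME and CONC
    are both identical (requirement 2).
    """
    seen = {}      # TIME -> (1-based index of first occurrence, CONC)
    cleaned = []
    for i, (time, conc) in enumerate(records, start=1):
        if time in seen:
            j, prev_conc = seen[time]
            if conc != prev_conc:
                # Requirement 1: duplicate TIME, differing CONC -> hard error
                raise ValueError(
                    f"records {j} and {i} are duplicated with "
                    f"TIME={time} and CONC={prev_conc} and "
                    f"TIME={time} and CONC={conc}, respectively."
                )
            # Requirement 2: exact duplicate -> warn, keep one record
            warnings.warn(
                f"duplicate record at TIME={time}; using one of the records"
            )
            continue
        seen[time] = (i, conc)
        cleaned.append((time, conc))
    return cleaned
```

Under this sketch, the first scenario fails the profile before any parameters are computed, and the second proceeds on the deduplicated records.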
This test case probes the first scenario: the same TIME in PKATPD_TEST but differing concentrations in PKCNCN. When the computation engine runs, it should generate an error message recording the details of the duplication, e.g. "Error: records 22 and 23 are duplicated with TIME=11.5 and CONC=18.1 and TIME=11.5 and CONC=18.2, respectively.", and then exit gracefully.
However, the STDOUT (see below) for this example shows that the run continues with no indication of the duplicate values, leaving the parameter results in an uncertain state: they cannot be trusted, yet nothing signals that anything is wrong with the input data.
The computation engine must be updated to react as specified to both record duplication scenarios above.
Relevant portion of the input dataset:
STDOUT