There might be a case for splitting the package into multiple smaller ones, which could benefit usability, developer engagement, and long-term sustainability. For example, we could have a "core" package centred around `estimate_infections` and the corresponding model, plus the following related packages:
1. a package implementing the truncation model from `estimate_truncation`; this would make it easier to replace it with improved approaches such as those in epinowcast or dynamicaltruncation that could still feed into `estimate_infections`
2. a package implementing the `estimate_secondary` model
3. a package for the more engineered `epinow` and `regional_epinow` functions, facilitating running on HPCs, logging, etc.
4. a package for visualisation, which could be based on a base class that other packages performing similar tasks could eventually also provide, and which would remain useful once all functionality provided here is superseded by functionality in epinowcast
A downside of at least the first two points (splitting off models) is that some common Stan code, e.g. for the PMFs or convolutions, would be lost unless it was provided, for example, as a git submodule. There would also need to be some common functionality (possibly provided by the core package), e.g. for processing arguments and Stan options. Points 3 and 4 might still be worth considering even if we decide against proceeding with 1 and 2.
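To make the submodule option concrete, here is a minimal, self-contained sketch of vendoring a shared Stan-chunks repository into a split package's `inst/stan` tree. All repository names and paths are hypothetical, and local repositories are used purely for illustration (a real setup would point at a shared remote):

```shell
set -e
tmp=$(mktemp -d)

# Hypothetical standalone repo holding the shared Stan chunks
# (e.g. PMF and convolution functions).
git init -q "$tmp/stan-chunks"
( cd "$tmp/stan-chunks" \
  && echo "// shared PMF functions" > pmfs.stan \
  && git add pmfs.stan \
  && git -c user.email=dev@example.org -c user.name=dev commit -qm "add chunks" )

# Hypothetical split-off package vendoring those chunks as a submodule.
git init -q "$tmp/truncation-pkg"
( cd "$tmp/truncation-pkg" \
  && git -c protocol.file.allow=always \
       submodule add -q "$tmp/stan-chunks" inst/stan/chunks )

# The submodule is recorded in .gitmodules; contributors would run
# `git submodule update --init --recursive` after cloning.
cat "$tmp/truncation-pkg/.gitmodules"
```

The trade-off is the extra `submodule update` step for contributors after cloning; the alternative of copying the chunks into each package at build time avoids that step but risks the copies drifting apart.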
In terms of other parts of the package functionality that people make use of, I think there are the following:
1. Censored delay distribution estimation
2. Parameter estimate storage and retrieval
3. Visualisation of multiple models (i.e. the output from `regional_` functions)
4. Splitting `estimate_infections` still further into the deconvolution and generative process approaches
5. The `adjust_` family of functions
Of these, the functionality in 1 isn't anything that special. From a methodological point of view, the work I have been doing in dynamicaltruncation supersedes it. It also looks like the quickfit package being built as part of Epiverse-TRACE will replicate some of this functionality and likely provide a faster/easier user experience.
Point 2 has essentially already been superseded by another Epiverse project (epiparameter), which follows a very similar design methodology to that implemented here and so could be a drop-in replacement should this functionality be dropped.
Point 3 could form part of some kind of base visualisation package but may be better suited to another package explicitly designed for multiple estimates. I'm not sure there is a drop-in replacement for this as of yet.
Point 4 is a tricky one, as quite a lot of code is shared, but doing so would drastically reduce technical debt. It would also help surface the deconvolution approach, which is likely attractive for people prioritising speed over other factors in their pipelines.
Point 5: these existed as tools for our original multiple-step forecasting functionality (which forecast $R_t$ and then simulated infections). Tools available at the time (for example, projections) were not suitable for this, so we rolled our own. They are no longer used here but potentially remain useful for others (hence not dropping them). They would also fit nicely into another package.