
Basal tuning and ratio estimation #99

Closed
scottleibrand opened this Issue Apr 6, 2016 · 10 comments

scottleibrand commented Apr 6, 2016

Per the discussion in #58 and many preceding and subsequent discussions in various fora, it would be very useful to implement an algorithm to provide suggestions for tuning a user's current basal schedule, insulin sensitivity factor (ISF), insulin to carb (IC) ratio, and duration of insulin action (DIA).

To do so, we could use the OpenAPS automatic sensitivity detection algorithm to calculate, for each 5m data point, how much BG deltas deviated from what would’ve been predicted based on net insulin activity (relative to current schedule basals) and the user's current ratios.

For each carb entry, we can determine the sum of subsequent positive deviations (above a threshold used to detect when carbs are absorbing, perhaps 0.5 mg/dL/min). This can then be directly compared to the carb estimate to get a carb sensitivity factor (CSF, units of mg/dL per g), which can be combined with estimated ISF (units of mg/dL per U) to get IC ratio (units of g per U). (Note: while it would appear that this relies on correct carb counting, it actually does not. If the user's carb counts are biased high or low, the resulting ratios will essentially correct for that, resulting in what you might call perceived-carb ratios.)
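As a rough sketch (not the oref0 implementation; the function names are illustrative, and the 0.5 mg/dL/min threshold becomes 2.5 mg/dL per 5m interval):

```python
# Sketch of the CSF -> IC ratio arithmetic described above.
# `deviations` are per-5m BG deviations (mg/dL) following a carb entry.

ABSORBING_THRESHOLD = 2.5  # mg/dL per 5m, i.e. 0.5 mg/dL/min

def estimate_csf(deviations, carbs_g):
    """Sum the positive deviations above the absorption threshold and
    divide by the announced carbs: CSF in mg/dL per gram."""
    absorbed_impact = sum(d for d in deviations if d > ABSORBING_THRESHOLD)
    return absorbed_impact / carbs_g

def ic_ratio(isf, csf):
    """IC ratio (g/U) = ISF (mg/dL per U) / CSF (mg/dL per g)."""
    return isf / csf
```

Note that a biased carb count inflates or deflates CSF, and the bias cancels in the division, which is why the result is a perceived-carb ratio.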

To calculate ISF/DIA and tune basal schedules, we need to first exclude any data between carb announcement and 15-30 minutes after the last carb absorption is detected. We can then bucket the remaining data points according to whether their insulin activity is dominated by boluses / high-temps or by basal insulin. The former can be used to calculate ISF, and the latter to tune basal schedules. We can initially assume that ISF is constant, and then come back and fine-tune based on time of day or day-to-day variation, assuming we have sufficient data, after we’ve tuned basal schedules and DIA.

To calculate ISF, we need to take the subset of data with no carb absorption whose insulin activity is bolus / high-temp dominated as described above, and perform parameter estimation to solve for ISF. One simple (but maybe not ideal) method would be a goal seek algorithm: recalculate the deviations for each candidate ISF, and find the ISF at which the median or mean deviation (for this subset of data) converges to zero.
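A minimal sketch of such a goal seek, assuming each deviation is the observed BG delta minus the predicted delta (-activity × ISF), using bisection on the median deviation (all names are illustrative):

```python
import statistics

def median_deviation(isf, data):
    """data: list of (bg_delta, insulin_activity) pairs per 5m interval.
    Predicted delta is -activity * ISF, so deviation = delta + activity * ISF."""
    return statistics.median(delta + activity * isf for delta, activity in data)

def goal_seek_isf(data, lo=10.0, hi=200.0, tol=0.1):
    """Bisect for the ISF at which the median deviation crosses zero.
    Since activity >= 0, the median deviation rises with ISF, so a
    simple bisection converges."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if median_deviation(mid, data) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```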

To determine the best DIA, we want to select the one that minimizes the sum (or perhaps the RMS?) of all the deviations, in order to choose the insulin curve that best fits the actual observed insulin activity profile over time. This parameter could be estimated from the same subset of data used to estimate ISF, using simple goal seek or more sophisticated parameter estimation methods.
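A sketch of that selection, with `deviations_for_dia` standing in as a placeholder (not an existing function) for re-running the deviation calculation under each candidate insulin curve:

```python
import math

def rms(xs):
    """Root-mean-square of a list of deviations."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

def best_dia(candidate_dias, deviations_for_dia):
    """Pick the DIA whose recomputed deviations have the smallest RMS.
    `deviations_for_dia(dia)` is a placeholder for recalculating all
    deviations with an insulin curve of that duration."""
    return min(candidate_dias, key=lambda dia: rms(deviations_for_dia(dia)))
```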

To tune basal schedules, we can use the data points whose insulin activity is dominated by basal insulin (where net IOB is small), bucket the data by time of day, and look at the BG deviations for the timeframe ~90m later. (So for a 12am-2am basal, the deviations of interest would be for 1:30am-3:30am.) To determine how far off the scheduled basal is for each time bucket, we can divide the median deviation (units of mg/dL per hour) by the previously-estimated ISF (units of mg/dL per U) to get a basal offset in U/hr.
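The bucketing and unit conversion above might be sketched as follows (the data shape and the 90-minute shift are illustrative):

```python
import statistics
from collections import defaultdict

def basal_offsets(points, isf, shift_minutes=90):
    """points: (minutes_since_midnight, deviation in mg/dL per hour) pairs
    for low-net-IOB intervals. Each deviation is attributed to the basal
    hour ~90 minutes earlier, then converted to a U/hr offset by dividing
    the bucket's median deviation by ISF (mg/dL per U)."""
    buckets = defaultdict(list)
    for minute, dev in points:
        basal_hour = ((minute - shift_minutes) % 1440) // 60
        buckets[basal_hour].append(dev)
    return {hour: statistics.median(devs) / isf
            for hour, devs in buckets.items()}
```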

danamlewis commented May 4, 2016

Redirecting/extending this idea: We can do this real-time (i.e. dynamic tuning) and not just as a one-off.

  • Do the auto-sensitivity calculation, but also do a version of it on an hour by hour basis, so we can compare the same hour (i.e. 2-3am) across multiple days.
  • Per the original description @scottleibrand outlined above, we need to exclude periods of carb activity; working hour by hour will give us more time frames without carb activity.
  • We can average a certain number of days (3+ ideal) for the same hour block to come up with a dynamic recommendation of what the basal needs likely are for that hour.
  • We should use a combination of auto-sensitivity for 24 hours (perhaps 50%) and the time-block series analysis (the other 50%, in proportion to the number of days available for the time series) to determine the baseline basal. Note that if the time series analysis cannot be performed (i.e. if a person ALWAYS eats lunch at noon and every day has carb activity from noon to 2pm), we may want to consider reverting to either a) 100% 24-hr-calculated auto sensitivity adjustment to preset pump basal rates (as it is done now) or b) 50% 24-hr-calculated auto sensitivity plus 50% weighted average of the preceding and following time blocks (in this example, an average of the 11am-noon and 2pm-3pm blocks).
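The blending described above, with the time-block share scaled by how many days of data are available, could be sketched like this (all names and the 3-day cap are assumptions for illustration):

```python
def blended_basal(pump_basal, autosens_ratio, block_estimates, max_days=3):
    """Blend the 24-hr autosens adjustment (50%) with the per-hour
    time-block average (up to 50%, scaled by how many carb-free days of
    data exist for this block, capped at max_days). With no usable days,
    fall back to autosens-only, as the loop does today."""
    autosens_basal = pump_basal * autosens_ratio
    if not block_estimates:
        return autosens_basal
    weight = 0.5 * min(len(block_estimates), max_days) / max_days
    block_avg = sum(block_estimates) / len(block_estimates)
    return (1 - weight) * autosens_basal + weight * block_avg
```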

Per @bewest's recent demonstration of how to append, we should be able to add small files to make it easier to build this longer time series. We may be able to do something like create a basal file that OpenAPS can write the basal rates by hour to, and then pull from this file (with any calculations adjusting these rates updating the file) when calculating the needed temp basal rates at that time based on BG changes. If it can't read from the file (i.e. offline mode), it'll revert to the pump-programmed basal rates as it does now. (A thing to think about, though, is what the adjusted calculated rates do in terms of calculating netIOB, if anything.)
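A sketch of that file-with-fallback behavior (the file name and format are assumptions, not an existing OpenAPS report):

```python
import json

def current_basal(hour, pump_rates, basal_file="autotuned_basals.json"):
    """Read the auto-adjusted basal (U/hr) for this hour from a local
    file (assumed format: {"0": 0.8, "1": 0.85, ...}); fall back to the
    pump-programmed schedule if the file is missing, unreadable, or
    lacks this hour -- mirroring today's offline behavior."""
    try:
        with open(basal_file) as f:
            rates = json.load(f)
        return rates[str(hour)]
    except (OSError, KeyError, ValueError):
        return pump_rates[hour]
```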

Per @TC2013 and @Sulka, this should help us improve basal adjustments for people with widely varying sensitivity and basal needs over the course of the day, whether it's an adjustment on a per-day basis (i.e. sick day, bad pump site day), or a weekly basis (i.e. growth/hormone spurt/change).

This can also be a tool that can be run before people start looping, as a way to suggest adjustments to their existing basal rates as part of the open loop/before-looping process, since it'll just require reading the data, which is already being done at the beginning of the loop build process.

scottleibrand commented May 15, 2016

More detailed algorithm for implementing this:

  • Prep data:
    • Download historical treatments and glucose data from Nightscout.
    • Use json tool to merge all new pumphistory and glucose data to the end of the long-running historical treatments and glucose data files.
  • Tune basals:
    • Divide each day’s data according to the user’s pre-programmed (or previously auto-adjusted) basal schedules.
    • Select data points ~90m later where net IOB is small, and therefore insulin activity is dominated by basal insulin, and exclude any data between carb announcement and 15-30 minutes after the last carb absorption is detected.
    • Look back as many days as are required to get sufficient data to run the autosens algorithm, and/or combine scheduled basal blocks with adjacent ones.
    • Run the autosens algorithm for each basal in the schedule, and output the resulting adjusted basal schedule.
    • Periodically repeat the basal tuning algorithm starting with the adjusted basal schedules (and any new BG and pump data) until the resulting adjustments converge.
  • Calculate CSF
    • Identify whether each carb entry was fully absorbed before the next carbs were ingested (use AMA COB calculation algorithm, with a minimum assumed carb absorption rate of something like 3 mg/dL/5m). Combine stacked carbs into a single meal entry at the time of the first carbs in the series.
    • Identify the time at which the last carb absorption is detected for each meal entry.
    • For each meal entry, determine the sum of subsequent positive deviations (above a threshold used to detect when carbs are absorbing, perhaps 0.5 mg/dL/min) between the time of first carb ingestion and the time of last carb absorption.
    • Compare that total BG impact with the carb estimate to calculate carb sensitivity factor for that meal. Combine CSFs for all meals in the timeframe of interest to calculate the average/median CSF for that timeframe.
    • When sufficient data is available, determine whether CSF varies by time of day (more than it varies day to day) by calculating it separately for each ISF and IC ratio schedule timeframe.
    • Output the resulting calculated CSF schedule.
  • Calculate ISF & DIA
    • Exclude data between carb announcement and 15-30 minute after the last carb absorption is detected.
    • Select remaining data points where net IOB is large relative to scheduled basal, and therefore insulin activity is dominated by bolus / high-temp insulin.
    • Perform a parameter estimation function to solve for ISF. One simple (but maybe not ideal) method to do so would be to perform a simple goal seek algorithm, recalculating the deviations for each new ISF to find the ISF that converges on a median or mean deviation (for this subset of data) of zero.
    • Perform a similar parameter estimation to solve for DIA, by minimizing the sum (or perhaps the RMS?) of all the deviations, in order to choose the insulin curve that best fits the actual observed insulin activity profile over time.
    • When sufficient data is available, determine whether ISF varies by time of day (more than it varies day to day) by calculating it separately for each ISF and IC ratio schedule timeframe.
    • Output the resulting adjusted ISF schedule.
  • Calculate IC ratio
    • Divide calculated ISF (units of mg/dL per U) by calculated CSF (units of mg/dL per g) to get IC ratio (units of g per U).
    • Output the calculated IC ratio schedule.
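The stacked-carb merging in the CSF step above could be sketched as follows (the `absorbed_before` callback is a stand-in for the AMA COB absorption check; all names are illustrative):

```python
def combine_stacked_carbs(entries, absorbed_before):
    """entries: (time_minutes, carbs_g) pairs sorted by time.
    absorbed_before(i) -> True if entry i finished absorbing before
    entry i+1 began. Stacked entries are merged into a single meal
    at the time of the first entry in the series."""
    meals = []
    for i, (t, g) in enumerate(entries):
        if meals and not absorbed_before(i - 1):
            t0, g0 = meals[-1]
            meals[-1] = (t0, g0 + g)  # still absorbing: merge into prior meal
        else:
            meals.append((t, g))
    return meals
```
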
scottleibrand commented Jul 18, 2016

Figured out how to use jq to merge multiple time series json files together and append new data to long-running json file: scottleibrand/openaps-sh@bc7db9f
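The linked commit uses jq; an equivalent merge-and-append sketch in Python (the `dateString` key is an assumption about the entry format) might look like:

```python
import json

def append_new_entries(history_path, new_path, key="dateString"):
    """Merge newly downloaded entries into a long-running history file,
    de-duplicating on a timestamp field (`key` is an assumption about
    the entry format) and keeping the result sorted."""
    with open(history_path) as f:
        history = json.load(f)
    with open(new_path) as f:
        new = json.load(f)
    seen = {e[key] for e in history}
    history.extend(e for e in new if e[key] not in seen)
    history.sort(key=lambda e: e[key])
    with open(history_path, "w") as f:
        json.dump(history, f)
```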

scottleibrand commented Jul 24, 2016

To download historical and glucose data from Nightscout:

```shell
openaps device add ns-glucose-7d process 'bash -c "curl -m 30 -s \"$NIGHTSCOUT_HOST/api/v1/entries/sgv.json?find\[date\]\[\$gte\]=0\&count=2016\""'
openaps report add cgm/ns-glucose-7d.json text ns-glucose-7d shell
openaps invoke cgm/ns-glucose-7d.json
```
jasoncalabrese commented Jul 29, 2016

This mongo map/reduce script from @sulkaharo might be a good way to handle the processing, instead of trying to do it all locally on the Edison: https://gist.github.com/sulkaharo/18d4b06c1288a0b14da48e7a5cd46d87

beached commented Oct 27, 2016

I used to have a program that would take Medtronic CSVs and generate this (I'm still looking for the source). But this is an example of binning the data into time periods where the influence is dominated by liver/basal insulin. The units are delta mmol/L, i.e. the aggregate first derivative/rate of change during those time periods (sorry about the scale). This covers Aug 22 - Sept 23, 2014.
[image: BG rate-of-change (delta mmol/L) binned by time of day, Aug 22 - Sept 23, 2014]

beached commented Oct 27, 2016

One thing I observed is that getting enough data to build out a full day can be a challenge. There are actions that can enhance this, and a system that indicates which time periods have been covered recently could help. However, the following observation also helped: once one finds the rough basal doses, subsequent changes are often global rather than localized to a single time period. That lets one use overnight fasting periods as an indicator of potential changes needed elsewhere.

danamlewis commented Dec 28, 2016

(For anyone following this issue, #99: you may also want to keep an eye on #261 (#261 (comment)), where we are working on iterative autotuning using some of the framework described here; it may ultimately achieve both one-off basal tuning and/or more iterative ongoing tuning adjustments.)

scottleibrand commented Feb 7, 2017

Is there anything left to do here? Or does #261 resolve this?

danamlewis commented Feb 7, 2017

#261 covers #99 (this one), so closing; anyone who wants to track tuning should follow #261.

danamlewis closed this Feb 7, 2017
