
documentation for suppressed inside a basal #12

Merged: 9 commits merged into master from jebeck/suppressed-doc on Oct 20, 2016

Conversation
Conversation

jebeck (Contributor) commented Oct 10, 2016

Ready for discussion @gniezen and @darinkrauss

There's one place where I called out a TODO in particular for discussion, but LMK what else jumps out at you.

It may be easiest to spin it up locally and read it in the GitBook form - just `npm start` for that, in case you've forgotten.

jebeck (Contributor Author) commented Oct 10, 2016

gniezen (Member) commented Oct 10, 2016

The description of `suppressed` info sounds clear to me. I'm going to start implementing it in the Medtronic driver, and will comment if there are any issues. Maybe we can discuss the `expectedDuration` issue live after standup sometime this week?

jebeck (Contributor Author) commented Oct 10, 2016

Yeah, let's aim to discuss after standup sometime. Tuesday or Wednesday should work for me (but likely not Thurs).

darinkrauss (Contributor) left a comment:

I added some comments that I think need to be either documented or discussed/resolved.


<!-- TODO: DISCUSS WITH GERRIT AND DARIN!!! -->
**NB:** A *known* issue with this data model is that when a `temp` or `suspend` basal is programmed for a certain `duration` and crosses more than one schedule boundary but then is *canceled* early within one of the "middle" (not edge) segments, we have no good way to represent the original `expectedDuration` of the *entire* programmed `temp` or `suspend`. Rather, the `expectedDuration` on a middle segment of a three-or-more segment `temp` or `suspend` basal should be the expected `duration` of *that* segment from the basal schedule.
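
For illustration only (this sketch is not part of the doc under review; the field subset and values are hypothetical, and durations are assumed to be in milliseconds), here is roughly how a three-hour temp that crosses two schedule boundaries and is then canceled part-way splits into segments:

```js
// Hypothetical: a 3-hour temp programmed at 22:30, crossing schedule boundaries
// at 23:00 and 00:00, then canceled at 23:30 (inside the middle segment).
const HOUR = 60 * 60 * 1000; // durations assumed to be in milliseconds

const tempSegments = [
  // 22:30-23:00: runs to completion up to the first schedule boundary
  { type: 'basal', deliveryType: 'temp', duration: 0.5 * HOUR,
    suppressed: { type: 'basal', deliveryType: 'scheduled' } },
  // 23:00-23:30: canceled half-way through; per the NB above, expectedDuration
  // is the scheduled length of *this* segment (1 hour), not the original 3 hours
  { type: 'basal', deliveryType: 'temp', duration: 0.5 * HOUR, expectedDuration: HOUR,
    suppressed: { type: 'basal', deliveryType: 'scheduled' } },
  // the final 00:00-01:30 segment never materializes because of the cancellation
];
```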
darinkrauss (Contributor):

Do we want to retain the information about the length of the original temp basal (3 hours, in this case)? Do we want that in one place, or are we okay with assembling it from the 3 or more segments? If 3 or more segments, how do we tell whether a temp basal segment "belongs" to the previous segment? (Yes, you could assume so if the times align perfectly, but theoretically it could have been done manually by a user. For example, for my ordered mind (okay, OCD) I prefer setting temp basals to end at an easily remembered time. So, if it is 12:37am and I want it to go for 2 1/2 hours, I'll actually set it for 2:23 hours so that I can easily remember that it ends at 3am. I might set an alarm for 3am to see what the number is, and might set another temp basal at exactly 3am for another couple of hours. Okay, this example is a bit weird (and/or I am), but you can see what can go wrong when we make implicit assumptions rather than using explicit data. Actually, this can become more of a problem when we consider any sort of "automated" temp scheduling.)

darinkrauss (Contributor):

Hmm, what about using something like the `previous` field, or some connection between all of the related segments?

jebeck (Contributor Author):

I think you're weird (😉), but I do see your point. I'm really not entirely happy with the model we have here, with the splitting into segments, but I'm not sure what to propose instead - a good question for when we discuss this live. The suggestion of bringing back `previous` is interesting, but it would be a change from how we've gone forward (thus far): the platform doesn't use `previous`, only legacy jellyfish does.

darinkrauss (Contributor):

To close this loop, I will add a proposal into the "20160815 Proposed Data Schema Changes" document about potentially adding a new event type that describes the original "user action" to set a temporary basal. It will include the more immediate proposal to add this information into the payload of the first segment of the temporary basal.

In general, this `suppressed` object need only, and **must** only, contain the bare minimum of information:
- the `type` (= `basal`)
- the `deliveryType` (only `scheduled` if the currently active basal is a `temp`; otherwise (i.e., if the currently active basal is a `suspend`), may be `scheduled` *or* `temp`)
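
A minimal sketch of what that bare minimum looks like on a `temp` basal (values are hypothetical, and the duration is assumed to be in milliseconds; see the linked `temp`/`suspend` docs for the full allowed shape):

```js
const tempBasal = {
  type: 'basal',
  deliveryType: 'temp',
  duration: 1800000, // assumed milliseconds
  percent: 0.5,
  suppressed: {
    type: 'basal',            // always 'basal'
    deliveryType: 'scheduled' // per the rule above, a temp can only suppress a scheduled basal
  }
};
```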
darinkrauss (Contributor):

There is production data that has temp basal suppressing temp basal suppressing temp basal suppressing temp basal suppressing temp basal suppressing scheduled basal. So `deliveryType` can be `temp` when the currently active basal is a `temp`, as well. For an example, see: https://gist.github.com/darinkrauss/7a91115d0d3475e4b11e559bf6dcdcaf
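
For intuition, the kind of nesting being described is roughly this shape (values omitted, levels abbreviated; not taken from the linked gist):

```js
// Shape only: temp suppressing temp ... suppressing scheduled, as seen in some
// jellyfish-ingested (e.g., CareLink) production data.
const productionTemp = {
  type: 'basal',
  deliveryType: 'temp',
  suppressed: {
    deliveryType: 'temp',
    suppressed: {
      deliveryType: 'temp',
      suppressed: { deliveryType: 'scheduled' /* ... */ }
    }
  }
};
```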

jebeck (Contributor Author):

I suspect that the feedback here is that the proposal needs to do a better job of explaining that the restrictions described are not true of jellyfish-ingested data (and thus much production data), because `suppressed` basically wasn't validated at all (beyond having to be an object). Is your objection really that the data model should be different, or that I need to do a way better job of calling out that this doc mostly describes platform-only (proposed) constraints, reflecting my interpretation of the data model's actual intent?

Also, production data includes known very bad data (like that uploaded during early-stage development of the Tandem driver by contractors who weren't super familiar with our data model), so in general I'm not inclined to care much about any old thing that exists in production data. Do you have a different instinct there?

Your example here is CareLink, of course, but that's also code that's on its way out...

darinkrauss (Contributor):

Yes, we need to clarify that this is the "going forward" data model, not necessarily what folks would see if they invoked the tide-whisperer API to get existing user data.

I do not think we can just ignore old, bad data in the long term. If we expect to build a vibrant third-party ecosystem, then we have to ensure that the data conforms to some sort of standard. We could, of course, choose to simply mark any older data as "bad", indicating that it doesn't conform to the documented model.

jebeck (Contributor Author):

Hmm. Lots to discuss. I really don't understand how I can document something that had zero constraints, or why it would be useful to do so - a lot of it's just bad, and I think my recommendation to client consumers of data would be to filter out anything that doesn't match the proposed restrictions. A subset of the fields on some of the old data will match, and we can pluck those out to use (i.e., a subset of the fields at the top level), but I think the example you posted is pretty nonsensical and not a candidate for visualization... I'd call it the result of a bug in the CareLink parser/basal logic that got through because of the lack of constraints imposed by jellyfish's (non-)validation of `suppressed`.

darinkrauss (Contributor):

To close this loop, please add documentation that states that the data model represents what we are working towards, and that existing data and code are not necessarily aligned with this (e.g. additional fields, additional levels of `suppressed`).

(More details on the precise allowed shape of a nested `suppressed` are available in the [`temp` basal documentation](./temp.md#suppressed) and the [`suspend` basal documentation](./suspend.md#suppressed).)

Note in particular that we do **not** include any timestamp or duration information in the `suppressed`: by definition, these values are always equal to those of the *active* basal interval, and so it is not necessary to specify them.
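
In other words, a consumer can always recover the suppressed delivery's interval from the wrapping event; a minimal sketch of that going-forward rule (helper name hypothetical):

```js
// The suppressed object carries no time or duration of its own; by definition
// its effective interval is that of the active basal event that wraps it.
function suppressedInterval(activeBasal) {
  return { time: activeBasal.time, duration: activeBasal.duration };
}
```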
darinkrauss (Contributor):

This is not true with actual production data. See https://gist.github.com/darinkrauss/7a91115d0d3475e4b11e559bf6dcdcaf

Also, this isn't just legacy data; it is data going in now.

jebeck (Contributor Author):

But it's data going in now from the CareLink code, which is currently being replaced.

darinkrauss (Contributor):

See above about additional documentation.

Numerical type: Floating point value representing a percentage, where 1.0 represents 100%.
Range:
    min: 0.0
    max: 10.0
darinkrauss (Contributor):

1.0?

jebeck (Contributor Author):

What's the question? We represent 0 to 100% as a float from 0.0 to 1.0, so 0.1 is 10%, etc.

The reason it goes to 10.0 is that people can, and often do, program temps with percentages greater than 100%.
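
A quick sketch of that convention (helper name hypothetical):

```js
// `percent` is stored as a float where 1.0 === 100%; pumps allow temps above
// 100%, hence the documented 0.0 to 10.0 range.
function percentToDisplay(percent) {
  return `${Math.round(percent * 100)}%`; // 0.1 -> '10%', 1.5 -> '150%'
}
```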

darinkrauss (Contributor):

Right, that makes sense now.

numericalType: 'Floating point value representing a percentage, where 1.0 represents 100%.',
range: {
  min: '0.0',
  max: '10.0'
}
darinkrauss (Contributor):

1.0?

darinkrauss (Contributor):

Got it.

jebeck (Contributor Author) commented Oct 16, 2016

Pinging @darinkrauss and @gniezen: I believe I've made the changes we discussed (principally splitting the doc into two - one for the new `suppressed` specification/proposal, one for legacy). Ready for another round of review.

darinkrauss (Contributor):
LGTM :shipit:

jebeck merged commit 450a3dd into master on Oct 20, 2016
jebeck deleted the jebeck/suppressed-doc branch on October 20, 2016 at 03:53