Interface refactor #16
Develop a better system of managing UI and interactive elements on the screen. This became a higher priority after I realized that UI became my bottleneck in developing the NIBP system (PR #15).
Goals of this refactor:
Starting to develop a new scheme where there is a master `Screen` object which contains a `Vector` of `ScreenElement` objects (see docs/interface.md for more details).
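The PR doesn't include the class definitions themselves, but the master-`Screen`-owns-`ScreenElement`s scheme can be sketched roughly as follows. This is a hypothetical illustration, not the code from `inc/screen.h`: `std::vector` stands in for the repo's `Vector`, and `Label`, `draw`, and `addElement` are invented names.

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// Base class for anything the Screen can manage and draw.
class ScreenElement {
public:
    virtual ~ScreenElement() = default;
    virtual void draw() = 0;  // each element knows how to render itself
};

// Example concrete element (hypothetical); counts draws for demonstration.
class Label : public ScreenElement {
public:
    void draw() override { ++draw_count; }
    int draw_count = 0;
};

// Master object owning a collection of elements, as described in the PR.
class Screen {
public:
    void addElement(std::unique_ptr<ScreenElement> e) {
        elements_.push_back(std::move(e));
    }
    // Redraw every owned element in insertion order.
    void update() {
        for (auto& e : elements_) e->draw();
    }
    std::size_t size() const { return elements_.size(); }
private:
    std::vector<std::unique_ptr<ScreenElement>> elements_;
};
```

The appeal of this layout is that `main.cpp` only ever talks to the `Screen`, which fans operations out to whatever elements are currently registered.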
Moved the signal-based interface elements into their own file. Also began work on managing touch events using the new screen -> screen element system. This is a work in progress; see the TODO note in inc/screen.h.
Added callback functions to Button objects; these are called when the Screen they are associated with checks for touch events. Also performed general codebase maintenance, renaming, and reorganization of variables, and updated the docs to reflect that I ditched the proposed Action system in favor of callback functions, which were simpler and clearer in every respect.
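A minimal sketch of the callback scheme, assuming the Screen forwards each touch point to its Buttons and a Button fires its stored callback on a hit. The `Point`, `handleTouch`, and constructor signatures here are hypothetical stand-ins for whatever the repo actually defines:

```cpp
#include <functional>
#include <utility>

struct Point { int x, y; };  // hypothetical touch coordinate type

class Button {
public:
    Button(int x, int y, int w, int h, std::function<void()> cb)
        : x_(x), y_(y), w_(w), h_(h), callback_(std::move(cb)) {}

    // Called by the Screen during its touch check: fire the callback
    // only if the touch point lands inside this button's bounds.
    void handleTouch(Point p) {
        bool inside = p.x >= x_ && p.x < x_ + w_ &&
                      p.y >= y_ && p.y < y_ + h_;
        if (inside && callback_) callback_();
    }

private:
    int x_, y_, w_, h_;
    std::function<void()> callback_;
};
```

Compared with a separate Action system, this keeps the behavior next to the button that triggers it: the code constructing a `Button` supplies the lambda, and the Screen never needs to know what any button does.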
Updated docs to describe the lifecycle hooks that occur during every screen update, and changed `main.cpp` to more clearly reflect when these hooks are called. In keeping with this, `listenForTouch` was renamed to `propogateTouch`, and the full touch screen cycle was moved inside this function. The call sequence for a screen lifecycle iteration is now:

```cpp
screen.update(some_delay);
screen.propogateTouch();
```

where `some_delay` is a delay period that allows the display chip to finish rendering all updated elements.
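Put together, the main loop reduces to repeating those two calls. The stub below is only an illustration of the sequencing, with invented member bodies; the `propogateTouch` spelling matches the repo's identifier:

```cpp
// Hypothetical Screen stub; the real methods live in inc/screen.h.
class Screen {
public:
    void update(int delay_ms) { ++updates; (void)delay_ms; }
    void propogateTouch() { ++touch_checks; }
    int updates = 0;
    int touch_checks = 0;
};

// One lifecycle iteration per the docs: render pass (the delay lets the
// display chip finish drawing), then poll and dispatch touch events.
inline void runLifecycle(Screen& screen, int iterations, int some_delay) {
    for (int i = 0; i < iterations; ++i) {
        screen.update(some_delay);
        screen.propogateTouch();
    }
}
```

Keeping the touch check strictly after the render pass means a button can never receive a touch while its on-screen state is stale.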