
UB in safe API if LVGL is not initialized #69

Closed
nia-e opened this issue Mar 6, 2023 · 7 comments · Fixed by #79
Labels
bug (Something isn't working) · question (Further information is requested)

Comments

@nia-e
Collaborator

nia-e commented Mar 6, 2023

Attempting to call any function that requires memory allocation on the LVGL side before lvgl::init() is called causes undefined behaviour (usually resulting in a segfault). This technically ought to make all memory-allocating functions unsafe, since safe code can currently call them and trigger this bug, though obviously marking a good chunk of the API as unsafe is not desirable. Possibilities I see for fixing this:

Dynamically check if LVGL is initialized at runtime

Making all LVGL allocating/reading functions call lvgl_sys::lv_is_initialized() and error on false is probably the most obvious (and safe) way to fix this. It would definitely prevent any UB and force the caller to actually initialize LVGL instead of hitting a cryptic error. However, it carries a (small) runtime performance penalty, since it requires calling a non-inlineable external function, and we would need to remember to add the check to every new API in the future, along with figuring out precisely which APIs require it so we don't perform the check more often than necessary. We could probably avoid the first issue by checking the existing RunOnce in the initializer and inlining that, but the second issue remains. This could also feasibly be done by patching LVGL to perform the check in its allocator, but I fail to see how that is anything other than an all-around worse form of the above.
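A rough sketch of what such a guard could look like, using a stdlib atomic flag as a stand-in for the real lvgl_sys::lv_is_initialized() call (all names here — LvError, ensure_init, create_label — are hypothetical, not the crate's actual API):

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Stand-in for LVGL's global initialized state; the real crate would
// consult lvgl_sys::lv_is_initialized() or its own initializer state.
static LVGL_INITIALIZED: AtomicBool = AtomicBool::new(false);

#[derive(Debug, PartialEq)]
pub enum LvError {
    Uninitialized,
}

pub fn init() {
    // Real code would call lvgl_sys::lv_init() here.
    LVGL_INITIALIZED.store(true, Ordering::SeqCst);
}

/// The check every memory-allocating API would perform on entry.
fn ensure_init() -> Result<(), LvError> {
    if LVGL_INITIALIZED.load(Ordering::SeqCst) {
        Ok(())
    } else {
        Err(LvError::Uninitialized)
    }
}

/// Example allocating API: errors instead of hitting UB when
/// LVGL has not been initialized.
pub fn create_label() -> Result<(), LvError> {
    ensure_init()?;
    // ... the actual lvgl_sys allocation would happen here ...
    Ok(())
}

fn main() {
    assert_eq!(create_label(), Err(LvError::Uninitialized));
    init();
    assert!(create_label().is_ok());
}
```

The upside is that the public API shapes stay unchanged; the cost is the per-call check and the discipline of remembering it in every new allocating function.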

Create a dummy LvMemHandler

Moving all memory-allocating APIs out of being globally callable and instead making them methods on a dummy LvMemHandler struct, whose constructor checks the status of LVGL and initializes it if needed, is probably the way to go. This would make the API somewhat less intuitive and harks back to the UI struct (though even UI was not UB-free, as e.g. Style objects could be initialized before the UI was even instantiated). That said, it's probably the option that would lead to the fewest headaches long-term (if only for us). One way to simplify it would be to have constructors take a reference to this struct, proving that it exists and is initialized, without disturbing the rest of the API too much; but that leads to the same hassle of figuring out which constructors do or don't require it as in option 1, and requiring every memory-allocating function to receive a reference is probably overkill and clunky.
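A minimal sketch of the token idea, assuming a zero-sized handler whose mere existence proves initialization happened (LvMemHandler is from the proposal above; create_style and Style are hypothetical placeholders):

```rust
use std::sync::Once;

static INIT: Once = Once::new();

/// Zero-sized token: holding one proves LVGL has been initialized,
/// because the only way to get one is through `new`.
pub struct LvMemHandler {
    _private: (), // prevents construction outside this module
}

impl LvMemHandler {
    pub fn new() -> Self {
        // Initialize LVGL exactly once; real code would call
        // lvgl_sys::lv_init() inside the closure.
        INIT.call_once(|| { /* lvgl_sys::lv_init() */ });
        LvMemHandler { _private: () }
    }

    /// Allocating APIs become methods (or take &LvMemHandler), so safe
    /// code cannot reach them before initialization.
    pub fn create_style(&self) -> Style {
        Style
    }
}

pub struct Style;

fn main() {
    let handler = LvMemHandler::new();
    let _style = handler.create_style();
    // The token carries no data; the proof is purely in the type system.
    assert_eq!(std::mem::size_of::<LvMemHandler>(), 0);
}
```

Because the token is zero-sized, passing a reference to it has no runtime cost; the price is paid entirely in API ergonomics.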

Catch SIGSEGV and panic with a more understandable error

This is probably the worst option of the three, but I'm including it since it has some merit: it requires no added API complexity and incurs no runtime overhead. However, by the time a segfault fires, UB has already occurred, so exiting the program "gracefully" would just lead to more UB by e.g. attempting to drop malformed values (since panic!() still calls Drop) or instantiating uninitialized values (if only for a fraction of a second), which is still UB. This is especially problematic if the offending binary has elevated privileges or is running on bare metal, since it might mess up the entire system. It also would not tell us whether the segfault resulted from user error (not initializing LVGL) or from some other bug in lvgl-rs (or in LVGL itself).

I haven't quite decided what I want to do yet - probably option 1, for API prettiness reasons, but I could be convinced option 2 is better. I doubt anyone short of Ralf Jung could convince me option 3 is at all okay, but I'm including it as I guess it could be made to work. Also, I'd be happy to hear any other suggestions.

@nia-e nia-e added the bug and question labels Mar 6, 2023
@cydergoth

This comment was marked as off-topic.

@nia-e
Collaborator Author

nia-e commented Mar 7, 2023

What specifically are you suggesting?

@rafaelcaricio
Collaborator

I've thought A LOT about this in the past. It is one of the major frustrations I had while trying to think about a nice API for this crate.

In the end, I came to peace with the simplest solution possible, which is letting the people using this library remember to call lvgl::init() and letting them take the responsibility. Unfortunately, the LVGL C API makes heavy use of global state, and my first attempt with the lvgl::UI struct was to try to contain misuse. But that landed us in a very cumbersome API. Since LVGL is meant to be used in constrained environments, adding any overhead at runtime could be a dealbreaker.

@cydergoth

This comment was marked as off-topic.

@nia-e
Collaborator Author

nia-e commented Mar 7, 2023

> I've thought A LOT about this in the past. It is one of the major frustrations I had while trying to think about a nice API for this crate.
>
> In the end, I came to peace with the simplest solution possible, which is letting the people using this library remember to call lvgl::init() and letting them take the responsibility. Unfortunately, the LVGL C API makes heavy use of global state, and my first attempt with the lvgl::UI struct was to try to contain misuse. But that landed us in a very cumbersome API. Since LVGL is meant to be used in constrained environments, adding any overhead at runtime could be a dealbreaker.

There is another option a friend suggested in private. The static_init crate could let us initialize LVGL as soon as possible, by registering a function that initializes it to run before main() in all crates that compile us in. Since that might be undesirable and we might want to at least allow users to not initialize LVGL, we could also e.g. set a static mut UNSAFE_LVGL_NO_AUTO_INIT: bool = false with priority 0 in static_init, and then call lvgl::init() at priority 2 if the flag is still false (allowing the user to unsafely set it to true at priority 1). We could perhaps provide a macro for this (again, clearly marked as unsafe).
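The opt-out logic itself is straightforward; the sketch below shows it with plain stdlib atomics, with the priority-ordered pre-main registration (which static_init would provide) replaced by an explicit call at the top of main(). The flag and function names mirror the comment; how static_init's priority attributes are spelled is not shown here:

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// In the proposal this would be a static_init dynamic static at
// priority 0, which the user could (unsafely) flip at priority 1.
static UNSAFE_LVGL_NO_AUTO_INIT: AtomicBool = AtomicBool::new(false);

// Stand-in for LVGL's initialized state.
static LVGL_READY: AtomicBool = AtomicBool::new(false);

// In the proposal this would be registered to run before main() at
// priority 2; here we call it explicitly to keep the sketch runnable.
fn auto_init() {
    if !UNSAFE_LVGL_NO_AUTO_INIT.load(Ordering::SeqCst) {
        // Real code would call lvgl::init() here.
        LVGL_READY.store(true, Ordering::SeqCst);
    }
}

fn main() {
    auto_init();
    // Default: the user did not opt out, so LVGL was auto-initialized.
    assert!(LVGL_READY.load(Ordering::SeqCst));
}
```

The actual ordering guarantees would come entirely from static_init's priority mechanism, which is exactly the part this stdlib-only sketch cannot demonstrate.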

@nia-e
Collaborator Author

nia-e commented Mar 7, 2023

> So type builder is an extension of the builder pattern in which you return a different type after various steps of building. In this case, the initial type would only expose the init methods, but calling that would return a type with the rest of the methods exposed. This article goes into some more detail: https://dev.to/mindflavor/rust-builder-pattern-with-types-3chf

Ah, thanks! But I worry this gets into the same issue as the UI struct we used to have, namely that the API ends up being grossly unintuitive and a far cry from the neat LVGL API
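For reference, the type-state pattern being quoted can be sketched like this (Lvgl, LvglInitialized, and create_label are all hypothetical names, not anything in lvgl-rs):

```rust
/// Uninitialized state: only `init` is exposed.
pub struct Lvgl;

/// Initialized state: the rest of the API hangs off this type.
pub struct LvglInitialized;

impl Lvgl {
    /// Consumes the uninitialized handle and returns the initialized one.
    pub fn init(self) -> LvglInitialized {
        // Real code would call lvgl_sys::lv_init() here.
        LvglInitialized
    }
}

impl LvglInitialized {
    pub fn create_label(&self) -> &'static str {
        "label"
    }
}

fn main() {
    let lvgl = Lvgl;
    // lvgl.create_label(); // does not compile: the method only
    //                      // exists on the post-init type
    let ui = lvgl.init();
    assert_eq!(ui.create_label(), "label");
}
```

Misuse becomes a compile error rather than a runtime check, but every allocating API now has to be threaded through the state-holding value, which is exactly the ergonomic cost the thread is worried about.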

@rafaelcaricio
Collaborator

rafaelcaricio commented Mar 7, 2023

I like the idea of the static_init crate enabled by default and with the configuration to opt-out.

@nia-e nia-e closed this as completed in #79 Mar 29, 2023