Add unit testing framework and skeleton test project #2058

Open
1 task done
joelghill opened this issue May 6, 2024 · 4 comments
Comments

@joelghill

Verification

  • I searched for similar issues (including closed issues) and found none that were relevant.

Introduce the issue

Unit testing can improve code quality, help developers find bugs, and make it easier to add new features.

After looking into the project, I see that there are no unit tests for InfiniTime. I would like to suggest that the community adopt a unit testing framework and add documentation to the source code on how to include test coverage for new or existing code.

Preferred solution

After some initial research, it looks like Google Test is a framework that is easy to integrate into an existing project and has some nice features. There are a lot of unit testing frameworks out there for C++, though, so I think the community should discuss this in case people have preferences for other solutions.

For the scope of this issue, I think the chosen framework should be installed and a skeleton test project should be added to the source code. It should also be well documented and provide a rough outline of how future tests could be added.
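
For illustration, here is a rough sketch of what a first test in such a skeleton project could look like if Google Test were chosen. The test name and assertion are placeholders, not real InfiniTime code, and this assumes the framework is pulled in by the build system:

```cpp
// Hypothetical first test in a skeleton test project, assuming Google Test
// has been made available through the build system.
#include <gtest/gtest.h>

// A trivial sanity check proving that the test target builds and runs.
TEST(SkeletonTest, FrameworkIsWiredUp) {
  EXPECT_EQ(1 + 1, 2);
}
```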

Version

No response

@NeroBurner
Contributor

I personally prefer Catch2 as it has much nicer syntax.
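
For comparison, a rough sketch of Catch2 (v3) syntax; the test case, tag, and values below are placeholders rather than InfiniTime code:

```cpp
// Illustrative Catch2 (v3) test case showing the TEST_CASE/SECTION/REQUIRE style.
#include <catch2/catch_test_macros.hpp>

#include <algorithm>

TEST_CASE("Battery percentage is clamped", "[battery]") {
  int percent = 150; // placeholder input

  SECTION("values above 100 are clamped to 100") {
    percent = std::min(percent, 100);
    REQUIRE(percent == 100);
  }
}
```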

Do you have some thoughts about testing strategy? What should be tested? What shouldn't?

What do you mean by a skeleton project? Could you explain further what you mean and what it should accomplish?

@joelghill
Author

I personally prefer Catch2 as it has much nicer syntax.

I'm very open to suggestions on the framework. Google Test appears to have a nice mocking framework, which would be really useful, but I know close to nothing about Catch2, so I will defer to people with more experience than me.
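
As a rough sketch of what that mocking support (GoogleMock) could give us; the MotorDriver interface below is invented for illustration and is not an actual InfiniTime API:

```cpp
// Illustration only: a hypothetical hardware-facing interface, a GoogleMock
// mock for it, and an expectation that it gets called.
#include <gmock/gmock.h>
#include <gtest/gtest.h>

#include <cstdint>

class MotorDriver { // hypothetical interface, not the real InfiniTime class
public:
  virtual ~MotorDriver() = default;
  virtual void Vibrate(uint8_t durationMs) = 0;
};

class MockMotorDriver : public MotorDriver {
public:
  MOCK_METHOD(void, Vibrate, (uint8_t durationMs), (override));
};

TEST(NotificationTest, VibratesOnceOnNewNotification) {
  MockMotorDriver motor;
  EXPECT_CALL(motor, Vibrate(testing::_)).Times(1);

  // Stand-in for the code under test, which would normally call the driver.
  motor.Vibrate(35);
}
```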

Do you have some thoughts about testing strategy? What should be tested? What shouldn't?

In my mind, and in a perfect world, a project would have >80% coverage, and that coverage would be "useful". Unit testing doesn't help when people add nonsense tests that pass and cover code for the sake of simply improving the coverage metric.

For this project, though, I think the goal of unit testing should be to help developers debug and verify that their code works without having to connect to a device or run code in a simulator. I would not suggest we enforce coverage metrics, but rather put something in place that is quick and easy to use and is a helpful tool, not a time-consuming burden. It would also be helpful to be able to create tests that reproduce bugs: if users report odd behavior, we could debug with a device to identify the troublesome code and then write focused tests that reproduce that edge case and failure state.

As for what specifically should be tested, the BLE controllers, services, and managers all look like good candidates for tests if we are able to mock their dependencies. We could verify how they interpret event payloads, that they properly update whatever internal state they have, and that they make correct calls to other classes/functions. I'm thinking of tackling the Apple Notification Service implementation, and unit testing the implementation of the specification would be really helpful during development.
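
To make that concrete, here is a very rough sketch of the kind of focused test I have in mind; the controller and payload layout are made up for illustration and do not reflect the real BLE code:

```cpp
// Illustration only: feed a hypothetical controller a raw payload and check
// that it updates its internal state as expected.
#include <gtest/gtest.h>

#include <array>
#include <cstdint>

struct HeartRateController { // hypothetical, simplified controller
  void OnNewValue(const std::array<uint8_t, 2>& payload) {
    // Pretend layout: payload[0] = flags, payload[1] = heart rate in bpm.
    bpm = payload[1];
  }
  uint8_t bpm = 0;
};

TEST(HeartRateControllerTest, ParsesBpmFromPayload) {
  HeartRateController controller;
  controller.OnNewValue({0x00, 72});
  EXPECT_EQ(controller.bpm, 72);
}
```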

What do you mean by a skeleton project? Could you explain further what you mean and what it should accomplish?

I'm imagining a mostly empty project under a root tests directory that would mirror the source code layout. For the scope of this issue, I would want to supply enough boilerplate code and documentation to make it easy for other developers to jump in and add tests. I'm open to suggestions here too on how much code and documentation would be appropriate.
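
As a sketch of how little boilerplate the skeleton might need (assuming Google Test; the path is hypothetical, and the framework also ships a ready-made main via gtest_main if we prefer not to write one):

```cpp
// tests/main.cpp (hypothetical path): entry point that discovers and runs
// all registered test cases.
#include <gtest/gtest.h>

int main(int argc, char** argv) {
  ::testing::InitGoogleTest(&argc, argv);
  return RUN_ALL_TESTS();
}
```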

@JF002
Collaborator

JF002 commented May 11, 2024

Thanks for this suggestion, @joelghill! Adding unit tests is definitely a good idea and will help improve the general quality of the code.

However, I think adding unit tests to the current code base won't be easy, because most of the modules and classes depend on hardware-related code: device drivers, NimBLE (the BLE stack), FreeRTOS, and so on.
If you want to write "true" unit tests (tests that depend on nothing but the class you want to test), you'll have to stub/mock most of the low-level code. That's not impossible, since we've already done it for InfiniSim, but it's not easy either.

I'm currently experimenting with a new architecture for the project that splits the code into two parts: the "core", which contains everything that makes InfiniTime (SystemTask, DisplayApp, user applications, settings, ...), and the "ports", which implement the hardware abstraction. This is not ready yet, but it looks promising to me ;-)
With this new structure, the code in the "core" will be very easy to unit test, since it does not depend on any hardware detail; it only depends on "interfaces" (C++ concepts). And we'll be able to run those unit tests anywhere (on the target hardware or on a desktop computer, for example).
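
A very rough sketch of that idea, with names invented for illustration rather than taken from the proof of concept linked below:

```cpp
// Illustration only: a "core" component written against a C++20 concept,
// so a unit test can substitute a trivial fake for the real "port".
#include <concepts>
#include <cstdint>

template <typename T>
concept Backlight = requires(T t, uint8_t level) {
  { t.SetBrightness(level) } -> std::same_as<void>;
};

// Core code depends only on the concept, not on any hardware driver.
template <Backlight TBacklight>
class BrightnessService {
public:
  explicit BrightnessService(TBacklight& backlight) : backlight {backlight} {
  }
  void Increase() {
    backlight.SetBrightness(++level);
  }

private:
  TBacklight& backlight;
  uint8_t level = 0;
};

// In a unit test, a plain fake satisfies the concept with no hardware involved.
struct FakeBacklight {
  void SetBrightness(uint8_t newLevel) {
    last = newLevel;
  }
  uint8_t last = 0;
};
static_assert(Backlight<FakeBacklight>);
```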

You can have a look at this proof of concept here: https://codeberg.org/JF002/InfiniTime-multi-target

@joelghill
Author

Thanks for this suggestion, @joelghill! Adding unit tests is definitely a good idea and will help improve the general quality of the code.

However, I think adding unit tests to the current code base won't be easy, because most of the modules and classes depend on hardware-related code: device drivers, NimBLE (the BLE stack), FreeRTOS, and so on. If you want to write "true" unit tests (tests that depend on nothing but the class you want to test), you'll have to stub/mock most of the low-level code. That's not impossible, since we've already done it for InfiniSim, but it's not easy either.

I'm currently experimenting with a new architecture for the project that splits the code into two parts: the "core", which contains everything that makes InfiniTime (SystemTask, DisplayApp, user applications, settings, ...), and the "ports", which implement the hardware abstraction. This is not ready yet, but it looks promising to me ;-) With this new structure, the code in the "core" will be very easy to unit test, since it does not depend on any hardware detail; it only depends on "interfaces" (C++ concepts). And we'll be able to run those unit tests anywhere (on the target hardware or on a desktop computer, for example).

You can have a look at this proof of concept here: https://codeberg.org/JF002/InfiniTime-multi-target

The new architecture sounds great and I love the idea of moving the source code in that direction!

In your opinion, is it worth starting unit testing now given that there is a major refactor on the way? I lean towards moving ahead with some testing anyway. It would not be perfect, but some tests are better than none and it would give contributors time to form opinions about the testing framework. It's also just a nice development tool to have available.
