Design: Test TinyPilot Debian package install #1659
What tests will we perform to validate the installation?

The installation process broadly consists of four types of action: installing dependencies, copying files into place, modifying existing files, and running external commands. When taken from a known starting point, these actions have defined outcomes and should be straightforward to test. I've addressed each of them below.

Were dependencies installed correctly?

It's reasonable to assume that the package management tools will work correctly. However, if we want to test this action ourselves, we could have a script query the package database and confirm that each dependency is present.

Next step: @mtlynch will decide whether to test this action.

Were files copied correctly?

Many approaches could work for validating that the installer copied files into place correctly, ranging from simple presence checks all the way through to comparing checksums. As our other functional tests give us reasonable confidence in the contents of the files, we should only need a series of basic presence checks. We could implement these by comparing the expected file and folder structures against the output of a series of file-listing commands.

Next step: @cghague will determine which files and folders to compare.

Were existing files modified correctly?

The installer modifies various files, all of which, from a quick review of the code, appear to be text-based. We can test that the installer changed these files correctly by running simple before-and-after comparisons. A robust but blunt approach would be a script that copies the files before installation, runs the installer, and then diffs the copies against the modified files.
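A minimal sketch of that blunt approach, where the file path and package name are placeholders rather than the actual install layout:

```bash
#!/bin/bash
# Sketch of a blunt before/after comparison. The path and package
# name are placeholders, not the real TinyPilot install layout.
set -euo pipefail

# 1. Snapshot a file we expect the installer to modify.
mkdir -p /tmp/pre-install
cp /boot/config.txt /tmp/pre-install/config.txt

# 2. Run the installer.
apt-get install -y ./tinypilot.deb

# 3. Compare. diff exits non-zero when the files differ, which is
#    the outcome we expect after a successful install.
if diff -u /tmp/pre-install/config.txt /boot/config.txt; then
  echo "FAIL: installer did not modify /boot/config.txt" >&2
  exit 1
fi
echo "PASS: /boot/config.txt was modified"
```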
A more focused approach could use pattern matching to validate the changes, but that would require us to design precise patterns to avoid accidental fuzzy matches, matches in the wrong part of a file, and other common pitfalls of pattern-based tests.

Next step: @mtlynch will choose which comparison method to use.

Did external commands have the desired outcome?

The installation process runs numerous external commands, most of which exist mainly to support the installation itself. However, some of them make lasting changes to the system. We should identify those commands and implement suitable tests. One example would be confirming the successful creation of the "tinypilot" user.

Next step: @cghague will identify the commands we should test.

How can we create a Docker container that mimics Raspberry Pi OS?

In an ideal scenario, we'd run these tests on an actual installation of Raspberry Pi OS, but we can't realistically do that within the confines of CircleCI and Docker. The closest we can get will likely be starting with a Debian Docker container and then layering enough of Raspberry Pi OS on top of it to satisfy the installer. The process would look something like this:
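(A rough sketch only; the image URL and extraction steps are placeholders, not a tested procedure.)

```bash
#!/bin/bash
# Outline of bootstrapping a Debian container toward Raspberry Pi OS.
# The URL and extraction details are placeholders; real scripts would
# need to handle partitions, loop devices, etc.
set -euo pipefail

# Inside a stock Debian container:
apt-get update
apt-get install -y wget xz-utils

# Fetch an official Raspberry Pi OS image (placeholder URL).
wget -O /tmp/raspios.img.xz 'https://example.com/raspios-lite.img.xz'
xz --decompress /tmp/raspios.img.xz

# Extract the files the installer depends on (e.g., /boot/config.txt)
# from the image into the container's filesystem, then run the
# TinyPilot installer followed by the test script.
```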
Next step: One of our team will build a basic proof of concept.

@mtlynch - I've put together an outline plan for this with our suggested next steps. Please let me know your thoughts on the ones tagged for you; in the meantime, I'll start on the others.
Yeah, I agree. I don't think this is worth the effort.
I want to avoid checking the presence of every file because that's going to be a brittle test that's likely to break accidentally during normal development (e.g., we rename a script but forget to rename the check). I think it's sufficient to check that one file is present in its expected location. There are other files we place during installation, but a single check should give us the signal we need.
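For example, the check could be as small as this (the path is a placeholder for whichever file we pick):

```bash
# A single, representative presence check. The path is a placeholder
# for whichever file we decide is the best signal of a good install.
if [[ ! -f /opt/tinypilot/main.py ]]; then
  echo "FAIL: expected file missing after install" >&2
  exit 1
fi
```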
Sure, this sounds like a good approach. My concern is that it's going to become brittle and hard to maintain. For example, if we upgrade to a newer release of Raspberry Pi OS, maybe the line numbers change, and now the diff needs to be rewritten. But maybe we can pass flags to diff to make the comparison less sensitive to incidental changes. Either way, I think we can start with this approach and revisit if we find the tests breaking out from under us when we haven't changed anything.
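For instance, GNU diff can ignore lines matching a regex, which would let the comparison tolerate cosmetic differences between OS releases (the pattern here is only illustrative):

```bash
# Skip lines matching the regex (here, comment lines) so cosmetic
# changes between OS releases don't fail the test.
diff --ignore-matching-lines='^#' \
  /tmp/pre-install/config.txt /boot/config.txt
```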
This sounds good.
Let's try this. My concern is that it's going to be too slow to download the Raspbian image every time. We might get 90% as much confidence in 1/100th the time if we just place a couple of dummy files to make Debian look like Raspbian (see the sketch below). Or we could try creating our own Docker image ahead of time that already has the Raspbian files baked in. But let's start with the approach of downloading everything at CI runtime and see what performance looks like.

Next steps

@cghague - Can you create a list of tasks (in this issue, hold off on creating tickets) to lay out the plan of what steps we need to do in what order? We want to sort in descending order of "bang for buck," so our first task should be to do the least amount of work possible to get to an easy test of the install, and then once we have that, we keep adding more tests based on how much it costs us to implement them vs. how likely they are to catch an error.
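The dummy-file variant mentioned above could be as small as this sketch (the marker files are guesses at what the installer expects):

```bash
# Fake just enough of Raspberry Pi OS inside stock Debian for the
# installer to run. The marker files here are guesses.
docker run --rm debian:bullseye bash -c '
  mkdir -p /boot
  touch /boot/config.txt /boot/cmdline.txt
  # ...run the installer and the test script here...
'
```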
Tasks to create a full proof-of-concept

The following tasks will allow us to develop a full proof-of-concept while continually offering something of value should we abort.

Step 1: Create a proof-of-concept test shell script
Rationale: Implementing the test in isolation immediately produces a resource we could use as part of our manual release testing process. A single test should be enough for a proof-of-concept.

Step 2: Implement scripts to bootstrap from Debian into a Raspberry Pi OS image
Rationale: It is likely easier to develop and debug the bootstrap scripts without the complexity of CircleCI and Docker. This approach also makes it more likely that we'll be able to run these test suites locally in the future.

Step 3: Verify that the test works in the bootstrap environment
Rationale: This step allows us to spot any obvious issues with the bootstrap environment before we add CircleCI to the mix.

Step 4: Migrate from a local environment to CircleCI
Rationale: CircleCI allows for the automated testing we want. The work up until this point should prove the validity of the overall approach.

Step 5: Test the proof-of-concept

We should have a working proof-of-concept at this stage, which we should test end to end.
Rationale: This is a good checkpoint for ensuring that the proof-of-concept addresses the testing gap we set out to resolve and that any tests that seem like they might be "tricky" can be implemented with this approach.

Outline of future tasks

Once we have tested the proof-of-concept, we can decide whether to continue developing it or abort the project. Assuming we do continue with it, the next steps would build on the proof-of-concept.
@mtlynch - Does this seem like a good plan of action to you?
@cghague - Cool, this is a good draft! I think this is a good plan, but I'd like to adjust the presentation a bit. Can we orient the tasks around deliverables? When we finalize a plan, we'll want to convert the tasks to a set of GitHub tickets, and we'll want the GitHub tickets to be things we can resolve with a PR. So "create a test" and "verify it works" would be a single step, and we'd check that task off by creating the script and merging it into our main branch.
Can we enumerate specifically what checks we'll perform in this script? I'm thinking something like:
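Something along these lines, perhaps, where every path, user, and pattern is a placeholder to be pinned down during design:

```bash
# Candidate checks; all specifics below are assumptions to refine.
dpkg -s tinypilot > /dev/null            # package registered with dpkg
id tinypilot > /dev/null                 # service user was created
test -d /opt/tinypilot                   # install directory exists
grep -q '^dtoverlay=' /boot/config.txt   # expected boot setting present
```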
Can we also do this as a PR of a markdown file to make it easier to review? See https://github.com/tiny-pilot/tinypilot-pro/pull/1090 for an example.
Just want to clarify that we don't need to do anything special with branches for this work. We'll do PRs from branches like normal, but changes can go directly into our main branch once approved. There should be little risk of affecting prod with these scripts.
I think we should plan for Docker unless we have a strong reason not to. We need it to work under CI, so it needs to work under Docker at that point.
The TinyPilot install process is sufficiently complex that it would be helpful to have tests validating that the installer does what we expect.
We've had a few bugs in the past due to subtle oversights in install logic, and we only discovered them when they bit us later.
Rough idea
- Create a Docker image that mimics Raspberry Pi OS (e.g., includes files like /boot/config.txt that exist in Pi OS)
- Run the install script inside that image
- Check the resulting system state (e.g., the lines we expect in /boot/config.txt are there, paths we want to create exist, users exist that we expect).

Deliverable
This ticket is just to flesh out the design rather than to get into implementation.
The design should cover:
I accidentally duplicated this idea in #1691, so there are additional ideas there.