42 changes: 37 additions & 5 deletions .github/workflows/build.yml
@@ -110,6 +110,7 @@ jobs:
run: CI_ENV=1 CI_MINIMIZE_DISK_USAGE=1 ./ci/ci-tx-sync-tests.sh

coverage:
needs: fuzz
strategy:
fail-fast: false
runs-on: self-hosted
@@ -133,6 +134,11 @@ jobs:
# Maybe if codecov wasn't broken we wouldn't need to do this...
./codecov --verbose upload-process --disable-search --fail-on-error -f target/codecov.json -t "f421b687-4dc2-4387-ac3d-dc3b2528af57" -F 'tests'
cargo clean
- name: Download honggfuzz corpus
uses: actions/download-artifact@v4
with:
name: hfuzz-corpus
path: fuzz/hfuzz_workspace
- name: Run fuzz coverage generation
run: |
./contrib/generate_fuzz_coverage.sh --output-dir `pwd` --output-codecov-json
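This new step pulls the corpus artifact uploaded by the fuzz job (which the coverage job now waits on via needs: fuzz) into fuzz/hfuzz_workspace, so the coverage run can replay it. A rough sketch of the layout generate_fuzz_coverage.sh then looks for, using target names from this repo's fuzz binaries as examples and assuming honggfuzz's usual per-target input/ directories:

    # Assumed layout after actions/download-artifact restores the corpus:
    #   fuzz/hfuzz_workspace/full_stack_target/input/<corpus files>
    #   fuzz/hfuzz_workspace/router_target/input/<corpus files>
    # Quick check that the corpus actually arrived on the runner:
    find fuzz/hfuzz_workspace -type f -path '*/input/*' | wc -l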
@@ -253,21 +259,47 @@ jobs:

fuzz:
runs-on: self-hosted
env:
TOOLCHAIN: 1.75
steps:
- name: Checkout source code
uses: actions/checkout@v4
- name: Install Rust ${{ env.TOOLCHAIN }} toolchain
# For whatever reason, honggfuzz doesn't build on 1.75, and there's not a lot of
# reason to insist on 1.75 for fuzzing, so we just pick an MSRV of 1.80 for fuzz.
- name: Install Rust 1.80 toolchain
run: |
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile=minimal --default-toolchain ${{ env.TOOLCHAIN }}
- name: Sanity check fuzz targets on Rust ${{ env.TOOLCHAIN }}
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile=minimal --default-toolchain 1.80
# This is read-only for PRs. It seeds the fuzzer for a more effective run.
# NOTE: The `key` is unique and will always miss, forcing a fallback to
# the `restore-keys` to find the latest global cache from the `main` branch.
- name: Restore persistent fuzz corpus (PR)
if: ${{ github.ref != 'refs/heads/main' }}
uses: actions/cache/restore@v4
with:
path: fuzz/hfuzz_workspace
key: fuzz-corpus-${{ github.ref }}-${{ github.sha }}
restore-keys: |
fuzz-corpus-refs/heads/main-
# Only on the `main` branch, restores the latest corpus and also saves a
# new, updated one.
- name: Restore/Save persistent honggfuzz corpus (Main)
if: ${{ github.ref == 'refs/heads/main' }}
uses: actions/cache@v4
with:
path: fuzz/hfuzz_workspace
key: fuzz-corpus-refs/heads/main-${{ github.sha }}
Collaborator

Where do we save to fuzz-corpus-refs/heads/main-? This key includes the sha.

restore-keys: |
Collaborator

Presumably when running on main we don't need the restore-keys trick?

fuzz-corpus-refs/heads/main-
- name: Sanity check fuzz targets on Rust 1.80
run: |
cd fuzz
RUSTFLAGS="--cfg=fuzzing --cfg=secp256k1_fuzz --cfg=hashes_fuzz" cargo test --verbose --color always --lib --bins -j8
cargo clean
- name: Run fuzzers
run: cd fuzz && ./ci-fuzz.sh && cd ..
- name: Upload honggfuzz corpus
Collaborator

Rather than only uploading, is there a way to make this directory persistent so that we can keep it between fuzz jobs?

Contributor Author

I'm not sure if we really need to persist the directory here. My understanding is that the fuzz job runs on the latest code changes on every PR, so the generated corpus is tailored to the code changes on that PR. If we persist the corpus from a previous run and use that on a new run, won't that produce incorrect/misleading coverage data?

Collaborator

I don't think the point of the fuzz job is only to generate coverage data, but rather to test the code :). Having a bit more coverage data from fuzzing than we "deserve" is okay, at least now that we split the coverage data out so that codecov shows fuzzing separately, and having a persistent fuzzing corpus means our fuzzing is much more likely to catch issues.

Contributor Author

Right, how long do you think we can have this directory persisted? The upload-artifact action has a retention-days input that can be used to persist the artifact for a while. The default is 90 days but can be adjusted (https://github.com/actions/upload-artifact?tab=readme-ov-file#retention-period).

Collaborator

I believe the simple "upload-artifact" task just stores data for this CI run. What I was thinking is some kind of persistent directory that's shared across jobs so that each CI fuzz task picks up the latest directory, does some fuzzing, finds new test cases, then uploads a new copy with more tests in it.

Contributor Author

> What I was thinking is some kind of persistent directory that's shared across jobs so that each CI fuzz task picks up the latest directory, does some fuzzing, finds new test cases, then uploads a new copy with more tests in it.

Makes sense. I pushed eea2e4b to handle this using GitHub's cache action (https://github.com/actions/cache?tab=readme-ov-file).

uses: actions/upload-artifact@v4
with:
name: hfuzz-corpus
path: fuzz/hfuzz_workspace

linting:
runs-on: ubuntu-latest
23 changes: 20 additions & 3 deletions contrib/generate_fuzz_coverage.sh
@@ -62,9 +62,26 @@ if [ "$OUTPUT_CODECOV_JSON" = "0" ]; then
cargo llvm-cov --html --ignore-filename-regex "fuzz/" --output-dir "$OUTPUT_DIR"
echo "Coverage report generated in $OUTPUT_DIR/html/index.html"
else
cargo llvm-cov -j8 --codecov --ignore-filename-regex "fuzz/" --output-path "$OUTPUT_DIR/fuzz-codecov.json"
echo "Fuzz codecov report available at $OUTPUT_DIR/fuzz-codecov.json"
fi
# Clean previous coverage artifacts to ensure a fresh run.
cargo llvm-cov clean --workspace

# Import honggfuzz corpus if the artifact was downloaded.
if [ -d "hfuzz_workspace" ]; then
echo "Importing corpus from hfuzz_workspace..."
for target_dir in hfuzz_workspace/*; do
[ -d "$target_dir" ] || continue
src_name="$(basename "$target_dir")"
dest="${src_name%_target}"
mkdir -p "test_cases/$dest"
# Copy corpus files into the test_cases directory
find "$target_dir" -maxdepth 2 -type f -path "$target_dir/input/*" \
-print0 | xargs -0 -I{} cp -n {} "test_cases/$dest/"
done
fi

echo "Replaying imported corpus (if found) via tests to generate coverage..."
cargo llvm-cov -j8 --codecov --ignore-filename-regex "fuzz/" \
--output-path "$OUTPUT_DIR/fuzz-codecov.json" --tests

echo "Fuzz codecov report available at $OUTPUT_DIR/fuzz-codecov.json"
fi
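The import loop strips the _target suffix from each honggfuzz workspace directory and copies its input/ files into the matching test_cases/ directory, with cp -n keeping any seeds that are already present. A minimal sketch of the mapping, assuming a target named full_stack_target (one of the binaries ci-fuzz.sh iterates over):

    # Assumed honggfuzz layout, matching the find path above:
    #   hfuzz_workspace/full_stack_target/input/<corpus files>
    src_name="full_stack_target"
    dest="${src_name%_target}"       # -> "full_stack"
    mkdir -p "test_cases/$dest"      # -> "test_cases/full_stack"
    # Each corpus file lands in test_cases/full_stack/, which the fuzz
    # crate's tests replay during the cargo llvm-cov --tests step.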
15 changes: 4 additions & 11 deletions fuzz/ci-fuzz.sh
@@ -26,24 +26,17 @@ cargo install --color always --force honggfuzz --no-default-features
# Because we're fuzzing relatively few iterations, the maximum possible
# compiler optimizations aren't necessary, so we turn off LTO
sed -i 's/lto = true//' Cargo.toml
sed -i 's/codegen-units = 1//' Cargo.toml

export HFUZZ_BUILD_ARGS="--features honggfuzz_fuzz"

cargo --color always hfuzz build -j8
for TARGET in src/bin/*.rs; do
FILENAME=$(basename $TARGET)
FILE="${FILENAME%.*}"
HFUZZ_RUN_ARGS="--exit_upon_crash -v -n8"
if [ "$FILE" = "chanmon_consistency_target" ]; then
HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -F 64 -N1000"
elif [ "$FILE" = "process_network_graph_target" -o "$FILE" = "full_stack_target" -o "$FILE" = "router_target" -o "$FILE" = "lsps_message_target" ]; then
HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -N10000"
elif [ "$FILE" = "indexedmap_target" ]; then
HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -N100000"
elif [ "$FILE" = "fs_store_target" ]; then
HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -F 64 -N10000"
else
HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -N1000000"
HFUZZ_RUN_ARGS="--exit_upon_crash -v -n8 --run_time 30"
if [ "$FILE" = "chanmon_consistency_target" -o "$FILE" = "fs_store_target" ]; then
HFUZZ_RUN_ARGS="$HFUZZ_RUN_ARGS -F 64"
fi
export HFUZZ_RUN_ARGS
cargo --color always hfuzz run $FILE
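With the rewritten branch each target now gets a fixed 30-second wall-clock budget instead of a per-target iteration count, and only the two -F 64 targets keep a small maximum input size. As a sketch, the effective invocation for chanmon_consistency_target would be roughly the following (flag meanings per honggfuzz; the exact expansion is an assumption):

    export HFUZZ_RUN_ARGS="--exit_upon_crash -v -n8 --run_time 30 -F 64"
    #   --exit_upon_crash  stop as soon as a crash is found
    #   -v                 verbose (simple log output)
    #   -n8                use 8 fuzzing threads
    #   --run_time 30      fuzz for 30 seconds of wall-clock time
    #   -F 64              cap fuzz input files at 64 bytes
    cargo --color always hfuzz run chanmon_consistency_target

Every other target runs with the same arguments minus -F 64, so the run_time value is the main knob on total CI fuzzing time.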