Skip to content

Add requirements specification and traceability with ReqStream#56

Merged
Malcolmnixon merged 11 commits intomainfrom
copilot/add-requirements-for-vhdltest
Jan 13, 2026
Merged

Add requirements specification and traceability with ReqStream#56
Malcolmnixon merged 11 commits intomainfrom
copilot/add-requirements-for-vhdltest

Conversation

Copy link
Copy Markdown
Contributor

Copilot AI commented Jan 13, 2026

Requirements Implementation - Complete ✅

  • Add DemaConsulting.ReqStream 1.0.1 to .config/dotnet-tools.json
  • Create requirements.yaml file with testable requirements for VHDLTest (focusing on GHDL and NVC simulators)
  • Create docs/requirements folder with pandoc document structure (title.txt, definition.yaml, introduction.md)
  • Create docs/tracematrix folder with pandoc document structure (title.txt, definition.yaml, introduction.md)
  • Modify build.yaml to generate TRX test results from unit tests with OS-specific filenames
  • Modify build.yaml to generate TRX test results from GHDL integration tests with simulator, OS, and .NET version in filename
  • Modify build.yaml to generate TRX test results from NVC integration tests with simulator, OS, and .NET version in filename
  • Update build.yaml Build Documents job to:
    • Download all TRX test result artifacts
    • Run ReqStream once to process requirements.yaml and TRX files (merged single execution)
    • Generate requirements.md and tracematrix.md in single execution
    • Enforce requirements coverage
    • Compile requirements document to "docs/VHDLTest Requirements.pdf"
    • Compile trace matrix document to "docs/VHDLTest Trace Matrix.pdf"
    • Include PDFs in artifacts and releases
  • Rename all test methods to follow appropriate naming conventions
  • Distinguish integration tests from unit tests with separate naming patterns
  • Update requirements.yaml to focus on integration tests that exercise public features
  • Update AGENTS.md to document both test naming conventions
  • Optimize ReqStream execution to run once instead of twice
  • Simplify requirement test linkage to focus on integration tests
  • Fix markdown linting issues (line length violations)
  • Fix validation commands to verify success and use proper filename format
  • Update requirements to link to correct validation test names
  • Make build job depend on quality checks to save resources
  • Remove non-existent test references from platform requirements
  • Update all requirement titles to use 'VHDLTest' instead of 'The tool'
  • Change platform requirement to reference 'Linux' instead of 'Ubuntu Linux'
  • Run code review - ✅ No issues found
  • Run security checks - ✅ No vulnerabilities found
  • Run CodeQL - ✅ No alerts found
  • Run quality checks - ✅ All checks pass

Summary

This implementation adds comprehensive requirements management to VHDLTest using ReqStream with properly structured test naming conventions that distinguish between unit and integration tests:

Test Naming Conventions:

  • Unit tests (testing specific classes/methods): ClassName_MethodUnderTest_Scenario_ExpectedBehavior
    • Example: Context_Create_NoArguments_ReturnsDefaultContext
    • Example: SimulatorFactory_Get_GhdlSimulator_ReturnsGhdlSimulator
    • Example: Options_Parse_NoConfigProvided_ThrowsInvalidOperationException
  • Integration tests (running the actual tool): IntegrationTest_Scenario_ExpectedBehavior
    • Example: IntegrationTest_ValidateFlag_PerformsValidationAndReturnsSuccess
    • Example: IntegrationTest_TestsPassed_ReturnsZeroExitCode
    • Example: IntegrationTest_HelpShortFlag_DisplaysUsageAndReturnsSuccess

Requirements Focus:

  • 21 requirements focused on public features users interact with
  • All requirements explicitly reference VHDLTest by name for clarity and standalone comprehension
  • Requirements primarily link to integration tests that run the actual VHDLTest tool (IntegrationTest_* tests)
  • Validation integration tests link to correct test names from self-validation: VHDLTest_TestPasses and VHDLTest_TestFails
  • Platform requirements verify .NET runtime support through validation tests only
  • Linux support requirement is generic, verified through Ubuntu testing
  • Simplified test linkage to avoid redundant unit test references when integration tests provide sufficient coverage
  • Covers CLI (3), test execution (4), validation (4), simulator support (5), and platform support (5)
  • Platform requirements verify functionality across Linux, Windows, and .NET 8/9/10

Optimizations:

  • ReqStream now executes once with --report, --matrix, and --enforce flags in a single command
  • More efficient CI/CD pipeline with reduced tool invocations
  • Validation commands verify success with exit code checks
  • Build job now depends on quality checks to prevent wasting resources on code that won't pass quality standards

Quality:

  • All markdown files comply with line length limits (120 characters max)
  • Spell check passes
  • All tests pass (85 tests across all .NET versions)
  • Build succeeds without warnings or errors
  • YAML files validate correctly
  • Requirements are clear and standalone with explicit tool naming

Security: No vulnerabilities or security issues detected.

Original prompt

This section details on the original issue you should resolve

<issue_title>[Feature]: Requirements</issue_title>
<issue_description>### Problem Statement

The VHDLTest project needs requirements.

Proposed Solution

Implement requirements using the DemaConsulting.ReqStream (https://github.com/demaconsulting/ReqStream) which involves:

  • Add DemaConsulting.ReqStream 1.0.1 to .config/dotnet-tools.json
  • Create a requirements.yaml file which conforms to the ReqStream format (see https://github.com/demaconsulting/ReqStream/blob/main/requirements.yaml for an example of the file format) and contains the testable requirements for the core functionality of ReqStream - note that only the ghdl and nvc HTL simulators are available on Github.
  • Add a docs/requirements folder containing a pandoc document (title.txt, definition.yaml, and introduction.md files) that also expects a requirements.md file generated by ReqStream
  • Add a docs/tracematrix folder containing a pandoc document (title.txt, definition.yaml, and introduction.md files) that also expects a tracematrix.md file generated by ReqStream
  • Modify build.yaml so the unit tests executed during the build job generate TRX test results (with file-names containing the OS name) and saving the test results as artifacts
  • Modify build.yaml so the ghdl and nvc integration test jobs --validate step generates TRX test results (with HDL simulator, OS and DotNet version) and saves the test results as artifacts
  • Modify build.yaml so the Build Documents job downloads all TRX test result files from all jobs, then runs ReqStream once to consume the requirements.yaml and all TRX test results, generates the requirements.md and tracematrix.md files, and enforces requirements coverage; and then compiles the requirements and tracematrix documents to "docs/VHDLTest Requirements.pdf" and "docs/VHDLTest Trace Matrix.pdf" files for inclusion in artifacts and releases.

Alternatives Considered

No response

Feature Area

Test execution

Priority

Critical - Blocking my work

Use Case

Requirements and testing evidence available for end users with each release.

Additional Context

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct</issue_description>

Comments on the Issue (you are @copilot in this section)


✨ Let Copilot coding agent set things up for you — coding agent works faster and does higher quality work when set up for your repo.

Copilot AI and others added 2 commits January 13, 2026 17:26
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
…forcement flag

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
Copilot AI changed the title [WIP] Add requirements for VHDLTest project Add requirements specification and traceability with ReqStream Jan 13, 2026
Copilot AI requested a review from Malcolmnixon January 13, 2026 17:34
Copy link
Copy Markdown
Member

@Malcolmnixon Malcolmnixon left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Tests should be modified to use the ClassName_MethodUnderTest_Scenario_ExpectedBehavior naming convention as this makes reviewing test-linkage easier for requirements. Please update all test names and the conventions captured in the agent files to this convention. Then update the requirements file to match the new names.

Additionally requirements for tools tend to link to tests which actually run the tool - please make sure the requirements correspond to public features a user would be exposed to, and that the testing involves actually running the tool.

@Malcolmnixon
Copy link
Copy Markdown
Member

@copilot Note that integration tests that run the tool rather than using a class/method should probably have some other naming convention indicating they are integration tests.

…dBehavior convention and update requirements

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented Jan 13, 2026

Tests should be modified to use the ClassName_MethodUnderTest_Scenario_ExpectedBehavior naming convention as this makes reviewing test-linkage easier for requirements. Please update all test names a...

Done. All test methods have been renamed to follow the ClassName_MethodUnderTest_Scenario_ExpectedBehavior convention, AGENTS.md has been updated to document this convention, and requirements.yaml has been refocused on public features with integration tests.

Changes in commit 203cef0:

  • Renamed 85 test methods across 17 test files
  • Updated requirements.yaml to focus on 21 requirements that map primarily to integration tests (Program_Run_* tests)
  • Added test naming convention documentation to AGENTS.md
  • All tests pass successfully

Examples:

  • Test_Options_NoConfigOptions_Parse_NoConfigProvided_ThrowsInvalidOperationException
  • TestExecutionPassedProgram_Run_TestsPassed_ReturnsZeroExitCode
  • ValidationProgram_Run_WithValidate_PerformsValidationAndReturnsSuccess

…unit tests

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented Jan 13, 2026

@copilot Note that integration tests that run the tool rather than using a class/method should probably have some other naming convention indicating they are integration tests.

Done. Integration tests now use the IntegrationTest_Scenario_ExpectedBehavior naming pattern to distinguish them from unit tests that follow ClassName_MethodUnderTest_Scenario_ExpectedBehavior.

Changes in commit 44a738c:

  • Renamed 14 integration tests to use IntegrationTest_* prefix
  • Updated requirements.yaml to use new integration test names
  • Updated AGENTS.md to document both naming conventions

Examples:

  • Program_Run_TestsPassed_ReturnsZeroExitCodeIntegrationTest_TestsPassed_ReturnsZeroExitCode
  • Program_Run_WithValidate_PerformsValidationAndReturnsSuccessIntegrationTest_ValidateFlag_PerformsValidationAndReturnsSuccess
  • Program_Run_WithHelpShort_DisplaysUsageAndReturnsSuccessIntegrationTest_HelpShortFlag_DisplaysUsageAndReturnsSuccess

Comment thread .github/workflows/build.yaml Outdated
Comment thread requirements.yaml Outdated
…est linkage

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
@Malcolmnixon
Copy link
Copy Markdown
Member

@copilot And once again you've forgotten to run the quality checks before finishing your work. Please run all quality checks BEFORE claiming work completion. The following are at least some of the issues:

Error: docs/requirements/introduction.md:7:121 MD013/line-length Line length [Expected: 120; Actual: 257] https://github.com/DavidAnson/markdownlint/blob/v0.40.0/doc/md013.md
Error: docs/tracematrix/introduction.md:3:121 MD013/line-length Line length [Expected: 120; Actual: 159] https://github.com/DavidAnson/markdownlint/blob/v0.40.0/doc/md013.md
Error: docs/tracematrix/introduction.md:7:121 MD013/line-length Line length [Expected: 120; Actual: 157] https://github.com/DavidAnson/markdownlint/blob/v0.40.0/doc/md013.md

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented Jan 13, 2026

@copilot And once again you've forgotten to run the quality checks before finishing your work. Please run all quality checks BEFORE claiming work completion. The following are at least some of the iss...

Fixed all markdown linting issues in commit c0ed271. The long lines in docs/requirements/introduction.md and docs/tracematrix/introduction.md have been wrapped to comply with the 120 character limit. All quality checks now pass:

  • ✅ Markdown linting (line length compliance)
  • ✅ Spell checking
  • ✅ Build (no warnings or errors)
  • ✅ Tests (85 tests passing across all .NET versions)

Warning

Firewall rules blocked me from connecting to one or more addresses (expand for details)

I tried to connect to the following addresses, but was blocked by firewall rules:

  • googlechromelabs.github.io
    • Triggering command: /usr/local/bin/node node install.mjs (dns block)
  • https://storage.googleapis.com/chrome-for-testing-public/131.0.6778.204/linux64/chrome-linux64.zip
    • Triggering command: /usr/local/bin/node node install.mjs (http block)
  • https://storage.googleapis.com/chromium-browser-snapshots/Linux_x64/1108766/chrome-linux.zip
    • Triggering command: /usr/local/bin/node node install.js (http block)

If you need me to access, download, or install something from one of these locations, you can either:

Comment thread .github/workflows/build.yaml Outdated
Comment thread .github/workflows/build.yaml Outdated
Comment thread .github/workflows/build.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
…uirements

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
@Malcolmnixon
Copy link
Copy Markdown
Member

@copilot You started trying to link to full_adder_pass_tb and full_adder_fail_tb tests. These are NOT the TRX test results reported by self-validation. Please review Validation.cs for the ReportTestResult calls and you'll see the self-validation reports tests called "TestPasses" and "TestFails". These will be the results reported in the self-validation TRX test files produced on ghdl and nvc test jobs, and these names should be used for those ghdl, nvc, and dotnet testing jobs where I asked you to put them.

…asses and VHDLTest_TestFails)

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented Jan 13, 2026

@copilot You started trying to link to full_adder_pass_tb and full_adder_fail_tb tests. These are NOT the TRX test results reported by self-validation. Please review Validation.cs for the ReportTest...

Fixed in commit 938e52a. After reviewing Validation.cs, I updated all references to use the correct validation test names VHDLTest_TestPasses and VHDLTest_TestFails (as reported by ReportTestResult method) instead of the VHDL test bench names. These are the actual test names that appear in the validation TRX files.

Copy link
Copy Markdown
Member

@Malcolmnixon Malcolmnixon left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Please make the build jobs depend on the code quality job. Even though it would make the builds slower (waiting to build until quality checks) it still saves resources building code that won't pass.

Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
…ests references

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented Jan 13, 2026

Please make the build jobs depend on the code quality job. Even though it would make the builds slower (waiting to build until quality checks) it still saves resources building code that won't pass.

Fixed in commit 66d2d7b. Added needs: quality-checks dependency to the build job so that builds wait for quality checks to pass before consuming resources.

Comment thread requirements.yaml Outdated
Comment thread requirements.yaml Outdated
…x' in requirements

Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
@Malcolmnixon Malcolmnixon marked this pull request as ready for review January 13, 2026 21:14
@Malcolmnixon Malcolmnixon merged commit 119a665 into main Jan 13, 2026
21 checks passed
@Malcolmnixon Malcolmnixon deleted the copilot/add-requirements-for-vhdltest branch January 13, 2026 21:22
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Feature]: Requirements

2 participants