
AlertLogPkg: Add Affirmation Count to each ID when ReportAlerts is called #45

Closed
SkydiverTricky opened this issue Feb 28, 2020 · 13 comments

Comments

@SkydiverTricky

SkydiverTricky commented Feb 28, 2020

In a testbench with multiple checkers at different levels of hierarchy, it would be nice to see the AffirmCount per ID (if possible). I note you currently cannot fetch the affirmation count per ID, only the total (GetAffirmCount has no arguments), even though affirmations are made per ID (it also appears affirmations are not stored per ID).
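To illustrate the asymmetry, affirmations are already issued against an AlertLogID, but the count can only be read back as one total (a minimal sketch assuming the standard AlertLogPkg calls; CpuPktCheckID, RxData, ExpData, and TotalAffirmations are illustrative names):

  -- affirmations are issued per ID ...
  AffirmIf(CpuPktCheckID, RxData = ExpData, "CPU packet data mismatch") ;
  -- ... but the count can only be read back globally:
  TotalAffirmations := GetAffirmCount ;  -- takes no AlertLogID parameter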

For example, my current testbench has the following hierarchy:

%% Alert Stop Count on ERROR   reached in CPU Packet Checker at 11780 ns 
%% DONE  FAILED  64bit  Total Error(s) = 1  Failures: 0  Errors: 1  Warnings: 0  Affirmations Checked: 126  at 11780 ns
%%    Default                           Failures: 0  Errors: 0  Warnings: 0
%%    OSVVM                             Failures: 0  Errors: 0  Warnings: 0
%%    testbench_tools_pkg               Failures: 0  Errors: 0  Warnings: 0
%%    ip_flow_control_top_tb            Failures: 0  Errors: 1  Warnings: 0
%%      AXI4L Filters and Decap Driver  Failures: 0  Errors: 0  Warnings: 0
%%        AWADDR Driver                 Failures: 0  Errors: 0  Warnings: 0
%%        WDATA Driver                  Failures: 0  Errors: 0  Warnings: 0
%%        BRESP Checker                 Failures: 0  Errors: 0  Warnings: 0
%%        ARADDR Driver                 Failures: 0  Errors: 0  Warnings: 0
%%        RDATA Checker                 Failures: 0  Errors: 0  Warnings: 0
%%      AXI4L Table Driver              Failures: 0  Errors: 0  Warnings: 0
%%        AWADDR Driver                 Failures: 0  Errors: 0  Warnings: 0
%%        WDATA Driver                  Failures: 0  Errors: 0  Warnings: 0
%%        BRESP Checker                 Failures: 0  Errors: 0  Warnings: 0
%%        ARADDR Driver                 Failures: 0  Errors: 0  Warnings: 0
%%        RDATA Checker                 Failures: 0  Errors: 0  Warnings: 0
%%      MAC Packet Driver               Failures: 0  Errors: 0  Warnings: 0
%%      CPU Packet Checker              Failures: 0  Errors: 1  Warnings: 0
%%      Media Packet Checker            Failures: 0  Errors: 0  Warnings: 0
%%      Register Scoreboard             Failures: 0  Errors: 0  Warnings: 0
%%      Flow Table Scoreboard           Failures: 0  Errors: 0  Warnings: 0
%%      Packet Coverage                 Failures: 0  Errors: 0  Warnings: 0
%%      Stream Generator                Failures: 0  Errors: 0  Warnings: 0
%%    ipat_stats_store_inst             Failures: 0  Errors: 0  Warnings: 0
%%      Checking                        Failures: 0  Errors: 0  Warnings: 0
%%    ipat_histograms_inst              Failures: 0  Errors: 0  Warnings: 0
%%      Checking                        Failures: 0  Errors: 0  Warnings: 0

It would be nice to see who was doing all the checking.

Thanks
Richard

@JimLewis
Member

Hi Richard,
Thanks. This reaffirms our direction, as it is high on the planned list of to-do items.

With that said, how would you activate it for a test that passes?

Jim

@SkydiverTricky
Author

I currently just call ReportAlerts at the end of a test. On a failure, ReportAlerts is called from within AlertLogPkg (which links back to the issue I raised about Mirroring of Alerts).

Could you not just print the same table as above, maybe replacing the Failures/Errors/Warnings columns with Affirmations? Maybe you could also set each alert ID to fail on a vacuous pass?
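For reference, the end-of-test pattern is just this (a sketch of typical usage; how the simulation is stopped varies by testbench):

  -- at the end of the test control process
  ReportAlerts ;   -- prints the hierarchical table shown above
  std.env.stop ;   -- end the simulation (VHDL-2008)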

@JimLewis JimLewis self-assigned this May 9, 2020
@JimLewis JimLewis added Agreed Agreed upon solution enhancement labels May 9, 2020
@JimLewis JimLewis changed the title Add Affirmation Count to each ID when ReportAlerts is called AlertLogPkg: Add Affirmation Count to each ID when ReportAlerts is called May 9, 2020
@JimLewis
Member

JimLewis commented May 9, 2020

Current plan is that the affirm counts track calls to AffirmIf, ..., AffirmPassed, and AffirmError only.

This means Log with level PASSED is not counted as an implicit affirmation - note there is an AffirmPassed for that purpose.

Certainly we never count Alert at any level, since it does not imply a self-check was done - Alert is also used for parameter and protocol checking, which are not self-checking.
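To make the proposed counting rule concrete (a sketch; CheckID and the conditions are illustrative):

  AffirmIf(CheckID, Data = Expected, "data check") ;  -- counted as an affirmation
  AffirmPassed(CheckID, "handshake observed") ;       -- counted
  Alert(CheckID, "illegal protocol state", ERROR) ;   -- never counted - not a self-check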

@JimLewis JimLewis added Released in Dev Branch and removed Agreed Agreed upon solution enhancement labels May 13, 2020
@tmeissner

We have a similar feature that prints Affirms in the report table, as @SkydiverTricky requests. But it's fine that such a feature will be available in OSVVM master, so I can remove my additions/changes and use the original OSVVM ones. So @JimLewis, thanks for integrating this. 😃

@JimLewis
Member

What I have currently implemented in the DEV release counts affirmations. That is not exactly the same as a PASSED count. I think a PASSED count may be better than an affirmation count, as it is clearer about what was actually done. My concern is that if an error is disabled (SetAlertEnable(..., ERROR, FALSE)) and disabled errors are ignored, the affirmation count may give a false sense of passing when the test is not.

I could derive the PASSED count, PASSED = AFFIRMATION - AlertCount - DisabledAlertCount (although this assumes no plain Alerts were issued at this level), or I could just count it directly (which is clearer, but would also count log("...", PASSED) - which is OK, but could make the PASSED count exceed the affirmation count).
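As a worked example of the derived form, using the totals from the report above and assuming no plain Alerts and no disabled errors at that level: PASSED = 126 - 1 - 0 = 125, which is the value that appears as Passed: 125 in the sample report below.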

@SkydiverTricky
Author

My main motivation for this also ties in with the vacuous pass failures:
#44

In a system where you have more than one AlertLogID for doing the checks, you would usually assume they would all expect to do some checks. Giving an affirmation count per AlertLogID gives the user a quick visual check that things were happening as expected. If one ID got 0 checks, that should be some sort of flag to the user that something isn't right with the test.

In addition, in most of my testbenches I have a lot of AlertLogIDs, most of which won't actually do any affirmations during the test - they are there simply to throw ERRORs/FAILUREs in certain conditions. While each of the AXI4L checkers above is connected to a scoreboard, I doubt they actually get used all that much - they are already there as part of my AXI verification IP (far more useful in full AXI4).

So along with the affirmation count, it might also be useful to check a minimum number of affirmations expected per ID.
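Something along these lines would do as a user-side guard in the meantime (purely a hypothetical sketch: a per-ID affirmation getter does not exist at this point, so RDataCheckCount here is a count the testbench maintains itself, and MIN_EXPECTED_CHECKS is an illustrative constant):

  -- hypothetical end-of-test check that a checker actually ran
  AlertIf(RDataCheckID, RDataCheckCount < MIN_EXPECTED_CHECKS,
          "RDATA Checker performed fewer checks than expected", ERROR) ;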

@JimLewis
Member

JimLewis commented May 15, 2020

When I add Passed, I am thinking of something like the following. Should have this in testing shortly. WRT #44, I think counts should be based on the number of Passed.

%% Alert Stop Count on ERROR   reached in CPU Packet Checker at 11780 ns 
%% DONE  FAILED  64bit  Total Error(s) = 1  Failures: 0  Errors: 1  Warnings: 0  Passed: 125 Affirmations Checked: 126 at 11780 ns
%%    Default                           Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%    OSVVM                             Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0 
%%    testbench_tools_pkg               Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%    ip_flow_control_top_tb            Failures: 0  Errors: 1  Warnings: 0  Passed: 80  Affirmations: 81
%%      AXI4L Filters and Decap Driver  Failures: 0  Errors: 0  Warnings: 0  Passed: 25  Affirmations: 25
%%        AWADDR Driver                 Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        WDATA Driver                  Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        BRESP Checker                 Failures: 0  Errors: 0  Warnings: 0  Passed: 10  Affirmations: 10
%%        ARADDR Driver                 Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        RDATA Checker                 Failures: 0  Errors: 0  Warnings: 0  Passed: 15  Affirmations: 15
%%      AXI4L Table Driver              Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        AWADDR Driver                 Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        WDATA Driver                  Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        BRESP Checker                 Failures: 0  Errors: 0  Warnings: 0  Passed: 10  Affirmations: 10
%%        ARADDR Driver                 Failures: 0  Errors: 0  Warnings: 0  Passed: 0  Affirmations: 0
%%        RDATA Checker                 Failures: 0  Errors: 0  Warnings: 0  Passed: 10  Affirmations: 10

@JimLewis
Member

Question: Should the word Affirmation be changed to Checks in the above report? Affirmation means something to me as a coder, as it is what we call the checkers; however, in a review, Checks might be clearer.

@tmeissner

Mhm, what does Affirmations mean in this table? The sum of Failures, Errors, Warnings & passes for an Alert / Affirm? So the count of how often an Alert / Affirm was called in a simulation run?

@SkydiverTricky
Author

When I add Passed, I am thinking of something like the following. Should have this in testing shortly. WRT #44, I think counts should be based on the number of Passed.

I think so. If I wanted to avoid a vacuous pass, then 0 passes should result in an error (or maybe a warning or a failure). But consider that a user may allow a warning as a test pass. So 1 warning and 0 passes would still be a pass without being vacuous.

@JimLewis
Member

JimLewis commented May 15, 2020

Affirmation Count = calls to an affirmation (whether it passes or errors) + calls to Log with PASSED.
It is intended to mean the number of checks done. That is why I asked whether Affirmation should instead be Check.

This means if someone either calls an affirmation (pass or fail) or calls Log PASSED, it is counted.

Alternately, calls to Log PASSED could be left uncounted, but this seems like it would give rise to some confusion.
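Concretely, under this definition both of the following would increment the affirmation count (a sketch; CheckID and the condition are illustrative):

  AffirmIf(CheckID, Actual = Expected, "payload check") ;  -- counted, whether it passes or fails
  Log(CheckID, "directed check passed", PASSED) ;          -- also counted under this proposal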

@JimLewis
Member

Small update to the above printing. In the first line, it currently says Affirmations Checked rather than just Affirmations. As I mentioned in my "question", I suspect both should be changed to "Checks".

@JimLewis
Member

Released in 2020.05.
