[doc] Add a sw/device/tests markdown #800

Merged: 1 commit merged into lowRISC:master on Nov 5, 2019
Conversation

@tjaychen commented Nov 2, 2019

No description provided.

@tjaychen (Author) commented Nov 2, 2019

Adding some description to the software tests per @martin-lueker's request.
I'm taking some liberties in making these "self contained" tests at the moment, but the right split is something we probably need to discuss more post-launch.

I'm re-using @sriyerg's testplanner stuff, since it seemed useful. Let me know if this isn't quite the right place to use it and it should be taken out.

Also, it seems a little weird to me that the testplan stuff is hardcoded into build_docs; is this something that might change in the future?

@@ -0,0 +1,32 @@
{
Contributor:

OK to keep this here for now; however, I'm intending for this to be moved to the hw/top_earlgrey/data area.

Author:

Oh, I can move it there, not a problem. Need a better name though... if you're okay with the content, how about we call these standalone_sw tests?

@@ -72,6 +72,7 @@
"hw/ip/i2c/data/i2c_testplan.hjson",
"hw/ip/rv_timer/data/rv_timer_testplan.hjson",
"hw/ip/uart/data/uart_testplan.hjson",
"sw/device/tests/self_contained_testplan.hjson",
Contributor:

I didn't realize these need to be manually added to this script. Is this going to be fixed in future?
@gkelly

Author:

Would it help to auto-gather these files if, long term, we renamed them to hjson.testplan?
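
(As an illustration of the auto-gathering idea, here is a minimal Python sketch. It assumes the existing `*_testplan.hjson` suffix convention rather than the proposed rename; the function name and search roots are hypothetical, not the actual build_docs code.)

```python
# Hypothetical sketch: discover testplan files by glob instead of hardcoding
# them in build_docs. Assumes the "*_testplan.hjson" suffix convention; the
# search roots below are illustrative only.
from pathlib import Path

def gather_testplans(repo_root="."):
    """Return every testplan hjson under hw/ and sw/, sorted for stable output."""
    root = Path(repo_root)
    return sorted(
        str(path)
        for subdir in ("hw", "sw")
        for path in (root / subdir).rglob("*_testplan.hjson")
    )

if __name__ == "__main__":
    for plan in gather_testplans():
        print(plan)
```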

@asb (Member) left a comment:

Added a minor nit. It seemed good from my read-through and I tested the rendering locally, but I'll defer to reviews from others who have had more exposure to the test plan descriptions.

These platforms include UVM DV, Verilator DV, FPGA, and eventually silicon.

To enable this cross-platform testing, each test is self-contained and does not require additional host or backend resources to verify correct behavior.
This suggests that any testing that would involve emulation of host capability, such as an external spi host or an external host to encrypt/decrypt data, is not appropriate for these types of self-contained tests.
Member:

spi => SPI

## Self-Checking Mechanism

Currently, the self-checking mechanism is accomplished through a console message.
When a test passes, it will output 'Pass!'.
Contributor:

Pass! -> PASS!

Contributor:

Actually, 'PASS!\r\n' is the magic string. You'll also want to wrap it in backticks so it renders as code.

Author:

Ah, thanks for the reminder.

@martin-lueker (Contributor) commented:

Thanks for putting this together @tjaychen! One question: does the described behavior accurately reflect the current test behavior, or is this planned behavior? When I loaded these tests via SPI the other day, I did not see a "PASS!" message on the UART. (Though I certainly wouldn't rule out PEBKAC in this case yet.) This is pretty good confirmation of what I should be expecting from these tests. Also, should there be a convention for broadcasting a message even if the test does not pass? Maybe an announcement naming the test.
(I know this is a documentation PR, so maybe this should be raised as a broader issue elsewhere, though regardless I think this test-launch announcement could be useful.)

@tjaychen (Author) replied:


Yes, you should see PASS! on all 3 of these tests right now. If you don't, that either means the binary didn't run... or something else.

Can you actually file an issue with what you are seeing? Then we can take it from there.

@tjaychen (Author) commented Nov 4, 2019

Oh, sorry, I forgot to answer your other question.
The way this works now is that when we run the tests there is a built-in timeout, so if we don't see PASS within that time, the test is considered a failure.

But this only works if you invoke the tests through the pytest construct. There's another readme that clarifies this.
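
(A hedged sketch of how that timeout check might look in pytest; `read_line` and the `console` fixture are hypothetical stand-ins for the project's actual console capture, not its real harness.)

```python
# Hypothetical pytest-style sketch of the mechanism described above: the test
# only passes if the `PASS!\r\n` magic string shows up on the console before
# a built-in timeout expires.
import time

MAGIC = "PASS!\r\n"

def wait_for_pass(read_line, timeout_s=60.0):
    """Poll console output until the magic string appears or time runs out."""
    deadline = time.monotonic() + timeout_s
    captured = ""
    while time.monotonic() < deadline:
        captured += read_line()  # stand-in for UART/console capture
        if MAGIC in captured:
            return True
    return False

def test_self_contained(console):
    # `console` would be a fixture wrapping the target's console (hypothetical).
    assert wait_for_pass(console.readline), "no PASS! seen before timeout"
```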

@tjaychen tjaychen requested a review from gkelly November 4, 2019 17:44
@tjaychen (Author) commented Nov 4, 2019

@gkelly I've updated the PASS comment; please take a look.

@gkelly (Contributor) left a comment:

One tiniest of nits.

The capturing and generation of this message will differ per testing target.
On FPGA / silicon, they will feed directly to a host machine through a physical UART connection, where the host can check the message for correctness.
On Verilator, they will feed to a virtual UART terminal where the host can do the same.
On DV, they will feed to a fixed memory location where the DV backend can efficiently pickup the message without significant parsing overhead.
Contributor:

pickup -> pick up
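
(To make the per-target capture described above concrete, a hedged Python sketch; the serial port, baud rate, and log path are made-up examples, pyserial is assumed for the physical UART case, and the DV fixed-memory path is backend-specific so it is omitted here.)

```python
# Hypothetical sketch of opening a console reader per target, following the
# description above: a physical UART on FPGA/silicon (pyserial assumed) and
# a virtual UART log file on Verilator. All names here are illustrative.
import serial  # pyserial

def open_console(target):
    """Return a zero-argument callable that yields one line of console output."""
    if target in ("fpga", "silicon"):
        port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
        return lambda: port.readline().decode(errors="replace")
    if target == "verilator":
        log = open("uart0.log", "r", errors="replace")
        return log.readline
    raise ValueError(f"unknown target: {target}")
```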

@tjaychen tjaychen merged commit f2d7726 into lowRISC:master Nov 5, 2019
@tjaychen tjaychen deleted the swtests_readmes branch November 5, 2019 06:14