Add tests for mode parsing #793

Merged
merged 8 commits into irssi:master from ailin-nemui:tests on Dec 8, 2017

Conversation

2 participants
@ailin-nemui
Contributor

ailin-nemui commented Nov 26, 2017

Fixes #776

@ailin-nemui

Contributor

ailin-nemui commented Nov 26, 2017

@irssi/developers I'm now testing the "add auto-merge label and ping comment" workflow (see chat for the discussion)

@ailin-nemui

Contributor

ailin-nemui commented Nov 26, 2017

This PR, which was created by @horgh, adds a base test layer to irssi, which will hopefully ease the future addition of tests. See #767 for the relevant discussion. As you can see, there is currently 1 failing test, which might be fixed by #766.

Due to our use of recursive make, we sadly get one test report per directory instead of one big one. But I guess that's fine for now.

@horgh

Member

horgh commented Nov 28, 2017

Yeah, it looks like the first test case that fails is one that would be fixed by the other PR. I think the test case after it would fail too, but it's not being run. The problem is that a ":" in it is not being stripped.

g_test_init(&argc, &argv, NULL);
for (i = 0; i < G_N_ELEMENTS(event_get_param_fixtures); i++) {

@horgh

horgh Nov 28, 2017

Member

I think we want to iterate over the other array here. That's why some of the test cases aren't running.

@ailin-nemui

ailin-nemui Nov 28, 2017

Contributor

omg! good catch

@horgh

Member

horgh commented Nov 28, 2017

This looks reasonable. Do we need the added files under util?

An improvement might be to write out the names of test cases rather than numbers. It would make it a little easier to see which one failed. I realize it probably wasn't easy in my original version either!

I take it you would like to allow failing tests in PRs/master if that shows a bug? I'd usually expect tests to always pass in master or before merging, even if there is a bug. Basically, it would be nice if we could say that master's build is always passing.

Then, for example, with PRs we could say that all tests must pass in order to be merged. Otherwise there seems to be a risk that tests would be failing and we'd assume it's for one reason when it's actually because of a new bug. Take what we have here as an example: the tests are failing, but it's not clear why unless you happen to know about that other bug. And maybe we're right that they would pass if we merged it, but it's not guaranteed.

@ailin-nemui

Contributor

ailin-nemui commented Nov 28, 2017

The utils are not strictly necessary; however, they are the ones referenced in the automake/TAP setup in this Stack Overflow post: https://stackoverflow.com/questions/19958861/how-to-properly-set-up-glib-testing-framework-with-autotools

The main advantage is, as far as I can tell, that they output more detail. Instead of this (plain program, automake tests only):

FAIL: test-irc
============================================================================
Testsuite summary for irc/core
============================================================================
# TOTAL: 1
# PASS:  0
# SKIP:  0
# XFAIL: 0
# FAIL:  1
# XPASS: 0
# ERROR: 0

we get the full output visible:

PASS: test-irc 1 /test/event_get_params/0
PASS: test-irc 2 /test/event_get_params/1
PASS: test-irc 3 /test/event_get_params/2
PASS: test-irc 4 /test/event_get_params/3
PASS: test-irc 5 /test/event_get_params/4
PASS: test-irc 6 /test/event_get_params/5
PASS: test-irc 7 /test/event_get_params/6
PASS: test-irc 8 /test/event_get_params/7
PASS: test-irc 9 /test/event_get_params/8
FAIL: test-irc 10 /test/event_get_params/9
FAIL: test-irc 11 /test/event_get_params/10
FAIL: test-irc 12 /test/event_get_params/11
PASS: test-irc 13 /test/event_get_param/0
PASS: test-irc 14 /test/event_get_param/1
PASS: test-irc 15 /test/event_get_param/2
PASS: test-irc 16 /test/event_get_param/3
PASS: test-irc 17 /test/event_get_param/4
PASS: test-irc 18 /test/event_get_param/5
PASS: test-irc 19 /test/event_get_param/6
PASS: test-irc 20 /test/event_get_param/7
PASS: test-irc 21 /test/event_get_param/8
PASS: test-irc 22 /test/event_get_param/9
ERROR: test-irc - exited with status 1
============================================================================
Testsuite summary for irc/core
============================================================================
# TOTAL: 23
# PASS:  19
# SKIP:  0
# XFAIL: 0
# FAIL:  3
# XPASS: 0
# ERROR: 1
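The automake/TAP wiring that produces the per-case output above, following the Stack Overflow recipe linked earlier, looks roughly like this (file names such as `build-aux/tap-driver.sh` and the `tap-test` wrapper come from that post, not necessarily from irssi's actual tree):

```makefile
# Makefile.am fragment (sketch): run GLib test binaries through
# automake's TAP driver so each test case is reported individually.
check_PROGRAMS = test-irc
TESTS = $(check_PROGRAMS)

# tap-driver.sh parses the TAP stream from each test binary;
# tap-test is a small wrapper that invokes the binary with --tap
# (without it, GLib emits its default, non-TAP output).
LOG_DRIVER = env AM_TAP_AWK='$(AWK)' $(SHELL) \
        $(top_srcdir)/build-aux/tap-driver.sh
LOG_COMPILER = $(top_srcdir)/build-aux/tap-test
```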
@ailin-nemui

Contributor

ailin-nemui commented Nov 28, 2017

unfortunately, as far as I can tell, there is no way to have 2 badges: compile|passing tests|some fail

@ailin-nemui

Contributor

ailin-nemui commented Nov 28, 2017

If there are bugs, it's much better to have failing tests than no tests. However, the build (compile) should not fail on master, and new commits should ideally not make existing (already passing) tests fail. If it is easier to add tests of broken behaviour, we can borrow some ideas from test-driven development, with the anticipation of eventually fixing all tests.

@horgh

Member

horgh commented Nov 28, 2017

Oh yeah, having that output is way better!

That's sort of what I was getting at with the badges idea. But what you say makes sense. We can fix the bugs!

@ailin-nemui

Contributor

ailin-nemui commented Nov 28, 2017

You can sort of spot it in the Travis detail view: the four green entries are successful compiles, the red one is the tests. It would be nice to be able to differentiate them, but I don't know how.

@ailin-nemui

Contributor

ailin-nemui commented Nov 29, 2017

I found one tunable, allow_failures, in Travis. I'm not sure about it; it will now basically ignore the test result and mark our build as green, as before.
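For reference, allow_failures is keyed on attributes of build-matrix entries; a sketch of how it would look (the environment variable name here is an assumption for illustration, not irssi's actual config):

```yaml
# .travis.yml fragment (sketch): jobs matching an allow_failures
# entry may fail without turning the overall build red.
matrix:
  include:
    - env: RUN_TESTS=yes
  allow_failures:
    - env: RUN_TESTS=yes
```

This matches the observation above: the build badge stays green even when the allowed job's tests fail.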

@ailin-nemui ailin-nemui merged commit ba6681d into irssi:master Dec 8, 2017

1 check passed

continuous-integration/travis-ci/pr The Travis CI build passed

@ailin-nemui ailin-nemui deleted the ailin-nemui:tests branch Dec 8, 2017
