
Reinstate Travis for master #621

Closed
akrabat opened this Issue Nov 2, 2017 · 10 comments

3 participants
@akrabat
Member

akrabat commented Nov 2, 2017

We need to reinstate the Travis tests for master. At a minimum, it needs to test Python 2.7 with ReportLab 3.4.0.

For this to happen, every test needs to be run and its output PDF manually checked. If it is okay, then the hash needs to be updated using cd tests; python setmd5.py good output/{name-of-test.pdf}.
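The hash check that the test suite relies on can be sketched roughly as follows. This is a minimal illustration only, not the actual logic of setmd5.py or the test runner; the function names and the "list of known-good hashes" shape are assumptions for the sake of the example:

```python
import hashlib


def file_md5(path):
    """Compute the MD5 hex digest of a file's contents, read in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def check_output(pdf_path, good_hashes):
    """A test passes when the generated PDF's MD5 matches a known-good hash.

    `good_hashes` stands in for whatever store of verified hashes the
    real suite keeps (hypothetical here).
    """
    return file_md5(pdf_path) in good_hashes
```

Under this scheme, "updating the hash" after a manual check simply means appending the new PDF's digest to the known-good list for that test.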

@akrabat akrabat added the help wanted label Nov 2, 2017

@alexwlchan

Contributor

alexwlchan commented Nov 5, 2017

I had a quick go at getting Travis running against master here, based on some stuff in the dev branch, but I have no idea how complete the test coverage is. This was a quick on-the-train-to-work job: alexwlchan#3

Most of the tests fail with unknown MD5 hashes: https://travis-ci.org/alexwlchan/rst2pdf/builds/297502572

@akrabat

Member

akrabat commented Nov 5, 2017

I couldn't get dev to work with python 2, so that's interesting. I got this far with master: https://github.com/rst2pdf/rst2pdf/compare/master...akrabat:travis?expand=1

@alexwlchan

Contributor

alexwlchan commented Nov 5, 2017

I couldn't get dev to work with python 2, so that's interesting

I didn’t try to get dev running with Python 2, but Travis is running against that branch so I tried to cherry-pick the bits that looked useful from a dev..master diff.

I really don’t know much about the project yet, so this could be totally wrong!

@akrabat

Member

akrabat commented Nov 5, 2017

Very strange. I'll have to look at what's going on with my setup!

As per #618, I think that getting a release out from master is the logical next step.

Maybe dev is stable enough to release from though?

@akrabat

Member

akrabat commented Nov 5, 2017

Ah, your green-travis branch is from master, which explains why it works :)

@akrabat

Member

akrabat commented Nov 5, 2017

In order to minimise overlap of work, I've raised PR #624. The first 15 tests pass and the remainder need their MD5 hashes updated. Please raise PRs against my travis branch to fix things.

@akrabat

Member

akrabat commented Apr 22, 2018

This is the current list of tests that fail:

  • test_fancytitles.txt
  • test_image_units_svg.txt
  • test_issue_201.txt
  • test_issue_216.txt
  • test_issue_238.txt
  • test_issue_247.txt
  • test_issue_266_2.txt
  • test_issue_289.txt
  • test_issue_310.txt
  • test_issue_312_2.txt
  • test_issue_337.txt
  • test_issue_374.txt
  • test_issue_378.txt
  • test_issue_394_png.txt
  • test_issue_419.txt
  • test_issue_467.txt
  • test_issue_478.txt
  • test_lists.txt
  • test_math.txt
  • test_math2.txt
  • test_math_default_role.txt
  • test_style_width.txt
  • test_uml_extension.txt
  • sphinx-brokenlinks
  • sphinx-issue158
  • sphinx-issue166
  • sphinx-issue168
  • sphinx-issue172
  • sphinx-issue183
  • sphinx-issue196
  • sphinx-issue229
  • sphinx-issue251
  • sphinx-issue252
  • sphinx-issue254
  • sphinx-issue257
  • sphinx-issue280
  • sphinx-issue284
  • sphinx-issue285
  • sphinx-issue318
  • sphinx-issue319
  • sphinx-issue320
  • sphinx-issue360
  • sphinx-issue364
  • sphinx-issue367
  • sphinx-issue388
  • sphinx-markup
  • sphinx-multidoc

Each needs to be checked:

  • If the created PDF is valid, then the test needs to be marked as successful
  • If the created PDF is invalid, then the test needs to be marked as ignored, with a comment in the ignore file stating why it has failed. An issue also needs to be raised, so we can get it fixed.
@lornajane

Contributor

lornajane commented Apr 23, 2018

If anyone has time to try this on their own machine, I am using this two-line setup to compare my test outcomes with the ones listed above.
officialfail.txt is the list above, saved in a text file.

From /rst2pdf/tests try this:

nosetests -i regulartest -i sphinxtest 2>&1 | grep "FAIL: " | cut -d' ' -f2 > myfailure.txt
wc -l myfailure.txt
diff officialfail.txt myfailure.txt

This will show how many failed tests you have (there are 47 failures in the official list above) and then how your output differs (if at all).

For me the output looks like this:

$ wc -l myfailure.txt
47 myfailure.txt
$ diff officialfail.txt myfailure.txt 
1d0
< test_fancytitles.txt
9a9
> test_issue_311.txt
17a18
> test_issue_73.txt
23d23
< test_uml_extension.txt

This means that I have two extra passing tests (fancytitles and uml_extension), but also two additional failures (issue_311 and issue_73) that I will try to look at more closely.
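The diff output above can also be computed with Python set operations, which may be easier to read than raw diff markers. This is just a sketch for cross-checking results, not part of the repository:

```python
def compare_failures(official, mine):
    """Compare the official failure list against a local run's failures.

    Returns the tests that pass locally but are on the official list,
    and the tests that fail locally but are not on the official list.
    """
    official, mine = set(official), set(mine)
    return {
        "extra_passing": sorted(official - mine),  # on official list, not failing locally
        "extra_failing": sorted(mine - official),  # failing locally, not on official list
    }
```

Reading the failure names from officialfail.txt and myfailure.txt into two lists and passing them to this function reproduces the "2 extra passing / 2 extra failing" summary directly.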

@akrabat akrabat removed the help wanted label Apr 30, 2018

@akrabat

Member

akrabat commented Apr 30, 2018

With #656 & #659, Travis now passes!

Issue #660 covers what to do about the tests that are skipped.

@akrabat akrabat closed this Apr 30, 2018
