[WIP] Use updated SSL libraries #160

Closed · wants to merge 7 commits into base: master

Conversation (3 participants)
@graeme-a-stewart
Member

graeme-a-stewart commented Sep 18, 2017

Use some updated SSL libraries, compiled from OpenSSL 1.0.2l, to overcome observed SSL connection errors when validating some HTTPS websites.
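A minimal sketch of how a wrapper might expose such rebuilt libraries, assuming a hypothetical library directory and a Bundler-managed `htmlproofer` command line (the actual paths and wrapper layout in this repository may differ):

```bash
#!/bin/bash
# Hypothetical wrapper: prepend the rebuilt OpenSSL 1.0.2l shared libraries
# (libssl.so, libcrypto.so) to LD_LIBRARY_PATH so that libcurl, and hence
# html-proofer's link checks, pick them up instead of the system copies.
SSL_LIB_DIR="$(pwd)/.travis-scripts/openssl-1.0.2l/lib"   # assumed location
export LD_LIBRARY_PATH="${SSL_LIB_DIR}${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}"

# Run the link checker against the built site (output directory assumed).
bundle exec htmlproofer ./_site
```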

@graeme-a-stewart


Member

graeme-a-stewart commented Sep 19, 2017

  • Using the new SSL libraries is working.
  • Updating the certificates within the image is also working; however, it requires changing from a container to a VM (`sudo: required`), which makes the CI more resource-heavy and slower.

Despite these changes (which work), there are still strange errors appearing in html-proofer that no longer seem to be related to underlying curl/SSL errors. I put some direct curl calls into the script to check that (e.g., https://travis-ci.org/HEP-SF/hep-sf.github.io/builds/276898283).

So there is still some more work needed to understand those problems.
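For context, a direct check of the kind described above could be as simple as the following; the URL is just an example of a site that was failing, and the exact set probed in the linked build may differ:

```bash
# Hypothetical debugging step: fetch a problematic HTTPS page directly with
# curl, so a failure here would point at the underlying curl/SSL stack
# rather than at html-proofer itself.
if curl --silent --show-error --head --location "https://conda.io/docs/" > /dev/null; then
    echo "direct curl OK"
else
    echo "direct curl FAILED"
fi
```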

@jouvin

jouvin approved these changes Sep 21, 2017

@graeme-a-stewart I have the feeling that your last set of commits should probably be squashed into one or two, as they seem very fine-grained... but this is a minor detail.

@jouvin


Contributor

jouvin commented Sep 21, 2017

@graeme-a-stewart are the errors you are mentioning purely random, or do they always affect the same (limited) set of URLs? If it is always the same URLs (but I'm afraid this is not the case), I'd suggest adding them to the ignore list for the time being and merging what we have so far, as it should already be a huge improvement...

@graeme-a-stewart


Member

graeme-a-stewart commented Sep 21, 2017

Hi @jouvin - it's very consistently the same URLs that fail (e.g., https://conda.io/docs/), but these don't fail when accessed directly with curl, which is puzzling. A tidied-up version of this PR should probably be taken, to at least lock in the problems that are solved. I can prepare that tomorrow.

@jouvin


Contributor

jouvin commented Sep 21, 2017

If it is always the same URLs, I propose that, for the time being, this PR include them in the ignore list so that html-proofer can succeed. After merging this first PR, we can try to follow up on the remaining issues...
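For reference, with the html-proofer command line of that era the ignore list could be supplied roughly as follows; option names vary between html-proofer versions, so treat this as a sketch rather than the exact invocation used here:

```bash
# Hypothetical invocation: skip the handful of URLs that only fail inside
# the Travis environment. The option takes a comma-separated list; the
# second URL is a placeholder.
bundle exec htmlproofer ./_site \
    --url-ignore "https://conda.io/docs/,https://example.org/another-failing-page"
```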

graeme-a-stewart added some commits Sep 12, 2017

  • Do not use url-ignore list anymore: see instructions in the file for how to add the data-proofer-ignore tag to any URLs that need it.
  • Use updated SSL libraries: to overcome connection problems with some sites in Debian Trusty, add the shared libraries from a build of OpenSSL 1.0.2l. These libraries are added into LD_LIBRARY_PATH by the html-proofer wrapper script.
  • Add missing certificates with a script to install them: note that this must run as root, not as the travis user (a sketch of such a step follows this list).
  • Ignore websites that html-proofer cannot validate: these remain mysteries, as they only fail inside the Travis CI environment, yet an underlying `curl` is completely OK.
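A minimal sketch of what the root-run certificate step could look like on an Ubuntu/Debian-based Travis image (the certificate location and file names are assumptions, not the repository's actual script):

```bash
#!/bin/bash
# Hypothetical certificate install step. Adding CA certificates to the
# system store and refreshing the bundle needs root, which on Travis means
# 'sudo: required', i.e. the VM rather than the container infrastructure.
set -e
sudo cp .travis-scripts/extra-certs/*.crt /usr/local/share/ca-certificates/
sudo update-ca-certificates
```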
@graeme-a-stewart


Member

graeme-a-stewart commented Sep 22, 2017

Damn it! So I tidied up this branch, checked everything, then did a final commit to ignore all of the URLs with the mysterious failures (adding the data-proofer-ignore tag). But they all still fail!

Somehow they must be failing internally in html-proofer, which at least does explain why the error was not seen directly with curl. This is so frustrating!

@hegner


Member

hegner commented Sep 22, 2017

At this point in time I would just go for a Docker image in Travis... as it is now, things are not debuggable and every iteration is costly.

@jouvin


Contributor

jouvin commented Sep 22, 2017

@graeme-a-stewart You should use the file .travis-scripts/url-ignore to define the URLs to ignore. I suspect that the way the script sets this option overrides the option you defined. And it is normally much more flexible...
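A sketch of how a wrapper could turn that file into the html-proofer option, assuming one URL (or /regex/) per line; if the real wrapper always builds the option this way, it could take precedence over ignore settings defined elsewhere, which may be the kind of conflict suspected here:

```bash
# Hypothetical: assemble a comma-separated --url-ignore argument from
# .travis-scripts/url-ignore, skipping comment lines.
IGNORE_LIST=$(grep -v '^#' .travis-scripts/url-ignore | paste -sd, -)
bundle exec htmlproofer ./_site --url-ignore "${IGNORE_LIST}"
```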

@graeme-a-stewart


Member

graeme-a-stewart commented Sep 25, 2017

OK, let's try again in #162
