📡 httpsyet 🔑


The web is moving to HTTPS, slowly. In a happy future, we will have secure connections only. Today, though, we still have to deal with HTTP. We are getting better. Thank you, Let's Encrypt.

Now we only need to update all those http:// links on our pages to https://. Not all sites support HTTPS yet, but maybe they will tomorrow. How do we know? That's what httpsyet is for.

httpsyet -slack $SLACK_HOOK https://firstsite.com https://secondsite.biz http://thirdsite.net

This will crawl your sites recursively and, for every http:// link, check whether the URL is also available via HTTPS. A list of all URLs you can update is then sent to Slack.
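The core of that check can be sketched in a few lines of Go: rewrite the scheme and see if the HTTPS variant answers. This is a simplified illustration, not the actual httpsyet implementation; the function names here are made up for the sketch.

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"time"
)

// toHTTPS rewrites an http:// URL to https://. It reports false
// when the input is not a plain-HTTP URL.
func toHTTPS(rawURL string) (string, bool) {
	u, err := url.Parse(rawURL)
	if err != nil || u.Scheme != "http" {
		return "", false
	}
	u.Scheme = "https"
	return u.String(), true
}

// upgradable reports whether the HTTPS variant of an http:// URL
// responds with a non-error status. Network errors count as "not yet".
func upgradable(rawURL string) bool {
	httpsURL, ok := toHTTPS(rawURL)
	if !ok {
		return false
	}
	client := &http.Client{Timeout: 5 * time.Second}
	resp, err := client.Head(httpsURL)
	if err != nil {
		return false
	}
	resp.Body.Close()
	return resp.StatusCode < 400
}

func main() {
	httpsURL, ok := toHTTPS("http://example.com/page")
	fmt.Println(httpsURL, ok)
}
```

The real crawler additionally follows links recursively and stays within the sites you pass on the command line.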

Set this up with your favorite job scheduler (Cron, sleepto, ...) to run once a month.
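For example, a crontab entry could look like this (the schedule is a suggestion, the site URL is a placeholder, and SLACK_HOOK must be defined in the crontab environment):

```
# Run at 08:00 on the first day of every month.
0 8 1 * * httpsyet -slack $SLACK_HOOK https://firstsite.com
```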

Find out more about the implementation.

Install

  • With Go:
go get qvl.io/httpsyet
  • With Homebrew:
brew install qvl/tap/httpsyet

Development

Make sure to run gofmt on your code, then create a Pull Request.

Dependencies

Use dep ensure -update && dep prune to update dependencies.

Releasing

Push a new Git tag and GoReleaser will automatically create a release.

License

MIT