Pages such as http://www.nationalgeographic.com contain links whose href values are invalid URIs.
The method absolutify_url raises URI::InvalidURIError ("bad URI(is not URI?)") when collecting the links for such a page.
We should catch those exceptions and skip adding the offending link to the links array.
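A minimal sketch of the fix described above, assuming a simplified stand-in for the library's actual `absolutify_url` (the method names and signatures here are illustrative, not the project's exact API):

```ruby
require 'uri'

# Resolve an href against a base URL; return nil instead of raising
# when the href is not a valid URI (e.g. contains spaces).
def absolutify_url(base, href)
  URI.join(base, href)
rescue URI::InvalidURIError
  nil
end

# Build the links array, silently skipping hrefs that could not be
# turned into valid absolute URIs.
def extract_links(base, hrefs)
  hrefs.map { |href| absolutify_url(base, href) }.compact
end
```

With this approach, a malformed href such as `"a b c"` is dropped rather than aborting the whole page's link extraction.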
Issue #29 has the fix for this. I'm still getting used to GitHub and how to sync pull requests and issues.
Thanks, closing this now that your patch is merged.
The trick on GitHub is to include "fixes #28" in your commit message.
Does not raise an exception when a link has a weird href value. References