Function to parse URLs #3849

Closed
jetwash opened this issue on Aug 30, 2017 · 1 comment


jetwash commented Aug 30, 2017

I'd like to be able to extract a domain name from a URL so I can tell people where they are going. I'm using a regex now, but obviously it's a bit naïve. It would be nice to have a function in Hugo to do this.

I figure it would probably make more sense to return a map with entries like scheme, host, port, path, query, etc., instead of restricting it to the domain. Basically, what Parse in net/url does.
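
For example, a quick sketch with the stdlib (the field names below come from `url.URL` itself, nothing Hugo-specific; the URL is just illustrative):

```go
package main

import (
	"fmt"
	"net/url"
)

func main() {
	// url.Parse splits a URL into its components; these are the kinds
	// of entries the proposed return value would carry.
	u, err := url.Parse("https://example.com:8080/docs/page?ref=home")
	if err != nil {
		panic(err)
	}
	fmt.Println(u.Scheme)     // https
	fmt.Println(u.Host)       // example.com:8080
	fmt.Println(u.Hostname()) // example.com
	fmt.Println(u.Port())     // 8080
	fmt.Println(u.Path)       // /docs/page
	fmt.Println(u.RawQuery)   // ref=home
}
```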

It would also be very useful if it could include the effective TLD+1 in the return value, using the public suffix list. There is already a Go package that makes this easy.
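
A rough sketch of the eTLD+1 part, assuming the package meant here is golang.org/x/net/publicsuffix (the comment above doesn't name it):

```go
package main

import (
	"fmt"
	"net/url"

	"golang.org/x/net/publicsuffix"
)

func main() {
	u, err := url.Parse("https://blog.example.co.uk/post/1")
	if err != nil {
		panic(err)
	}
	// EffectiveTLDPlusOne returns the registrable domain ("example.co.uk")
	// according to the public suffix list.
	etld1, err := publicsuffix.EffectiveTLDPlusOne(u.Hostname())
	if err != nil {
		panic(err)
	}
	fmt.Println(etld1)
}
```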

bep added the Enhancement label on Aug 31, 2017

bep commented Aug 31, 2017

To be clear: we can add url.Parse from net/url (exposed as urls.Parse), but we're not doing any TLD+1 handling.

moorereason added a commit to moorereason/hugo that referenced this issue Sep 24, 2017

tpl: Add urls.Parse function
Add a urls.Parse template function that front-ends url.Parse from the Go
stdlib.

Fixes #3849
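
The function is essentially a thin wrapper over the stdlib; a hedged sketch of the idea (not the actual Hugo source, names are illustrative):

```go
package urls

import "net/url"

// Namespace groups the url-related template functions.
type Namespace struct{}

// Parse front-ends url.Parse from the Go stdlib, giving templates a
// *url.URL with fields such as Scheme, Host, Path and RawQuery.
func (ns *Namespace) Parse(rawurl string) (*url.URL, error) {
	return url.Parse(rawurl)
}
```

In a template that allows usage along the lines of `{{ (urls.Parse "https://example.com/blog?ref=home").Host }}`.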

bep closed this in #3900 on Sep 24, 2017

bep added a commit that referenced this issue Sep 24, 2017

tpl: Add urls.Parse function
Add a urls.Parse template function that front-ends url.Parse from the Go
stdlib.

Fixes #3849