Webfiles and Metatags: Generate robots.txt
#219
Labels: area:content (Related to Content API and Content Management), enhancement (New feature or request), help wanted (Extra attention is needed)
Comments
Depends on #95
AZholtkevych changed the title from "Core -> Webfiles and Metatags: Generate robots.txt" to "Core -> Webfiles and Metatags: Generate 'robots.txt'" on May 1, 2023
AZholtkevych changed the title from "Core -> Webfiles and Metatags: Generate 'robots.txt'" to "Core -> Webfiles and Metatags: Generate `robots.txt`" on May 1, 2023
Generating the file itself is trivial, but we need to research how to serve it on multiple domains.
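One possible direction for the multi-domain question is to resolve which site owns the request from its host header before generating the file. A minimal sketch in plain Elixir, assuming a hypothetical host-to-site mapping — `RobotsResolver`, `robots_for/3`, and both map arguments are illustrative, not Beacon's API:

```elixir
defmodule RobotsResolver do
  # Hypothetical resolver: given the request host, a host -> site map,
  # and a site -> base URL map, build that site's robots.txt content.
  def robots_for(host, host_to_site, site_urls) do
    case Map.fetch(host_to_site, host) do
      {:ok, site} ->
        url = Map.fetch!(site_urls, site)

        robots = """
        User-agent: *
        Allow: /

        Sitemap: #{url}/sitemap.xml
        """

        {:ok, robots}

      :error ->
        # Unknown domain: let the caller decide (404, fallback, etc.)
        {:error, :unknown_host}
    end
  end
end
```

A route handler would then look up the host from the `conn` and return 404 (or a fallback) for hosts that map to no site.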
@AZholtkevych we can remove the dependency on #95 and add a dependency on #169
leandrocp added the enhancement (New feature or request) and help wanted (Extra attention is needed) labels and removed the proposal (Idea in research phase) label on May 4, 2023
AZholtkevych changed the title from "Core -> Webfiles and Metatags: Generate `robots.txt`" to "Webfiles and Metatags: Generate `robots.txt`" on Oct 23, 2023
Generate robots.txt for sites.

Each site will have its own robots.txt, which must be resolved dynamically by adding a /robots.txt route to beacon/lib/beacon/router.ex (line 79 at 7790eb7).

A request to that route should call Beacon.Lifecycle.generate_robots_txt/1, which will provide a default implementation that should work for most scenarios: generate_robots_txt/1 should receive site as an argument and call Beacon.Config.fetch!(site).endpoint.url() to fetch the current site URL, to be used as the prefix for the sitemap.xml location. That content should then be served as plain text.
Depends on #169
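The default implementation described above could look roughly like the following. This is a minimal sketch, not Beacon's code: `RobotsTxtDefault` is hypothetical, and the `fetch_site_url` function argument stands in for `Beacon.Config.fetch!(site).endpoint.url()`, which is only available inside a running Beacon application:

```elixir
defmodule RobotsTxtDefault do
  # Hypothetical default: build robots.txt for `site`, using the
  # injected `fetch_site_url` function to obtain the site's base URL
  # (stand-in for Beacon.Config.fetch!(site).endpoint.url()).
  def generate_robots_txt(site, fetch_site_url) do
    url = fetch_site_url.(site)

    # Allow everything by default and point crawlers at the sitemap,
    # prefixed with the site's own URL.
    """
    User-agent: *
    Allow: /

    Sitemap: #{url}/sitemap.xml
    """
  end
end
```

Injecting the URL lookup as a function keeps the sketch testable without Beacon; the real lifecycle hook would resolve it from the site's config and serve the result with a `text/plain` content type.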