Commit
autogenerate robots.txt
this automatically generates our robots.txt file, grabbing an updated list of ai scrapers to block each time
nyancrimew committed Nov 10, 2024
1 parent 395fb36 commit 500a00f
Showing 3 changed files with 20 additions and 5 deletions.
4 changes: 4 additions & 0 deletions src/_data/aibots.js
@@ -0,0 +1,4 @@
module.exports = async function () {
  // fetch the community-maintained AI crawler block list at build time
  const response = await fetch("https://raw.githubusercontent.com/ai-robots-txt/ai.robots.txt/refs/heads/main/robots.txt");
  return response.text();
};
16 changes: 16 additions & 0 deletions src/robots.njk
Original file line number Diff line number Diff line change
@@ -0,0 +1,16 @@
---
permalink: /robots.txt
---
# omg haiiiii robots ^-^
# i love robots :3

# AI scrapers
{{ aibots }}

# everyone else
User-agent: *
Allow: /

# sitemaps
Sitemap: {{ site.url }}/sitemap.xml
Sitemap: {{ site.url }}/sitemap-news.xml
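At build time, Eleventy substitutes the fetched block list for `{{ aibots }}` and the site URL for `{{ site.url }}` and writes the result to `/robots.txt`. A minimal sketch of that substitution (not Eleventy itself; the `aibots` and `site.url` values below are hypothetical):

```javascript
// Illustrative sketch of how the robots.njk body is filled in. The block-list
// content and site URL here are hypothetical placeholder values.
function renderRobots(aibots, siteUrl) {
  return [
    "# AI scrapers",
    aibots,
    "",
    "# everyone else",
    "User-agent: *",
    "Allow: /",
    "",
    "# sitemaps",
    `Sitemap: ${siteUrl}/sitemap.xml`,
    `Sitemap: ${siteUrl}/sitemap-news.xml`,
  ].join("\n");
}

const example = renderRobots(
  "User-agent: GPTBot\nDisallow: /", // hypothetical fetched block list
  "https://example.com"              // hypothetical site.url
);
console.log(example);
```

Because the fetched list ships in its own `User-agent:` sections, the trailing `User-agent: *` / `Allow: /` rules still apply to every crawler not named in the block list.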
5 changes: 0 additions & 5 deletions src/static/robots.txt

This file was deleted.
