
Add search index #1777

Merged
merged 6 commits into from
May 17, 2024
Conversation

travisbeckham
Collaborator

Add search index for 2.15 docs

Signed-off-by: Travis Beckham <travis@buoyant.io>
@wmorgan
Member

wmorgan commented May 16, 2024

So this will create a JSON file designed for programmatic search engine consumption, correct? Do we need to mark this as noindex or anything like that so that it doesn't appear in e.g. Google search results?

@travisbeckham
Collaborator Author

> So this will create a JSON file designed for programmatic search engine consumption, correct? Do we need to mark this as noindex or anything like that so that it doesn't appear in e.g. Google search results?

Good point. We could add a robots.txt file, but that might be ignored. Seems like the best way to deal with this is to add a header from the web server. Something like:

location ~* \.(json)$ {
    add_header X-Robots-Tag "noindex";
}

...but I'm a bit fuzzy on how the linkerd docs site actually gets served. Want to just start with a robots file?
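(For reference, the robots.txt approach mentioned above might look like the following. This is only a sketch: the `/searchindex.json` path is hypothetical, since the PR doesn't spell out the index's filename.)

```
# Hypothetical robots.txt entry keeping the search index out of crawler results.
# The actual filename depends on where Hugo emits the index.
User-agent: *
Disallow: /searchindex.json
```

Note that, as mentioned above, robots.txt is advisory and some crawlers ignore it, whereas the `X-Robots-Tag` response header is honored by major search engines.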

@travisbeckham
Collaborator Author

On a related question, we're currently generating these json files for every section of the site. Do you know what they are for? SEO? https://linkerd.io/2.15/index.json

If so, this search index is overwriting them. Let me rework this so the search index gets dumped in the root.
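(For context, Hugo generates per-page JSON like `index.json` when a JSON output format is enabled for a page kind. A sketch of what such a configuration might look like in the site's `config.toml` — the actual linkerd.io setup may differ:)

```
# Hypothetical Hugo config enabling a JSON output alongside HTML.
# With a matching layouts/_default/single.json template, Hugo emits
# an index.json next to each page's index.html.
[outputs]
  home = ["HTML", "RSS", "JSON"]
  page = ["HTML", "JSON"]
```

If that is how the existing per-section JSON files are produced, writing a sitewide search index to the same `index.json` path would collide with them, which is why moving the search index to a distinct location avoids clobbering.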

@wmorgan
Member

wmorgan commented May 16, 2024

I am not sure what those files are... maybe some Hugo artifact? At any rate, there is already a robots.txt so the easiest approach is to just add a line there.

Signed-off-by: Travis Beckham <travis@buoyant.io>
@travisbeckham
Collaborator Author

Those json files are intentionally built, not a Hugo artifact, so I'll avoid clobbering them. I'll assume something reads them.

I've added the search index to robots.txt and moved it to the root.

Signed-off-by: Travis Beckham <travis@buoyant.io>
@travisbeckham
Collaborator Author

OK, I think it's ready for your review again.

I also fixed some deprecated function calls, and bumped the Hugo version.
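(The PR doesn't list the exact deprecated calls. As one hypothetical example of the kind of fix involved: Hugo versions around this era deprecated `.Site.IsServer` in favor of `hugo.IsServer` in templates.)

```
{{/* Before (deprecated in newer Hugo releases): */}}
{{ if .Site.IsServer }}<meta name="robots" content="noindex">{{ end }}

{{/* After: */}}
{{ if hugo.IsServer }}<meta name="robots" content="noindex">{{ end }}
```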

@travisbeckham travisbeckham merged commit 9b214af into main May 17, 2024
7 checks passed
@travisbeckham travisbeckham deleted the travis/searchindex branch May 17, 2024 01:43
karr9 added a commit to karr9/website that referenced this pull request May 20, 2024