
Parallel llms.txt

This repo contains the full context of all public Parallel assets, making our content as accessible to machines as it is to humans. It also serves as a template for serving your own agent-friendly website.

Tools to reduce hallucination using Parallel Context

How it works

This repo is extracted with the extract-from-sitemap CLI, which is powered by the Parallel Extract API. Extraction produces a larger llms.txt accompanied by linked markdown documents, which is then deployed to https://parallel.ai/llms.txt with docs + SDKs + blogs. See llmtext.json for the source configuration.
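The llms.txt produced this way is a plain markdown index whose entries link out to the extracted markdown documents. As a minimal sketch (the parsing approach and sample contents below are assumptions, not part of this repo's tooling), collecting those linked documents is just a matter of scanning for markdown links:

```typescript
// Collect the markdown documents an llms.txt-style index links to.
// Assumes linked docs appear as standard [title](url) markdown links.
function extractLinkedDocs(llmsTxt: string): { title: string; url: string }[] {
  const linkPattern = /\[([^\]]+)\]\((\S+?)\)/g;
  const docs: { title: string; url: string }[] = [];
  for (const match of llmsTxt.matchAll(linkPattern)) {
    const [, title, url] = match;
    // Keep only links that point at markdown documents.
    if (url.endsWith(".md")) docs.push({ title, url });
  }
  return docs;
}

// Illustrative input shaped like an llms.txt index (URLs are hypothetical):
const sample = [
  "# Parallel",
  "- [Extract API](https://parallel.ai/docs/extract.md)",
  "- [Blog](https://parallel.ai/blog/index.md)",
].join("\n");

console.log(extractLinkedDocs(sample));
```

A context tool would then fetch each listed URL to assemble the full corpus.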

Deployment

This repo is deployed as a Cloudflare Worker with static assets. Putting your website's context in a separate public repo is recommended for several reasons:

  • A separate repo can be made open source, so it can be used by third-party context tools like DeepWiki, gitmcp, and uithub. It can also be explored directly on GitHub, which is useful for manual context collection.
  • Keeping it separate lets you open-source the content without exposing your main website's git history, and it won't interfere with the repo where you implement the website itself.

For discovery on your apex domain, you can use rewriter middleware to rewrite agent requests to your subdomain. For example, for Next.js you can use next-agent-rewriter.
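The core of such middleware is a single decision: does this request come from an agent, and if so, which host should serve it? The sketch below illustrates that decision; the user-agent substrings and the `llmtext.` subdomain are assumptions for illustration, not the actual configuration of next-agent-rewriter:

```typescript
// Hypothetical agent hints; real middleware would use a maintained list.
const AGENT_UA_HINTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "curl"];

// Return the rewritten URL for agent traffic, or null for human traffic.
function rewriteTargetFor(url: string, userAgent: string): string | null {
  const isAgent = AGENT_UA_HINTS.some((hint) => userAgent.includes(hint));
  if (!isAgent) return null; // humans keep browsing the apex domain
  const u = new URL(url);
  u.hostname = `llmtext.${u.hostname}`; // hypothetical context subdomain
  return u.toString();
}

// In a Next.js middleware you would call this with request.url and the
// "user-agent" header, and rewrite the response when it returns non-null.
console.log(rewriteTargetFor("https://example.com/docs", "GPTBot/1.0"));
```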

CI/CD

Source content changes over time, so to make this work for your website you also need to automate updating the extracted llmtext repo.

To re-extract periodically, the 'build-and-reset' GitHub Action is used. The current setup re-extracts every 6 hours and resets the git history, so that old content can no longer be accessed from here. To deploy the result each time it changes, we use the Cloudflare Git Integration.
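A workflow like this could be sketched as follows; this is a hypothetical illustration of the "re-extract every 6 hours, then reset history" idea, not the actual 'build-and-reset' action (the job steps and the extract command shown are assumptions):

```yaml
# Hypothetical sketch of a build-and-reset workflow.
name: build-and-reset
on:
  schedule:
    - cron: "0 */6 * * *"   # re-extract every 6 hours
  workflow_dispatch: {}
jobs:
  extract:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npx extract-from-sitemap   # the CLI named above; flags omitted
      - name: Reset git history so old content becomes unreachable
        run: |
          git checkout --orphan fresh
          git add -A
          git -c user.name=bot -c user.email=bot@example.com commit -m "re-extract"
          git push --force origin fresh:main
```

Force-pushing an orphan branch replaces the entire history with a single commit, which is what keeps previously extracted content from being recoverable.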

About

Full context of parallel.ai public content made agent-friendly
