MIRRORED FROM: https://git.sp4ke.com/sp4ke/hugobot
hugobot is an automated content-fetching and aggregation bot for Hugo data-driven websites. It has the following features:
- Use the `feeds` table to register feeds that will periodically be fetched, stored, and exported into the Hugo project.
- Currently handles multiple feed types.
- Define your own feed types by implementing the corresponding handler interface.
- Hugobot automatically fetches new posts from the registered feeds.
- SQLite is used for storage.
- The scheduler can handle any number of tasks and uses LevelDB for caching/resuming jobs.
- Data is automatically exported to the configured Hugo website path.
- It can export posts as Markdown content files or as data files for Hugo's data-driven mode.
- All fields in the exported files can be customized.
- You can define custom output formats by implementing the corresponding export interface.
- You can register custom filters and post-processing steps on exported posts to avoid changing the raw data stored in the db.
- You can force data export using the CLI.
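As a rough sketch of what defining a custom feed type might look like, here is a minimal Go example. The `Handler` interface, the `Post` struct, and their method signatures are assumptions for illustration; the real names and signatures live in the hugobot source.

```go
package main

import "fmt"

// Post is a simplified, hypothetical representation of a fetched item.
type Post struct {
	Title string
	Link  string
}

// Handler is a hypothetical sketch of the interface a custom feed type
// would implement; the actual interface in hugobot may differ.
type Handler interface {
	// Fetch retrieves new posts for the feed at url.
	Fetch(url string) ([]Post, error)
}

// JSONFeed is an illustrative custom feed type.
type JSONFeed struct{}

func (JSONFeed) Fetch(url string) ([]Post, error) {
	// A real implementation would HTTP-GET the url and parse the body.
	return []Post{{Title: "example", Link: url}}, nil
}

func main() {
	var h Handler = JSONFeed{}
	posts, err := h.Fetch("https://example.com/feed.json")
	if err != nil {
		panic(err)
	}
	fmt.Println(len(posts), posts[0].Title)
}
```

A new feed type defined this way could then be registered alongside the built-in ones so the scheduler picks it up.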
hugobot also includes a web server API that can be used with Hugo's data-driven mode to insert and query data in the database. This is still a WIP; you can easily add the missing code on the API side to automate adding and querying data. An example use case is the automated generation of Bitcoin addresses for new articles on bitcointechweekly.com.
- Some commands are available through the CLI (github.com/urfave/cli); you can add your own custom commands.
- See the provided Docker files for deployment.
First time usage
- The database is automatically created the first time you run the program. You can add your feeds directly to the SQLite database using your favorite SQLite GUI or the web GUI provided in the docker-compose file.
- PRs are welcome; the current priority is adding tests.
- Check the TODO section.
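If you prefer to script the feed registration instead of using a GUI, the statement could be built as in this sketch. The `feeds` table and its column names are guesses; inspect the generated schema first.

```go
package main

import "fmt"

// FeedInsert builds an INSERT statement for a feed row. The table and
// column names are hypothetical; check the generated schema before use.
func FeedInsert(name, url string) string {
	return fmt.Sprintf(
		"INSERT INTO feeds (name, url) VALUES ('%s', '%s');",
		name, url,
	)
}

func main() {
	fmt.Println(FeedInsert("ExampleFeed", "https://example.com/index.xml"))
}
```

In real code you would use parameterized queries through `database/sql` rather than string formatting, to avoid SQL injection.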
TODO
- Add tests.
- Handle more feed formats.
- TLS support in the API (not a priority; it can be done with a reverse proxy).