A lightweight REST API written in Go that generates .crawljob files for JDownloader. Drop a download URL, get a crawljob file picked up automatically by JDownloader.
Built to run as a Docker container.
This is my first Go project, done for fun and learning.
A web interface is available at /. It offers a single text field and a one-click action to start the download. The purpose of this project is to write the API, not to create web interfaces.
- You send a `POST /jobs` request with a download URL
- The API validates the URL (scheme, allowed domains)
- A `.crawljob` file is generated and dropped into a watched folder
- JDownloader picks it up and starts the download automatically
Run with Docker:

```bash
docker run -d \
  -p 8080:8080 \
  -e DESTINATION_FOLDER=/mnt/downloads \
  -e CRAWLJOB_FOLDER=/mnt/crawljobs \
  -v /your/download/path:/mnt/downloads \
  -v /your/crawljob/path:/mnt/crawljobs \
  ghcr.io/frostbyte0x/crawljob-api:latest
```

Or run from source:

```bash
git clone https://github.com/FrostByte0x/crawljob-api
cd crawljob-api
go run main.go
```

| Variable | Description | Default |
|---|---|---|
| `DESTINATION_FOLDER` | Download destination folder | `.` (current dir) |
| `CRAWLJOB_FOLDER` | Folder watched by JDownloader | `.` (current dir) |
| `ALLOWED_DOMAINS` | Allowed download domains | `1fichier.com,mega.nz` |
`POST /jobs` submits a download URL.
Request body:

```json
{
  "url": "https://1fichier.com/yourfile"
}
```

Responses:

| Code | Description |
|---|---|
| `201 Created` | Job file successfully created |
| `400 Bad Request` | Invalid URL or malformed body |
| `405 Method Not Allowed` | Only POST is accepted |
Currently restricted to:

- `1fichier.com`
- `mega.nz`

This can be changed via the `ALLOWED_DOMAINS` environment variable. Contact the server owner, or set your own domain list, to extend it.
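The allowlist check might look like the following sketch; `validateURL` is a hypothetical helper, and the exact scheme and `www.` handling in the real `handler/validator.go` may differ:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// validateURL checks the scheme and the domain allowlist, roughly as
// described in the "How it works" section.
func validateURL(raw string, allowedDomains []string) error {
	u, err := url.Parse(raw)
	if err != nil {
		return err
	}
	if u.Scheme != "http" && u.Scheme != "https" {
		return fmt.Errorf("unsupported scheme %q", u.Scheme)
	}
	// Treat www.example.com and example.com as the same domain.
	host := strings.TrimPrefix(u.Hostname(), "www.")
	for _, d := range allowedDomains {
		if host == d {
			return nil
		}
	}
	return fmt.Errorf("domain %q is not allowed", host)
}

func main() {
	allowed := []string{"1fichier.com", "mega.nz"}
	fmt.Println(validateURL("https://1fichier.com/yourfile", allowed)) // nil: accepted
	fmt.Println(validateURL("ftp://mega.nz/file", allowed))            // non-nil: bad scheme
}
```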
```
crawljob-api/
├── main.go              # Server entrypoint
├── handler/
│   ├── job.go           # HTTP handler
│   ├── validator.go     # URL validation
│   └── ui.go            # HTTP handler for / (web interface)
├── model/
│   ├── crawljob.go      # CrawlJob model + file generation
│   └── utils.go         # Helpers
└── Dockerfile
```
MIT