This repository contains the source code of a Telegram bot that talks to an inference server, which in turn fetches the model from the Roboflow server.
- 12 GB of free disk space for Docker images
- Start the Docker inference server on localhost:9001. You can do it with the provided docker-compose.yaml.
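For reference, a minimal docker-compose.yaml service exposing the inference server on localhost:9001 could look like the sketch below; the image name assumes the CPU build of the Roboflow inference server, so check the bundled docker-compose.yaml for the exact image this project uses:

```yaml
# Sketch only -- the service and image names are assumptions.
services:
  inference-server:
    image: roboflow/roboflow-inference-server-cpu
    ports:
      - "9001:9001"   # bot expects the server on localhost:9001
```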
- Set up the environment variables for the Java application:
  - API_KEY: your Roboflow private key from app.roboflow.com/<your_name>/settings/api
- BOT_TOKEN: Telegram bot API token retrieved from BotFather
  - OWNER_ID: your Telegram id, or that of whoever should receive feedback and other admin information from users
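On Linux/macOS you can export the three variables in the shell before starting the containers. All values below are hypothetical placeholders, not real credentials:

```shell
# Placeholder values -- substitute your own.
export API_KEY="your_roboflow_private_key"   # from app.roboflow.com/<your_name>/settings/api
export BOT_TOKEN="123456789:AAE-example"     # issued by @BotFather
export OWNER_ID="123456789"                  # Telegram id that receives admin messages
```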
- You can build the bot's Docker image from the Dockerfile using:
docker build . -t my-bot:latest
- Then replace the
services.telegram-bot.image
value with my-bot:latest
- And run everything with
docker-compose up
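Putting the steps above together, the bot's entry in docker-compose.yaml might look roughly like this; the service name telegram-bot comes from the instructions above, while the environment passthrough layout is an assumption about how the project wires the variables in:

```yaml
# Sketch only -- match the structure of the project's actual docker-compose.yaml.
services:
  telegram-bot:
    image: my-bot:latest   # the tag built in the previous step
    environment:           # pass the host variables through to the container
      - API_KEY
      - BOT_TOKEN
      - OWNER_ID
```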
Send a picture (a so-called "photo") to the bot. In Telegram, a photo is an image sent with compression.
- Tune confidence and iou-threshold in application.yaml
- Change model-id to yours. You can find it on the Model page of your dataset on Roboflow.
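The relevant keys in application.yaml might look roughly like the sketch below; the key names confidence, iou-threshold, and model-id come from the text above, while the nesting and example values are assumptions:

```yaml
# Hypothetical layout -- match the structure of your actual application.yaml.
confidence: 0.5          # minimum detection confidence, 0..1
iou-threshold: 0.5       # IoU threshold for suppressing overlapping boxes, 0..1
model-id: my-dataset/3   # found on the Model page of your Roboflow dataset
```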