A REST API for running YOLO11 inference: object detection, segmentation, pose estimation, oriented bounding boxes, and classification.
This is a datamarkin project and is not affiliated with Ultralytics.
All endpoints require an API token.
```bash
curl -X POST "https://datamarkin.com/create-user-token" \
  -H "Content-Type: application/json" \
  -d '{"email": "your@email.com"}'
```

Response (after email verification):

```json
{"token": "a1b2c3d4e5f6..."}
```

Pass the token in the `X-API-Key` header with every request:

```
X-API-Key: your_token_here
```
| Endpoint | Model | Description |
|---|---|---|
| `POST /ultralytics/yolo11x` | Detection | Object detection with bounding boxes |
| `POST /ultralytics/yolo11x-seg` | Segmentation | Instance segmentation with masks |
| Limit | Value |
|---|---|
| Per minute | 100 requests |
| Per hour | 1,000 requests |
| Per day | 10,000 requests |
| Per month | 100,000 requests |
Rate limits are applied per IP address.
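A client that occasionally hits these limits can recover by retrying after a growing delay. Below is a minimal sketch; the `post_with_backoff` helper and its retry parameters are illustrative, not part of the API — `send` stands in for any callable that performs the request (e.g. a `requests.post` wrapper) and returns an object with a `status_code` attribute.

```python
import time

def post_with_backoff(send, max_retries=3, base_delay=1.0):
    """Call send() and retry on HTTP 429, doubling the wait each time."""
    for attempt in range(max_retries + 1):
        response = send()
        if response.status_code != 429:
            return response
        if attempt < max_retries:
            # Exponential backoff: base_delay, 2x, 4x, ...
            time.sleep(base_delay * (2 ** attempt))
    # Out of retries: return the last 429 response to the caller
    return response
```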
| Requirement | Value |
|---|---|
| Max file size | 5 MB |
| Allowed formats | JPEG, PNG, WebP, BMP |
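Checking these limits locally before uploading saves a wasted request. A minimal sketch — the `validate_image` helper and the extension set are my own mapping of the formats above, not something the API provides:

```python
import os

MAX_BYTES = 5 * 1024 * 1024  # 5 MB limit from the table above
ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".webp", ".bmp"}

def validate_image(path):
    """Return None if the file looks uploadable, else a reason string."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return f"unsupported format: {ext or 'no extension'}"
    if os.path.getsize(path) > MAX_BYTES:
        return "file exceeds 5 MB limit"
    return None
```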
```bash
curl -X POST "https://vision.datamarkin.com/ultralytics/yolo11x" \
  -H "X-API-Key: your_token_here" \
  -F "file=@image.jpg"
```

```python
import requests

url = "https://vision.datamarkin.com/ultralytics/yolo11x"
headers = {"X-API-Key": "your_token_here"}

# Open the file in a context manager so the handle is closed after upload
with open("image.jpg", "rb") as f:
    response = requests.post(url, headers=headers, files={"file": f})

print(response.json())
```

```javascript
const formData = new FormData();
formData.append('file', fileInput.files[0]);

const response = await fetch('https://vision.datamarkin.com/ultralytics/yolo11x', {
  method: 'POST',
  headers: {
    'X-API-Key': 'your_token_here'
  },
  body: formData
});

const result = await response.json();
console.log(result);
```

All endpoints return JSON with detection results in pixelflow format.
| Code | Description |
|---|---|
| 400 | Invalid image format or file extension |
| 401 | Missing or invalid token |
| 413 | File too large (max 5 MB) |
| 429 | Rate limit exceeded |
| 500 | Server error |
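On the client side, these codes can be turned into actionable errors. A sketch under my own naming — the `VisionAPIError` class and `parse_response` helper are illustrative, and `response` stands in for any object with a `status_code` attribute and a `json()` method, such as the return value of `requests.post`:

```python
# Messages taken from the error-code table above
ERROR_MESSAGES = {
    400: "Invalid image format or file extension",
    401: "Missing or invalid token - check the X-API-Key header",
    413: "File too large (max 5 MB)",
    429: "Rate limit exceeded - retry later",
    500: "Server error",
}

class VisionAPIError(Exception):
    def __init__(self, status_code):
        self.status_code = status_code
        detail = ERROR_MESSAGES.get(status_code, "Unexpected error")
        super().__init__(f"HTTP {status_code}: {detail}")

def parse_response(response):
    """Return the decoded JSON body, or raise VisionAPIError on failure."""
    if response.status_code == 200:
        return response.json()
    raise VisionAPIError(response.status_code)
```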
```bash
curl https://vision.datamarkin.com/ultralytics
```

Returns:

```json
{"status": "ok", "message": "YOLO Inference API is running"}
```