This is a demo of Grafana Loki, showcasing how to search logs and derive metrics from them.
Clone the repo:

```bash
git clone git@github.com:noris-network/loki-demo.git
cd loki-demo
```
Prerequisites:
With this setup, Loki, Promtail and the log_gen script run locally, while Prometheus and Grafana run inside Docker containers on the host network.
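Since `make install` builds a binary and the monitoring stack runs under docker-compose, you will most likely need make, Go, Docker and docker-compose installed locally (the Go requirement is an assumption based on the build step). A quick check:

```bash
# Check that the assumed local tooling is available
# (Go is an assumption based on the `make install` build step)
make --version
go version
docker --version
docker-compose --version
```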
Run `make install` to build ts_gen and download Loki and Promtail:

```bash
make install
```
Next, start the services. Each of the commands below should be run in a separate terminal window:

```bash
make run/loki
make run/promtail
make run/docker/up
make run/log_gen
```
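Once everything is running, you can sanity-check the stack with a few requests; this is a sketch assuming reasonably recent Loki, Prometheus and Grafana versions, which expose the following health endpoints:

```bash
# Readiness / health endpoints of the three main services
curl -s http://localhost:3100/ready       # Loki
curl -s http://localhost:9090/-/healthy   # Prometheus
curl -s http://localhost:3000/api/health  # Grafana
```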
If you have an AWS S3 bucket, you can store Loki's chunks in it with:

```bash
make run/loki/s3 ACCESSKEY=<your aws access key> SECRETKEY=<aws secret key> \
  S3ENDPOINT=<s3 endpoint> BUCKETNAME=<bucket name>
```
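For example, an invocation could look like this (all values below are hypothetical placeholders, not real credentials or buckets):

```bash
make run/loki/s3 ACCESSKEY=AKIAIOSFODNN7EXAMPLE \
  SECRETKEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY \
  S3ENDPOINT=s3.eu-central-1.amazonaws.com BUCKETNAME=my-loki-chunks
```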
Access Grafana via http://localhost:3000 and add the following datasources:

| Datasource | Name | URL |
|---|---|---|
| Prometheus | Prometheus | http://localhost:9090 |
| Loki | Loki | http://localhost:3100 |
| Prometheus | Loki as Prometheus | http://localhost:3100/loki |
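If you prefer not to click through the UI, the same data sources can be created via Grafana's HTTP API; this is a sketch assuming the bundled Grafana still uses the default admin:admin credentials:

```bash
# Create the three data sources via the Grafana API (default admin:admin assumed)
for payload in \
  '{"name":"Prometheus","type":"prometheus","access":"proxy","url":"http://localhost:9090"}' \
  '{"name":"Loki","type":"loki","access":"proxy","url":"http://localhost:3100"}' \
  '{"name":"Loki as Prometheus","type":"prometheus","access":"proxy","url":"http://localhost:3100/loki"}'
do
  curl -s -u admin:admin -H 'Content-Type: application/json' \
    -X POST http://localhost:3000/api/datasources -d "$payload"
done
```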
As of Grafana 6.4, the *Loki as Prometheus* datasource is necessary to use aggregation functions like `rate` or `count` over LogQL results.
You can then import the sample dashboard from `dashboards/log_gen.json`.
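The dashboard can also be imported through Grafana's HTTP API; a sketch, assuming `jq` is installed and that the file contains a bare dashboard object rather than an export wrapper:

```bash
# Wrap the dashboard JSON and POST it to Grafana (default admin:admin assumed)
jq '{dashboard: ., overwrite: true}' dashboards/log_gen.json \
  | curl -s -u admin:admin -H 'Content-Type: application/json' \
      -X POST http://localhost:3000/api/dashboards/db -d @-
```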
Going to Explore in the left-hand menu lets you query the logs via LogQL:

```
{job="demo_log", service="api", level="error"} |= "cpu"
```
As of Grafana 6.4, LogQL functions need to be sent against the *Loki as Prometheus* datasource:

```
sum by (handler) (rate({job="demo_log", handler!=""}[5m]))
```
First stop all services that run in the foreground with `CTRL+C`. Then shut down the docker-compose stack:

```bash
make run/docker/down
```
Running `make clean` will remove the log file created by log_gen, all binaries, created Docker volumes and all data created by Loki and Promtail:

```bash
make clean
```