Monique.io extra tools


moniqueio is a set of extra command-line tools supporting Monique.io. It is not required for regular usage, since Monique.io's API accepts a wide range of input formats sent directly. The extra functionality includes:

  • sending reports with system-level data (CPU usage, disk usage etc.), enabling setting up Monique.io as a basic replacement for traditional monitoring systems
  • parsing unit test run results into JSON (which can be directly submitted to Monique.io)
  • helpers for processing log files from cron
  • helpers for running cron jobs and reporting statuses
  • helpers for executing HTTP requests through curl and reporting meta-data like status codes and headers
  • helpers for constructing JSON documents
  • importing/exporting Monique.io's reports from/to JSON files.

For a higher-level overview of how the tools can be used, see the "What to monitor?" user guide.

Installation and basic usage

The tool requires Python 2.7 or 3.4+ and Linux. It's available on PyPI, so the simplest way to install it is with pip (or easy_install):

$ sudo pip install moniqueio

Note that the command will probably install the moniqueio executable into the /usr/local/bin directory. When calling it from a crontab, you can either use the full path /usr/local/bin/moniqueio or ensure the /usr/local/bin directory is in cron's PATH, for example by inserting a line like this at the top of the crontab:

PATH=/usr/local/bin:/usr/bin:/bin

The tool is invoked by supplying a command and possible options:

$ moniqueio <command> <command-options>

To see available commands, run

$ moniqueio --help

To see a command's help, run

$ moniqueio <command> --help

Some commands require an API key, which must be passed via the --api-key option before the command name, for example:

$ moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW sysreports --tag-ip


Command sysreports

Sends reports containing system-level resource usage data: CPU, disk, network, file handles etc.

Sample usage in crontab:

*/15 * * * *    moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW sysreports --tag-ip

The command submits the following reports to your Monique.io account:

  • sys.info - general information about the system: kernel version, distribution name, last system update, uptime
  • sys.cpu - CPU usage since the previous generation of the report
  • sys.memory - detailed data on memory usage
  • sys.disk_free - disk space usage
  • sys.resources - miscellaneous system resources: load average, number of threads/processes, logged-in users
  • sys.network_interfaces - received/transmitted data grouped by network interface
  • sys.sockets - opened and allocated TCP/UDP sockets
  • sys.file_handles - allocated file handles and the maximum limit
  • sys.disk_stats - throughput and latencies of disks

To select a subset of reports to send, use the --reports option, for example:

$ moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW sysreports --reports sys.cpu,sys.memory

To see the generated reports without sending them to Monique.io, append the --dry-run flag.

To add tags, use the --tag-ip option to add the ip:<public-ip> tag (using the autotags functionality) or specify tags manually using the --tags option.
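
For example, both tagging options could appear in a single crontab entry (the API key is the sample one from above, and the tag value webserver is a placeholder; check moniqueio sysreports --help for the exact --tags argument format):

```
*/15 * * * *    moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW sysreports --tag-ip --tags webserver
```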

Command unittest_summarize

Parses unit test output into a JSON document.

Sample usage in crontab:

*/30 * * * *    run_my_unit_tests 2>&1 | moniqueio unittest_summarize | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @- 'https://api.monique.io/reports/my_unittests_summary'

The command parses the output of the following runners:

  • Python's unittest
  • PHP's PHPUnit
  • Ruby's Test::Unit
  • Java's JUnit

Sample output:

[
  ["asserts", 8],
  ["elapsed", 8.835],
  ["errors", 1],
  ["fails", 1],
  ["skipped", 0],
  ["tests", 4],
  ["success", false]
]

The command does not send the output to Monique.io itself, which enables further postprocessing. You can pipe the output to curl (see the crontab example above) to post the data.
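
As a postprocessing sketch, a wrapper script might inspect the summary before posting it. Here the summary is hard-coded to the sample output above instead of running real tests, and the grep pattern assumes the exact ["success", false] formatting shown there:

```shell
#!/bin/sh
# Gate the upload on the "success" field of the unittest_summarize output.
# The summary is pasted from the sample output above.
summary='[["asserts", 8], ["elapsed", 8.835], ["errors", 1], ["fails", 1], ["skipped", 0], ["tests", 4], ["success", false]]'

if printf '%s' "$summary" | grep -q '\["success", true\]'; then
    echo "tests passed - would POST the summary to Monique.io"
else
    echo "tests failed - skipping the upload"
fi
```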

Command newcontent

Prints the new content that was appended to a file since the previous invocation of the command. Useful for processing logs from crontab.

Sample usage in crontab (monitoring the number of POST requests):

*/30 * * * *    moniqueio newcontent /var/log/access.log | grep POST | wc -l | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @- 'https://api.monique.io/reports/post_requests'

By default, the current file offset is stored in a file .newcontent.<filename> placed in the directory of the monitored file (the calling user must have permission to write files into it). Alternatively, the --data-dir option can specify the directory that should store the offset files, for example:

moniqueio --data-dir /var/lib/moniqueio-data newcontent /var/log/access.log

(the directory /var/lib/moniqueio-data must be created manually).
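
The offset mechanism can be illustrated with a short shell sketch (the /tmp file names are hypothetical stand-ins for the monitored file and the .newcontent.<filename> state file described above):

```shell
#!/bin/sh
# Sketch of newcontent-style offset tracking: print only the bytes
# appended since the last run, then record the new end-of-file offset.
log=/tmp/newcontent-demo.log
state=/tmp/.newcontent.newcontent-demo.log
rm -f "$log" "$state"

printf 'line1\n' > "$log"

# First run: no state file yet, so the whole file is "new".
offset=0
[ -f "$state" ] && offset=$(cat "$state")
tail -c +$((offset + 1)) "$log"            # prints: line1
wc -c < "$log" | tr -d ' \t' > "$state"

printf 'line2\n' >> "$log"

# Second run: only the appended bytes are printed.
offset=$(cat "$state")
tail -c +$((offset + 1)) "$log"            # prints: line2
wc -c < "$log" | tr -d ' \t' > "$state"
```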

The --skip-update flag makes the command skip updating the current file offset, so multiple consecutive calls return the same content. This is useful for processing the content with multiple commands, for example:

posts=$(moniqueio newcontent --skip-update /var/log/access.log | grep -c POST)
gets=$(moniqueio newcontent --skip-update /var/log/access.log | grep -c GET)
# no --skip-update for the last invocation
heads=$(moniqueio newcontent /var/log/access.log | grep -c HEAD)
printf 'post %s\nget %s\nhead %s\n' "$posts" "$gets" "$heads" | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @- 'https://api.monique.io/reports/requests'

Multiple file arguments are supported.

Note that in some cases the newcontent command might miss some content: it doesn't track log rotation (the offset is reset to zero when the file size shrinks), and lines appended between a series of --skip-update calls will not be seen by the earlier invocations in the series. For processing log files, this is usually an acceptable drawback.

Command run

Runs a specified command and prints a JSON document summarizing the run. Useful for monitoring the status of a cron job.

Sample output:

$ moniqueio run 'du -sh'
{
    "stdout_sample": "1.2M\t.\n",
    "stderr_sample": "",
    "return_code": 0,
    "elapsed": 0.203
}

Sample usage in crontab:

10 3 * * *    moniqueio run make_backup.sh | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @- 'https://api.monique.io/reports/job.make_backup'

Command curl

Makes an HTTP request through curl and prints a JSON document describing the run. The document will contain the response body, status code, headers, timing data and other meta-data. Useful when not only the content of a response must be monitored (for that the raw curl command is sufficient), but also status codes, headers or timing data.

Sample invocation:

$ moniqueio curl https://httpbin.org/post -XPOST --data 'x=2'
{
  "content": "{\n  \"args\": {}, \n  \"data\": \"\", \n  \"files\": {}, \n  \"form\": {\n    \"x\": \"2\"\n  }, \n  \"headers\": {\n    \"Accept\": \"*/*\", \n    \"Content-Length\": \"3\", \n    \"Content-Type\": \"application/x-www-form-urlencoded\", \n    \"Host\": \"httpbin.org\", \n    \"User-Agent\": \"curl/7.35.0\"\n  }, \n  \"json\": null, \n  \"origin\": \"X.X.X.X\", \n  \"url\": \"https://httpbin.org/post\"\n}\n",
  "content_type": "application/json",
  "headers": {
    "access-control-allow-credentials": "true",
    "access-control-allow-origin": "*",
    "connection": "keep-alive",
    "content-length": "353",
    "content-type": "application/json",
    "date": "Sun, 11 Dec 2016 20:10:39 GMT",
    "server": "nginx"
  },
  "http_code": 200,
  "http_code_success": true,
  "http_connect": 0,
  "local_ip": "",
  "local_port": 26846,
  "num_connects": 1,
  "num_redirects": 0,
  "remote_ip": "",
  "remote_port": 443,
  "size_download": 353,
  "size_header": 220,
  "size_request": 151,
  "size_upload": 3,
  "speed_download": 528.0,
  "speed_upload": 4.0,
  "ssl_verify_result": 0,
  "time_appconnect": 0.518,
  "time_connect": 0.213,
  "time_namelookup": 0.06,
  "time_pretransfer": 0.518,
  "time_redirect": 0.0,
  "time_starttransfer": 0.667,
  "time_total": 0.667,
  "url_effective": "https://httpbin.org/post"
}

Sample usage in crontab:

*/15 * * * *    moniqueio curl -XPOST --data 'key1=val1' https://api.example.com/endpoint | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @- https://api.monique.io/reports/endpoint_post

If you want to construct a single JSON document for multiple HTTP requests (and submit it as a single report instance), you can use the join helper command, for example:

$ moniqueio join "$(moniqueio curl http://example.com/a)" "$(moniqueio curl http://example.com/b)"

JSON helper: command join

Join multiple JSON documents into a JSON array.

The command is useful when you want to construct a single report instance from multiple JSON documents.

Sample usage:

$ moniqueio join '{"a": 1, "b": 2}' '{"a": 3}' '"string"'
[
  {
    "a": 1,
    "b": 2
  },
  {
    "a": 3
  },
  "string"
]

Commands export and exportall

Export report instances to *.json files.

Sample usage:

$ moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW export --from-datetime='1 hour ago' sys.cpu
$ ls sys.cpu

The call creates a directory with the name of the exported report, containing a file <created_utc>__<id>.json for each exported report instance.

The exportall command exports the data of all reports (creating a directory for each of them). The selection of reports can be narrowed by supplying an argument that is interpreted as a prefix of the report names to export.
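
For example, exporting only the reports whose names start with sys. (using the sample API key from above):

```
$ moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW exportall sys.
```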

Command import

Import report instances from directories containing *.json files.

Sample usage:

$ moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW import sys.cpu sys.disk_free

The *.json files can be created by an invocation of the export / exportall command, or written manually (the JSON content should be an object with a "rows" key; other keys are optional).
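
A minimal hand-written instance file might look like this (only the "rows" key is required per the description above; the shape of the individual rows here is purely illustrative):

```json
{
  "rows": [
    {"name": "cpu_percent", "value": 12.5}
  ]
}
```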

Target report names are taken from directory names.