Hive Queries from HipChat

ChatOps is huge these days [1], and DAPLAB is no exception to that trend. Concretely, we're using HipChat during our HackyThursday sessions.

To integrate ChatOps with Data, the idea of the project was to allow anyone to run Hive queries from our HipChat team room.

Demo time :)

HipChat Custom Integration

HipChat allows the creation of so-called slash commands to quickly notify any REST endpoint. See https://blog.hipchat.com/2015/02/11/build-your-own-integration-with-hipchat/ for more details about this integration feature.

In the HipChat admin page, creating a slash command integration is really easy: you define the name of the command, /query in our case, as well as an endpoint that will be called for every /query message.

Integration with hiveqlbot

Message format

The request sent to the REST endpoint contains a JSON payload that looks like this:

{
    "event": "room_message",
    "item": {
        "message": {
            "date": "2016-04-28T20:21:51.866984+00:00",
            "from": {
                "id": 3450167,
                "links": {
                    "self": "https://api.hipchat.com/v2/user/3450167"
                },
                "mention_name": "BenoitPerroud",
                "name": "Benoit Perroud",
                "version": "00000000"
            },
            "id": "2b7b3ba8-d379-4e9c-91f8-bb332decb078",
            "mentions": [],
            "message": "/sql select * from bperroud.chobachoba_raw_logs where year = 2016 and month = 3 LIMIT 10",
            "type": "message"
        },
        "room": {
            "id": 2390200,
            "is_archived": false,
            "links": {
                "participants": "https://api.hipchat.com/v2/room/2390200/participant",
                "self": "https://api.hipchat.com/v2/room/2390200",
                "webhooks": "https://api.hipchat.com/v2/room/2390200/webhook"
            },
            "name": "HackyThursday",
            "privacy": "public",
            "version": "UA25ZO8M"
        }
    },
    "oauth_client_id": "b80f3c24-76ce-442a-9499-33dd895ccebf",
    "webhook_id": 4694751
}
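
Although not part of the original documentation, the following minimal sketch shows how such a payload could be deserialized, assuming Jackson for JSON parsing (the actual bot may use a different library). The WebhookPayload class and its method names are illustrative only:

import java.io.IOException;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// Illustrative helper, not the actual bot code: extracts the fields of the
// webhook payload that the bot needs to answer a slash-command message.
public class WebhookPayload {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    /** Returns the HiveQL statement, i.e. the message with the slash command stripped. */
    public static String extractQuery(String rawJsonBody) throws IOException {
        JsonNode root = MAPPER.readTree(rawJsonBody);
        String message = root.path("item").path("message").path("message").asText();
        // "/sql select * from ... LIMIT 10" -> "select * from ... LIMIT 10"
        return message.replaceFirst("^/\\w+\\s+", "");
    }

    /** Returns the id of the room the message came from, needed to post the answer back. */
    public static long extractRoomId(String rawJsonBody) throws IOException {
        return MAPPER.readTree(rawJsonBody).path("item").path("room").path("id").asLong();
    }
}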

Architecture

The architecture is pretty straightforward (a minimal sketch follows the list):

  1. A REST endpoint which receives the message. We'll use SparkJava for that part.
  2. A service which parses and processes the Hive query. We'll add the BoneCP connection pool for resiliency.
  3. A service which posts the answer back to HipChat. Hipchat-java will be used here.
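
To make the flow more concrete, here is a minimal sketch, not the project's actual code, of how these three pieces could be wired together with SparkJava. The /hiveqlbot path and the HiveQueryService and HipChatNotifier classes are hypothetical placeholders, and the WebhookPayload helper is the one sketched above:

import static spark.Spark.port;
import static spark.Spark.post;

public class HiveQlBotSketch {

    public static void main(String[] args) {
        port(4567); // SparkJava's default port, made explicit here

        // 1. REST endpoint receiving the HipChat slash-command webhook
        post("/hiveqlbot", (request, response) -> {
            String hiveQuery = WebhookPayload.extractQuery(request.body());
            long roomId = WebhookPayload.extractRoomId(request.body());

            // 2. Parse and run the query against Hive (JDBC behind a connection pool)
            String result = new HiveQueryService().execute(hiveQuery);

            // 3. Post the answer back to the originating room through the HipChat API
            new HipChatNotifier().postToRoom(roomId, result);

            // Nothing to return in the HTTP response itself; the answer was sent above.
            return "";
        });
    }
}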

Development Cycle

In order to get HipChat messages to our development environment, we used Ngrok (https://ngrok.com). This service lets you make an internal development port publicly available on the internet.

./ngrok http 4567

Note: port 4567 is the default port for SparkJava.

We need to remap the HipChat integration to the temporary endpoint Ngrok provides.

Ngrok time :)

Try it out!

Build it

mvn clean install

This will generate an RPM; install it with rpm -ivh.

Configure it

Get the API token for your HipChat integration and create an application.conf file in /opt/daplab/hive-ql-bot/config. An example is given here.
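
For illustration only, a minimal application.conf could look like the snippet below. The key names are hypothetical guesses; the authoritative reference is the example mentioned above:

# Hypothetical keys, for illustration only
hipchat {
  # API token of the slash command integration
  apiToken = "REPLACE_ME"
}

hive {
  # JDBC URL of the HiveServer2 instance the bot will query
  jdbcUrl = "jdbc:hive2://hiveserver2.example.com:10000/default"
}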

Launch it

A convenience script is provided to run HiveQLBot.

/opt/daplab/hive-ql-bot/bin/run.sh

Just mind that Hive queries run with the permissions of the user launching the run.sh script, and can therefore read any table this user has access to.

Live demo

Log into our HipChat team room and try to run a query!

The HiveQLBot project is hosted with ❤ by the DAPLAB team.