Awesant

Awesant is a simple log shipper for logstash. It ships logs from multiple inputs to multiple outputs.

Latest releases

Please ignore the release list from https://github.com/bloonix/awesant/releases.

Look at https://download.bloonix.de/sources or https://download.bloonix.de/repos for deb or rpm packages.

Plugins and Options

Global options for each plugin:

https://github.com/bloonix/awesant/blob/master/GlobalOptions.md

Currently supported inputs:

https://github.com/bloonix/awesant/blob/master/InputFileOptions.md
https://github.com/bloonix/awesant/blob/master/InputSocketOptions.md

Currently supported outputs:

https://github.com/bloonix/awesant/blob/master/OutputRedisOptions.md
https://github.com/bloonix/awesant/blob/master/OutputRabbitmqOptions.md
https://github.com/bloonix/awesant/blob/master/OutputScreenOptions.md
https://github.com/bloonix/awesant/blob/master/OutputSocketOptions.md

Configuration syntax:

https://github.com/bloonix/awesant/blob/master/ConfigSyntax.md

Pre-installation

Awesant is written in Perl, so you first need to install some Perl packages to run it on your machine. These are the required modules:

Class::Accessor
Log::Handler
Params::Validate
IO::Socket
IO::Select
Sys::Hostname
Time::HiRes
JSON  (please install the JSON::XS Perl module for better performance)

If you want to transport logs via SSL with the socket output, you need to install IO::Socket::SSL.

If you want to transport logs to RabbitMQ, you need to install Net::RabbitMQ.

You can install the packages with your favorite package manager or from CPAN:

cpan -i Log::Handler
cpan -i Params::Validate
cpan -i ...

Installation

Just download the project and execute

perl Configure.PL
make
make install

Or create an RPM with

VERSION=0.18
git clone https://github.com/bloonix/awesant.git -b v$VERSION awesant-$VERSION
tar -czf awesant-$VERSION.tar.gz awesant-$VERSION
rpmbuild -ta awesant-$VERSION.tar.gz
rpm -i rpmbuild/RPMS/noarch/awesant...

Or create a deb-package with

VERSION=v0.18
git clone https://github.com/bloonix/awesant.git
cd awesant
git checkout $VERSION
dpkg-buildpackage

Install Awesant from a repository

You can install Awesant from the Bloonix repository if you like. Packages are available for Debian-like and CentOS-like distributions.

You can find the repositories here:

https://bloonix.org/de/docs/installation/repositories.html

HowTo

Start and stop Awesant

/etc/init.d/awesant-agent [ start | stop | restart ]

Configuration

The main configuration file of the Awesant agent is

/etc/awesant/agent.conf

The configuration style is very simple. You can define inputs, outputs, a logger and some global configuration parameters.

Inputs are the log files you want to ship. Outputs are the transports you want to use to ship the log files.

Example configuration:

# Print some statistics to the log file.
# Default: no
#benchmark yes

# How often to poll inputs for new events.
# Default: 500 (ms)
#poll 500

# How many lines to request from the inputs by each poll.
# Default: 100 (count)
#lines 100

# How do poll and lines work in combination?
# If 100 lines could be read, the agent immediately tries to read the
# next 100 lines. If fewer than 100 lines could be read, the end of the
# log file has been reached and the agent waits 500 ms before it tries
# to read the next lines.

input {
    file {
        type syslog
        path /var/log/messages
        tags tag1, tag2
        add_field key1, value1, key2, value2
        # add_field {
        #    key1 value1
        #    key2 value2
        # }
    }
}

output {
    redis {
        type syslog
        key syslog
        host 192.168.10.10, 192.168.10.12 # failover
        port 6379
        database 0
        timeout 10
        password foobared
    }
    screen {
        type syslog
        send_to stdout
    }
}

logger {
    file {
        filename /var/log/awesant/agent.log
        maxlevel info
    }
}
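The poll/lines comments in the example above imply a simple upper bound for the sleeping regime, i.e. whenever a poll returns fewer than `lines` lines and the agent waits the full interval:

```shell
# With the defaults (lines=100, poll=500 ms): in the sleeping regime
# the agent reads at most lines * 1000 / poll lines per second.
lines=100
poll=500
echo "$(( lines * 1000 / poll )) lines/s"   # prints "200 lines/s"
```

When a file grows faster than that, polls keep returning full batches and the agent reads continuously without sleeping, so this bound only matters for slowly growing files.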

With this agent configuration your logstash should be configured as follows:

input {
    redis {
        host => "192.168.10.10"
        port => 6379
        data_type => "list"
        db => 0
        key => "syslog"
        type => "redis"
        format => "json_event"
    }
}
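While wiring the two sides together, it can help to temporarily add a stdout output on the logstash side so shipped events are printed immediately (the standard logstash 1.x stdout plugin; remove it again once everything works):

```
output {
    stdout {
        debug => true
    }
}
```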

Extended input and output configuration

  • It is possible to set a comma separated list of types for outputs.
  • It is possible to set wildcards for file inputs.
  • It is possible to set a wildcard as output type.
  • If a wildcard is used then Awesant will watch the glob pattern for new files.
  • If a new file is created then Awesant will tail the new file automatically, so you do not need to restart the service.

For example, if you have different inputs, such as

input {
    file {
        type apache-access-log
        path /var/log/httpd/*access*log
    }
    file {
        type syslog
        path /var/log/messages
    }
}

then you can use one output for multiple inputs:

output {
    redis {
        type apache-access-log, syslog
        key logstash
        host 127.0.0.1
    }
}

In this case the redis output is bound to the input types 'apache-access-log' and 'syslog'.

If you want to match all input types you can use a wildcard as output type:

output {
    redis {
        type *      # match all input types
        key logstash
        host 127.0.0.1
    }
}

When the log events are pushed to the output the type of the input is used for the JSON event. This means that 'type' is set to the type of the input, not of the output.
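For illustration only, such a json_event looks roughly like the following. All values here are made up and the exact field set depends on the Awesant version; the point is that @type carries the input type ('apache-access-log' here), which logstash can then use for filtering:

```
{
    "@timestamp": "2016-08-12T10:15:00+0200",
    "@source_host": "webserver1",
    "@message": "GET /index.html HTTP/1.1 200",
    "@type": "apache-access-log",
    "@tags": [],
    "@fields": {}
}
```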

SSL and authentication support

Do you want to ship your logs via SSL with password authentication? That is not a problem.

The scenario could look like this:

Awesant output socket -> Awesant input socket -> Redis -> Logstash

You can use the script 'awesant-create-cert' to create a cert bundle for Awesant. Run it on the machine where Redis and the Awesant input socket will run.

awesant-create-cert /etc/awesant 4096 10000
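The arguments are presumably the target directory, the RSA key size in bits and the certificate lifetime in days. Below is a self-contained sketch of what such a helper typically does with stock openssl, plus how to inspect the result; the subject name and temp paths are illustrative assumptions, not what awesant-create-cert actually uses:

```shell
# Throwaway demonstration in a temp dir: create a self-signed CA key
# and certificate, then inspect its validity window. The subject name
# is made up; awesant-create-cert's real behavior may differ.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout "$dir/ca.key" -out "$dir/ca.crt" \
    -days 10000 -subj "/CN=example-ca" 2>/dev/null
openssl x509 -in "$dir/ca.crt" -noout -dates
rm -r "$dir"
```

The same `openssl x509 -noout -dates` call can be used to check the real /etc/awesant/certs/ca.crt after running awesant-create-cert.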

You should now copy /etc/awesant/certs/ca.crt to the same path on each log shipping machine you wish to configure.

scp /etc/awesant/certs/ca.crt root@logshippingnode:/etc/awesant/certs/

Awesant output socket

Create the following configuration on the machines with the log files you want to ship.

input {
    file {
        type foologs
        path /var/log/foo/*.log
    }
    file {
        type barlogs
        path /var/log/bar/*.log
    }
}

output {
    socket {
        type foologs, barlogs
        host redis-machine-remote-addr
        port 25801
        auth your-very-very-very-long-password
        ssl_ca_file /etc/awesant/certs/ca.crt
        # This should be on by default, but we set it to be safe.
        ssl_verify_mode SSL_VERIFY_PEER
    }
}

Awesant input socket

Create the following configuration on the machine where the Redis instance is running, or on any host that should forward the logs to Redis.

input {
    socket {
        type logstash
        host redis-machine-local-addr
        port 25801
        auth your-very-very-very-long-password
        ssl_cert_file /etc/awesant/certs/ssl.crt
        ssl_key_file  /etc/awesant/certs/ssl.key
        format json_event
        workers 10
    }
}

output {
    redis {
        type foologs, barlogs
        host 127.0.0.1
        port 6379
        database 0
        key logstash
    }
}

TODO

Rewrite Agent.pm! The methods are too long and the code needs a cleanup.