Merge pull request #2 from throughnothing/json
Adds capability to log to a specific key. This is helpful for indexing logs.
dsog committed Mar 20, 2012
2 parents 8d547d0 + 223c189 commit 467271c
Showing 3 changed files with 58 additions and 6 deletions.
2 changes: 2 additions & 0 deletions Makefile.PL
@@ -22,7 +22,9 @@ my %WriteMakefileArgs = (
   "PREREQ_PM" => {
     "Dancer::Hook" => 0,
     "Dancer::Logger::Abstract" => 0,
+    "DateTime" => 0,
     "JSON" => 0,
+    "Try::Tiny" => 0,
     "base" => 0,
     "strict" => 0,
     "warnings" => 0
16 changes: 16 additions & 0 deletions README.md
@@ -13,6 +13,22 @@ in the output. It does this by queueing everything up and adding an
 `after` hook that calls the `flush` function, which causes the logger
 to output the log line for the current request.
 
+In your `config.yml`, simply put:
+
+    logger: 'consoleAggregator'
+
+to use this log module. Then you can debug like this:
+
+    debug { field1 => "data" };
+    debug to_json({ field2 => "data" });
+    debug "Raw Data";
+
+And this module will log something like this:
+
+    { "field1" : "data", "field2" : "data", "messages" : [ "Raw Data" ] }
+
+All on one line.
+
 # AUTHORS
 
 - Khaled Hussein <khaled.hussein@gmail.com>
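The aggregation the README describes can be sketched as a short standalone snippet. This is a rough sketch, not the module's code: core JSON::PP stands in for JSON, and the hypothetical `aggregate` helper stands in for the logger's `_log`/`flush` pair:

```perl
use strict;
use warnings;
use JSON::PP ();

# One request's worth of debug payloads: hashref keys are merged into a
# single record, bare strings are queued under the "messages" key.
my ( %record, @strings );

sub aggregate {
    my ($payload) = @_;
    if ( ref $payload eq 'HASH' ) {
        $record{$_} = $payload->{$_} for keys %$payload;
    }
    else {
        push @strings, $payload;
    }
}

aggregate( { field1 => "data" } );
aggregate( { field2 => "data" } );
aggregate("Raw Data");

$record{messages} = \@strings;

# canonical() pins key order so the single log line is stable
print JSON::PP->new->canonical->encode( \%record ), "\n";
# prints: {"field1":"data","field2":"data","messages":["Raw Data"]}
```

In the real module the merge happens per request, and `flush` runs from an `after` hook so exactly one line is emitted per request.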
46 changes: 40 additions & 6 deletions lib/Dancer/Logger/ConsoleAggregator.pm
@@ -2,22 +2,40 @@ use strict;
 use warnings;
 package Dancer::Logger::ConsoleAggregator;
 use Dancer::Hook;
-use JSON qw(to_json);
+use DateTime;
+use JSON qw(to_json from_json);
+use Try::Tiny;
 
 use base 'Dancer::Logger::Abstract';
 
 # ABSTRACT: Dancer Console Logger that aggregates each request's logs to 1 line.
 
-my $log_message = [];
+my ($log_message, $strings) = ({}, []);
 
 sub _log {
-    my ($self, $level, $message) = @_;
-    push (@$log_message => {$level => $message});
+    no warnings;
+    no strict;
+    my ($self, $level, $message, $obj) = @_;
+    try {
+        # If it's a stringified Perl object
+        $obj = eval $message;
+    } catch {
+        # If it's stringified JSON
+        $obj = from_json($message);
+    };
+    # If it's just a plain string
+    push( @$strings, $message ) if( !$obj );
+
+    map { $log_message->{$_} = $obj->{$_} } keys %$obj if $obj;
 }
 
 sub flush {
-    print STDERR to_json($log_message) ."\n";
-    $log_message = [];
+    if( @$strings > 0 && scalar( keys %$log_message ) > 0 ){
+        $log_message->{timestamp} = DateTime->now . 'Z';
+        $log_message->{messages} = $strings;
+        print STDERR to_json($log_message) ."\n";
+    }
+    ($log_message, $strings) = ({}, []);
 }
 
 sub init {
@@ -31,6 +49,22 @@ in the output. It does this by queueing everything up and adding an
 C<after> hook that calls the C<flush> function, which causes the logger
 to output the log line for the current request.
 
+In your C<config.yml>, simply put:
+
+    logger: 'consoleAggregator'
+
+to use this log module. Then you can debug like this:
+
+    debug { field1 => "data" };
+    debug to_json({ field2 => "data" });
+    debug "Raw Data";
+
+And this module will log something like this:
+
+    { "field1" : "data", "field2" : "data", "messages" : [ "Raw Data" ] }
+
+All on one line.
+
 =cut
 
 1;
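The fallback chain in `_log` above can be illustrated with a standalone sketch. The `parse_message` helper here is hypothetical, and plain `eval` guards replace Try::Tiny (with core JSON::PP in place of JSON) so the snippet has no non-core dependencies:

```perl
use strict;
use warnings;
use JSON::PP qw(decode_json);

# Mirrors _log's dispatch: try the message as stringified Perl first,
# then as JSON, otherwise treat it as a raw string.
sub parse_message {
    my ($message) = @_;
    my $obj = eval $message;                  # e.g. "{ field1 => 'data' }"
    return $obj if ref $obj eq 'HASH';
    $obj = eval { decode_json($message) };    # e.g. '{"field2":"data"}'
    return $obj if ref $obj eq 'HASH';
    return $message;                          # e.g. "Raw Data"
}
```

A hashref result would be merged into the request's single record; a plain string would end up in the `messages` array.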
