
LogAgent version 2.0.90 not setting logsourse with the -n option #110

Closed
raovaibhav opened this issue Dec 7, 2017 · 10 comments

@raovaibhav

Hi,

I used this command with logagent version 2.0.87 and it works perfectly:

cat /opt/apache_access.txt | /usr/bin/logagent -p -n myapp -f /opt/pattern.yml

But with version 2.0.90, installed on a separate machine, the same command neither sets the logSource nor prints the output in pretty format.

I am not sure whether I am missing something, or whether this behavior has changed?

Commands to install logagent on the RHEL 7.2 machine:
curl --silent --location https://rpm.nodesource.com/setup_6.x | sudo bash -
sudo yum -y install nodejs
sudo npm i -g @sematext/logagent

Regards,
Vaibhav

@otisg
Member

otisg commented Dec 7, 2017

Maybe the input file, apache_access.txt, is different on those 2 machines?
And/or maybe the pattern.yml you used is different?

@raovaibhav
Author

raovaibhav commented Dec 8, 2017

Negative. I copied the same files to both locations and tried:

Machine with 2.0.87:
[root@raova02-S7229 opt]# cat /opt/apache_access_new.txt | /usr/bin/logagent -p -n myapp -f /opt/apache.yml
2017-12-08T08:16:50.998Z pid[52468] init filter: inputFilter
2017-12-08T08:16:51.001Z pid[52468] init filter: outputFilter
2017-12-08T08:16:51.002Z pid[52468] init plugins
2017-12-08T08:16:51.003Z pid[52468] ../lib/plugins/input/stdin
2017-12-08T08:16:51.038Z pid[52468] ../lib/plugins/output/stdout
2017-12-08T08:16:51.151Z pid[52468] Logagent initialized
{
"logSource": "myapp",
"_type": "data",
"clientip": "62.209.128.3",
"remote_id": "-",
"user": "-",
"http_method": "GET",
"request": "/servicedesk/web/jsp/apps/OrderStatus.jsp?orderid=12PPHUA2321",
"http_version": "1.1",
"response_code": 200,
"bytes": 75,
"@timestamp": "2017-10-12T17:00:06.000Z",
"tenant_id": "AXA-USERSTORE",
"logtype": "apache_access",
"file": "myapp",
"timestamp": 1507827606
}

Machine with 2.0.90:
[root@raova02-I193779 logagent]# cat /opt/logagent/apache_access_new.txt | /usr/bin/logagent -p -n myapp -f /opt/logagent/apache.yml
2017-12-08T08:19:22.092Z pid[50983] init filter: inputFilter
2017-12-08T08:19:22.097Z pid[50983] init filter: outputFilter
2017-12-08T08:19:22.100Z pid[50983] init plugins
2017-12-08T08:19:22.101Z pid[50983] ../lib/plugins/input/stdin
2017-12-08T08:19:22.162Z pid[50983] ../lib/plugins/output/stdout
2017-12-08T08:19:22.351Z pid[50983] Logagent initialized
{"@timestamp":"2017-12-08T08:19:22.392Z","message":"0:0:0:0:0:0:0:1 - - [30/Sep/2017:10:47:18 -0400] \"POST /onboarding/products HTTP/1.1\" 204 -","logSource":"unknown","tenant_id":"AAA","file":"unknown","timestamp":1512721162.392,"_type":"data"}
{"@timestamp":"2017-12-08T08:19:22.393Z","message":"127.0.0.1 - - [30/Sep/2017:10:47:23 -0400] \"GET /kron/groups/elasticsearch/jobs/ES-Index-Rollover HTTP/1.1\" 200 17","logSource":"unknown","tenant_id":"AAA","file":"unknown","timestamp":1512721162.393,"_type":"data"}
{"@timestamp":"2017-12-08T08:19:22.394Z","message":"0:0:0:0:0:0:0:1 - - [30/Sep/2017:10:47:24 -0400] \"POST /onboarding/tenants HTTP/1.1\" 204 -","logSource":"unknown","tenant_id":"AAA","file":"unknown","timestamp":1512721162.394,"_type":"data"}
{"@timestamp":"2017-12-08T08:19:22.394Z","message":"0:0:0:0:0:0:0:1 - - [30/Sep/2017:10:47:24 -0400] \"POST /onboarding/doc_type HTTP/1.1\" 204 -","logSource":"unknown","tenant_id":"AAA","file":"unknown","timestamp":1512721162.394,"_type":"data"}
{"@timestamp":"2017-12-08T08:19:22.858Z","message":"0:0:0:0:0:0:0:1 - - [30/Sep/2017:10:47:25 -0400] \"POST /onboarding/doc_type HTTP/1.1\" 204 -","logSource":"unknown","tenant_id":"AAA","file":"unknown","timestamp":1512721162.858,"_type":"data"}

Even if it doesn't parse, shouldn't it at least apply the "-p" and print to stdout in pretty format? Also, the parsing works fine if I run this command on the same machine:

[root@raova02-I193779 logagent]# /usr/bin/logagent --config /opt/logagent/conf/logagent.conf
2017-12-08T08:23:02.066Z pid[51164] add files to plugin list
2017-12-08T08:23:02.072Z pid[51164] init filter: inputFilter
2017-12-08T08:23:02.075Z pid[51164] init filter: outputFilter
2017-12-08T08:23:02.075Z pid[51164] init plugins
2017-12-08T08:23:02.076Z pid[51164] ../lib/plugins/input/stdin
2017-12-08T08:23:02.137Z pid[51164] ../lib/plugins/output/stdout
2017-12-08T08:23:02.159Z pid[51164] ../lib/plugins/output/elasticsearch
2017-12-08T08:23:02.483Z pid[51164] ../lib/plugins/input/files
2017-12-08T08:23:02.621Z pid[51164] using glob pattern: {apache_access_new.txt,/opt/logs/*.log}
2017-12-08T08:23:02.726Z pid[51164] Logagent report: pid[51164] 1023 ms 0 lines parsed. 0 lines/s 0.000 MB/s - empty lines: 0
{"logSource":"apache_access_new.txt","_type":"data","clientip":"0:0:0:0:0:0:0:1","remote_id":"-","user":"-","http_method":"POST","request":"/onboarding/products","http_version":"1.1","response_code":204,"bytes":"-","@timestamp":"2017-09-30T14:47:18.000Z","tenant_id":"AXA-USERSTORE","logtype":"apache_access","file":"apache_access_new.txt","timestamp":1506782838}
{"logSource":"apache_access_new.txt","_type":"data","clientip":"127.0.0.1","remote_id":"-","user":"-","http_method":"GET","request":"/kron/groups/elasticsearch/jobs/ES-Index-Rollover","http_version":"1.1","response_code":200,"bytes":17,"@timestamp":"2017-09-30T14:47:23.000Z","tenant_id":"AXA-USERSTORE","logtype":"apache_access","file":"apache_access_new.txt","timestamp":1506782843}
{"logSource":"apache_access_new.txt","_type":"data","clientip":"0:0:0:0:0:0:0:1","remote_id":"-","user":"-","http_method":"POST","request":"/onboarding/tenants","http_version":"1.1","response_code":204,"bytes":"-","@timestamp":"2017-09-30T14:47:24.000Z","tenant_id":"AXA-USERSTORE","logtype":"apache_access","file":"apache_access_new.txt","timestamp":1506782844}
{"logSource":"apache_access_new.txt","_type":"data","clientip":"0:0:0:0:0:0:0:1","remote_id":"-","user":"-","http_method":"POST","request":"/onboarding/doc_type","http_version":"1.1","response_code":204,"bytes":"-","@timestamp":"2017-09-30T14:47:24.000Z","tenant_id":"AXA-USERSTORE","logtype":"apache_access","file":"apache_access_new.txt","timestamp":1506782844}
{"logSource":"apache_access_new.txt","_type":"data","clientip":"0:0:0:0:0:0:0:1","remote_id":"-","user":"-","http_method":"POST","request":"/onboarding/doc_type","http_version":"1.1","response_code":204,"bytes":"-","@timestamp":"2017-09-30T14:47:25.000Z","tenant_id":"AXA-USERSTORE","logtype":"apache_access","file":"apache_access_new.txt","timestamp":1506782845}

NOTE: here too the stdout is not in "pretty" format, but the patterns file configured in the conf file is able to parse the same log file when we update the monitored Apache file after starting logagent (since it only reads new file changes).

@raovaibhav
Author

OK, I am also unable to send anything to Elasticsearch :(
Can someone tell me how to uninstall this version of Logagent and install version 2.0.87? I'll just check whether that fixes the issue or there is something else.

It'd be great if you can point me to documentation for the above.

@megastef
Contributor

megastef commented Dec 8, 2017

@raovaibhav An updated version of the documentation is here: https://sematext.com/docs/logagent/output-elasticsearch/

Note you have to add module: elasticsearch in the Elasticsearch output module configuration.
We recently made plugin loading more generic. Instead of having a fixed path in the config like output.elasticsearch, you can now have N Elasticsearch output modules with different configurations.
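For illustration, a minimal sketch of the new-style output section, with two Elasticsearch outputs side by side. The key names, URL, and index values below are placeholders; see the documentation linked above for the exact options:

```yaml
output:
  # the key name is now free-form; the module field selects the plugin
  local-elasticsearch:
    module: elasticsearch
    url: http://localhost:9200
    index: myindex
  # a second Elasticsearch output with a different configuration
  logsene:
    module: elasticsearch
    url: https://logsene-receiver.sematext.com
    index: YOUR_LOGSENE_TOKEN
```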

@megastef
Contributor

megastef commented Dec 8, 2017

@raovaibhav we just released 2.0.91 to npm. Command line arguments should work as expected again, e.g. -p for pretty JSON and -n for the log source name. Please adjust your Elasticsearch config by adding module: elasticsearch.

Uninstalling logagent (use sudo for the global installation, e.g. if you run it as a service):
sudo npm rm -g @sematext/logagent
Install logagent:
sudo npm i -g @sematext/logagent

@megastef megastef reopened this Dec 8, 2017
@megastef
Contributor

megastef commented Dec 8, 2017

@raovaibhav Sorry, the commit closed the issue automatically; I have reopened it. We will close the issue after your feedback.

@raovaibhav
Author

raovaibhav commented Dec 8, 2017

Thanks @megastef
Adding module: elasticsearch and upgrading to version 2.0.91 fixed the issue.
Thanks for the prompt resolution.

I have a few more queries; if possible, kindly provide inputs:

Query 1:
I noticed that running sudo npm i -g @sematext/logagent does not install it as a system service (systemd). Do we have to run sudo logagent-setup -u http://localhost:9200 -i INDEX_NAME? I want to run it as a systemctl service. How can I do that properly?
Also, uninstall doesn't seem to work in case it was installed as a system service with the logagent-setup command? How can I clean it up?

Query 2:
Since we have the option to post different files to different indices, is it possible to post files that fail parsing to a separate index?
Example: I have 2 files with kafka in the name, and only 1 of them parses. Can I post them to different indices using something other than the file name (in my case I set up a field "logtype" and assign "generic" in case parsing fails)? Currently both files flow to the same index (both matching .*kafka.*).

It'd be great if someone can help me with this too.

@megastef
Contributor

megastef commented Dec 8, 2017

Hi,

  1. logagent-uninstall will remove the systemd/upstart/launchd service and remove the npm package.
  2. You could add an output filter function that checks for logtype=generic and then sets context.index=TOKEN_FOR_GENERIC_LOGS. See: https://sematext.com/docs/logagent/filters/#output-filter
    The context.index field is interpreted by the Elasticsearch output module. The Elasticsearch output module supports routing logs from different files (logSource), but not by your custom field "logtype". We could extend the Elasticsearch output module to make the field used for the routing pattern configurable.
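Such an output filter could be sketched as below. TOKEN_FOR_GENERIC_LOGS is a placeholder for your real index name or Logsene token; note that the callback should be invoked for every event, otherwise non-matching events are dropped from the pipeline:

```yaml
outputFilter:
  module: !!js/function >
    function (context, config, eventEmitter, data, callback) {
      // route unparsed "generic" logs to a dedicated index
      if (data.logtype === 'generic') {
        context.index = 'TOKEN_FOR_GENERIC_LOGS'
      }
      // always call the callback, or non-generic events are dropped
      callback(null, data)
    }
```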

@raovaibhav
Author

Thanks a lot @megastef, I'll try it out. Closing the issue.

@raovaibhav
Author

raovaibhav commented Dec 11, 2017

Hi @megastef

I tried routing with your suggestion, but it doesn't seem to work:

outputFilter:
  module: !!js/function >
    function (context, config, eventEmitter, data, callback) {
      if (data.logtype === "generic") {
        context.index = "alllogs"
        callback(null, data)
      }
    }
output:
  # index logs in Elasticsearch or Logsene
  elasticsearch:
    module: elasticsearch
    url: http://localhost:9200
    # default index (Logsene token) to use:
    index: alllogs
    indices:
      Kafka:
        # list of RegEx matching logSource / filename
        # all logs matching logSource name will be indexed to above index
        - .*kafka.*

Logs that have logtype set to generic are still being routed to the Kafka index based on the log file name. Any help would be appreciated.
