Logstash udp size limit #1505

Closed

bossy78 opened this issue Jul 7, 2014 · 10 comments

Comments

@bossy78

bossy78 commented Jul 7, 2014

I am monitoring my company's applications. They send their logs via a UDP appender (log4net) and my Logstash agent receives them via a UDP input. Everything was working well: my logs were grokked, then stored in Elasticsearch and displayed in Kibana. Until today, that is, when I noticed that "big" logs were truncated: all characters after the 8151st are missing from the event's "message" field, so I am getting a grokparsefailure. So basically, my question is the following: does Logstash have a size limit for each event/message received via UDP, or more generally, via any input?

Thanks in advance for your help.

@bossy78
Author

bossy78 commented Jul 7, 2014

For the UDP case, I think I have found the solution:
- increase the buffer_size parameter in the udp.rb file.

I cannot test it now, but I will tell you if it works.

@untergeek
Member

You shouldn't have to edit the plugin file. There is a buffer_size configuration directive for the udp plugin already. 

http://logstash.net/docs/1.4.2/inputs/udp
—Aaron
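
For reference, a minimal sketch of that approach (the port number below is only an example, not from this thread; use whichever port your log4net appender sends to):

input {
	udp {
		port => 5000             # example port, adjust to match your appender
		buffer_size => 65536     # bytes read per datagram; the thread indicates the default was 8k
	}
}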


@jordansissel
Contributor

The read size for UDP can be changed:
http://logstash.net/docs/1.4.2/inputs/udp#buffer_size

-Jordan


@bossy78
Author

bossy78 commented Jul 8, 2014

Ok thanks for your answers.

@bossy78 bossy78 closed this as completed Jul 8, 2014
@domenkozar

Why is the limit 8k and not 64k like the RFC specifies?

@jordansissel jordansissel changed the title Logstash event/message size limit Logstash udp size limit Nov 20, 2014
@jordansissel
Contributor

@iElectric There is no such thing as a "limit" here. It's a setting which is tunable. I picked 8k for a default for reasons that are perhaps unimportant. You are welcome to change the size of a read as noted here: #1505 (comment)

I am open to changing the default value. If you wish the default value changed, please open a new ticket where we can discuss this. :)

@domenkozar

Sure, see #2111

@Redsandro

I'm trying to get one XML file into Logstash, but it keeps getting split up into pieces.

I've raised the 'limit' like so:

input {
	udp {
		port => 12203
		buffer_size => 16777216
	}
}

However, when I send a file for testing like so:

cat 71_kb_file.xml | nc -uq8 -O100000 localhost 12203

I'm still getting 35 partials. Is UDP not suitable for getting bigger XML logs across?

The gelf input seems to handle huge logs fine, and it uses UDP too. How can I make the udp input chain chunks together?
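
For comparison, a gelf listener is declared in much the same way (a sketch; 12201 is just the conventional GELF UDP port). GELF copes with large messages because the sender splits them into chunks and the input reassembles them, something the plain udp input does not do.

input {
	gelf {
		port => 12201    # conventional GELF UDP port; chunk reassembly is handled by the plugin
	}
}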

@Farfaday

Farfaday commented Jan 4, 2019

@Redsandro did you achieve that in the meantime? I have a similar need.
Thanks!

@Redsandro

@Farfaday I think the problem was using nc, which caused buffering into chunks. Try it like this:

curl -H "content-type: text/xml" -XPUT 'http://localhost:12203' -d@71_kb_file.xml
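
Note that curl sends this over HTTP (TCP) rather than UDP, so a command like that presumably assumes an http input listening on the port instead of the udp input shown earlier. A minimal sketch of such an input, keeping the example port from the command:

input {
	http {
		port => 12203    # the whole XML body arrives over TCP/HTTP, so no datagram size limit applies
	}
}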
