
Support batch mode in HTTP Transport #1970

Closed
BBE78 opened this issue Nov 27, 2021 · 9 comments

Comments

BBE78 (Contributor) commented Nov 27, 2021

What's the feature?

For performance reasons, a batch mode should be available on the HTTP transport to break the 1 log = 1 request pattern and send HTTP requests in configurable batches.

What problem is the feature intended to solve?

When using the HTTP transport, a request is sent to the HTTP server for every log call: 100 logs means 100 HTTP requests.
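
For illustration, a minimal sketch of the current one-request-per-log behaviour; the host, port and path below are placeholder values, not a real endpoint:

```js
const winston = require('winston');

const logger = winston.createLogger({
  transports: [
    new winston.transports.Http({
      host: 'localhost', // placeholder backend collecting the client logs
      port: 8080,
      path: '/logs'
    })
  ]
});

// Today, each of these calls produces its own HTTP request (100 logs = 100 requests).
for (let i = 0; i < 100; i++) {
  logger.info(`page load event ${i}`);
}
```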

Is the absence of this feature blocking you or your team? If so, how?

Not blocking, but it has a bad user-experience impact on our web application. We're using the winston HTTP transport to collect all our client logs in our backend log file. During page loading, our web application is a little verbose at the "info" level. We've seen that the HTTP log requests have a direct impact on the page load time (i.e. disabling HTTP logs made the page load faster).

Is this feature similar to an existing feature in another tool?

Yes, take a look at winston-splunk-http-transport

Is this a feature you're prepared to implement, with support from us?

That's possible... :)

DABH (Contributor) commented Dec 15, 2021

I think the maintainers are unlikely to have time to implement this ourselves, but we'd welcome a PR that adds this type of functionality in a clean way. Thanks!

BBE78 (Contributor, Author) commented Dec 24, 2021

Hi @DABH, I have a fix available for this issue. Could you please grant the permissions needed to assign the issue, create a branch, and open a PR?
Thanks!

BBE78 (Contributor, Author) commented Dec 24, 2021

I think I've opened the PR: #1998

BBE78 (Contributor, Author) commented Dec 24, 2021

With my PR, HTTP batch mode is now supported. Logs are sent to the HTTP endpoint as soon as the first of the following batch options is reached (see the sketch after this list):

  • the configured number of logs is buffered
  • the batch timeout elapses
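
A minimal sketch of how this could be configured, assuming the option names `batch`, `batchCount` and `batchInterval` from the PR; host, port and path are placeholders:

```js
const winston = require('winston');

const logger = winston.createLogger({
  transports: [
    new winston.transports.Http({
      host: 'localhost',    // placeholder endpoint
      port: 8080,
      path: '/logs',
      batch: true,          // enable batch mode (option name assumed from the PR)
      batchCount: 10,       // flush once 10 logs are buffered...
      batchInterval: 5000   // ...or after 5000 ms, whichever comes first
    })
  ]
});
```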

BBE78 (Contributor, Author) commented Jan 8, 2022

@DABH could you please have a look at the PR?

BBE78 (Contributor, Author) commented Jan 15, 2022

@DABH thanks for reviewing and accepting my PR. When do you plan to release a new version? Hope it won't take a year and a half!

DABH (Contributor) commented Jan 15, 2022

Yep, we have some other maintainers now, so a release should happen in the near future. Ideally we can fix the child logger / metadata issues that @maverick1872 is taking the lead on, and then we will cut a new release that includes your PR.

BBE78 (Contributor, Author) commented Jan 16, 2022

Great!

wbt (Contributor) commented Jan 27, 2022

I'm going to mark this as closed by #1998, with #2044 for tracking the release. Thanks for your contribution!
