[BUG] File Source fails to process large files. #707

Closed
cmanning09 opened this issue Dec 7, 2021 · 0 comments · Fixed by #4256

@cmanning09 (Contributor)

Describe the bug
A Data Prepper pipeline with a File Source cannot handle a file that has more lines than the buffer capacity. The File Source does not manage buffer timeouts effectively and throws an exception once the buffer is full.

To Reproduce
Steps to reproduce the behavior:

  1. Create a file that has a couple thousand lines of data
  2. Build and run a simple pipeline with a file source and stdout sink (see the example configuration after this list).
  3. See Error: "Error processing the input file path ..."
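
For reference, a minimal pipeline configuration along these lines should reproduce the issue. The pipeline name and file path here are placeholders, and exact option names should be checked against the Data Prepper version in use:

```yaml
# pipelines.yaml - illustrative only; path and pipeline name are placeholders
large-file-pipeline:
  source:
    file:
      path: "/tmp/large-input.log"   # file with a few thousand lines
  sink:
    - stdout:
```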

Expected behavior
The File Source should handle timeouts from a full buffer and wait until there is room before adding more lines to the buffer.
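
A minimal sketch of that behavior, assuming a bounded buffer whose write can throw a TimeoutException when it is full (the interface and method names below are illustrative stand-ins, not the actual Data Prepper API):

```java
import java.util.concurrent.TimeoutException;

// Minimal stand-in for a bounded buffer; the real Data Prepper Buffer
// interface differs, this only illustrates the retry idea.
interface BoundedBuffer<T> {
    void write(T record, int timeoutMillis) throws TimeoutException;
}

class FileSourceWriter {
    // Keep retrying the write instead of failing the whole file when the
    // buffer reports a timeout because it is full.
    static <T> void writeWithRetry(BoundedBuffer<T> buffer, T record, int timeoutMillis)
            throws InterruptedException {
        while (true) {
            try {
                buffer.write(record, timeoutMillis);
                return;
            } catch (TimeoutException e) {
                // Buffer is full; back off briefly and try again.
                Thread.sleep(50);
            }
        }
    }
}
```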

@cmanning09 cmanning09 added bug Something isn't working untriaged labels Dec 7, 2021
@dlvenable dlvenable added this to the v2.7 milestone Mar 8, 2024
@dlvenable dlvenable removed the backlog label Mar 8, 2024
@dlvenable dlvenable self-assigned this Mar 8, 2024