Add example of swapping from linereader's next_batch() method #1
Hi @dimo414! Thanks for making an issue! I think that the example in the README does actually do what you want. Here's an annotated version:

```rust
use grep_cli::stdout;
use ripline::{
    line_buffer::{LineBufferBuilder, LineBufferReader},
    lines::LineIter,
    LineTerminator,
};
use std::{env, error::Error, fs::File, io::Write, path::PathBuf};
use termcolor::ColorChoice;

fn main() -> Result<(), Box<dyn Error>> {
    let path = PathBuf::from(env::args().nth(1).expect("Failed to provide input file"));
    let mut out = stdout(ColorChoice::Never);
    let reader = File::open(&path)?;
    let terminator = LineTerminator::byte(b'\n');
    let mut line_buffer = LineBufferBuilder::new().build();
    let mut lb_reader = LineBufferReader::new(reader, &mut line_buffer);
    while lb_reader.fill()? {
        // It's lb_reader.buffer() here that returns the &[u8] batch. LineIter
        // is just an optimized zero-copy iterator over the lines in that
        // buffer. You could use the returned buffer in the same way you were
        // using the buffer from `next_batch`.
        let lines = LineIter::new(terminator.as_byte(), lb_reader.buffer());
        for line in lines {
            out.write_all(line)?;
        }
        lb_reader.consume_all();
    }
    Ok(())
}
```

Does that answer the question?
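For readers coming from linereader's `next_batch()`, the same fill-then-consume pattern can be sketched with nothing but std's `BufRead` trait. This is a minimal illustration under assumptions, not ripline's API: `count_lines` is a hypothetical helper, and unlike ripline's `LineBuffer`, a raw `fill_buf()` batch may end in the middle of a line.

```rust
use std::io::{BufRead, Cursor};

// Hypothetical helper sketching the batch pattern: fill_buf() hands back the
// bytes currently buffered, we process them, and consume() marks them done.
// NOTE: fill_buf() does not align batches to line terminators, so a batch may
// split a line; ripline's LineBuffer handles that rollover for you.
fn count_lines(mut reader: impl BufRead) -> std::io::Result<usize> {
    let mut lines = 0;
    loop {
        let batch = reader.fill_buf()?;
        if batch.is_empty() {
            break; // EOF: no more buffered bytes
        }
        let len = batch.len();
        lines += batch.iter().filter(|&&b| b == b'\n').count();
        reader.consume(len);
    }
    Ok(lines)
}

fn main() -> std::io::Result<()> {
    // In-memory reader so the sketch is self-contained.
    let data = Cursor::new(b"one\ntwo\nthree\n".to_vec());
    println!("{}", count_lines(data)?);
    Ok(())
}
```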
Thanks for the help! I've migrated over to ripline. Some thoughts:
I'd like to swap from linereader to this library due to the silently capped line lengths issue you mention in the README, but it's not obvious to me how to replicate that function using the APIs in ripline. It'd be helpful if you could add an example of processing batches of lines at a time. Thanks!