Vulnerability to attacks because there is no size limit #186

Closed
BijanVan opened this issue Mar 6, 2018 · 10 comments

Comments

@BijanVan

BijanVan commented Mar 6, 2018

let n = try_nb!(self.io.read_line(&mut self.line));

The read_line function seems vulnerable to attack. If an attacker sends data continually without a "\r\n", the buffer (a String) keeps growing without limit.
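
For illustration (not part of the original report), a blocking-std sketch of the same problem; the NeverEndingLine reader is a made-up stand-in for a peer that keeps sending bytes and never a line ending:

use std::io::{BufRead, BufReader, Read};

struct NeverEndingLine;

impl Read for NeverEndingLine {
    fn read(&mut self, buf: &mut [u8]) -> std::io::Result<usize> {
        // Simulates an attacker that always has more data but never a '\n'.
        for b in buf.iter_mut() {
            *b = b'a';
        }
        Ok(buf.len())
    }
}

fn main() {
    let mut reader = BufReader::new(NeverEndingLine);
    let mut line = String::new();
    // read_line never returns here: the String grows until memory is exhausted.
    let _ = reader.read_line(&mut line);
}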

@carllerche
Member

I agree, this should probably be fixed!

@BijanVan
Author

BijanVan commented Mar 6, 2018

I think BufRead requires something like:
fn read_line(&mut self, buf: &mut String, max_length: usize) -> Result<usize>
in addition to:
fn read_line(&mut self, buf: &mut String) -> Result<usize>

https://doc.rust-lang.org/stable/std/io/trait.BufRead.html#method.read_line
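
For reference, a bounded variant along those lines can be built on top of the existing fill_buf/consume primitives. This is only a sketch of the idea (read_line_max is not part of std, and it uses from_utf8_lossy instead of std's strict UTF-8 handling):

use std::io::{self, BufRead, ErrorKind};

// Hypothetical bounded read_line: reads until '\n', EOF, or until more than
// max_length bytes of line content would have to be buffered.
fn read_line_max<R: BufRead>(r: &mut R, buf: &mut String, max_length: usize) -> io::Result<usize> {
    let mut total = 0;
    loop {
        let (found_newline, used) = {
            let available = r.fill_buf()?;
            if available.is_empty() {
                return Ok(total); // EOF
            }
            // Never inspect more bytes than the remaining budget allows.
            let budget = max_length - total;
            let window = &available[..available.len().min(budget.saturating_add(1))];
            match window.iter().position(|&b| b == b'\n') {
                Some(i) => {
                    buf.push_str(&String::from_utf8_lossy(&window[..=i]));
                    (true, i + 1)
                }
                None if window.len() > budget => {
                    return Err(io::Error::new(ErrorKind::InvalidData, "line too long"));
                }
                None => {
                    buf.push_str(&String::from_utf8_lossy(window));
                    (false, window.len())
                }
            }
        };
        r.consume(used);
        total += used;
        if found_newline {
            return Ok(total);
        }
    }
}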

@carllerche
Member

I would probably not use the impl in std and write our own.

@kpp
Contributor

kpp commented Mar 8, 2018

This issue should be created in the futures repo, because they copied the Lines combinator: https://github.com/rust-lang-nursery/futures-rs/blob/master/futures-util/src/io/lines.rs#L48

@carllerche
Member

My guess is that the futures team is not interested in this change as they are focused on mirroring std and the issue exists in std as well.

/cc @cramertj @aturon

If it isn't resolved there, Tokio can provide an alternate, more robust, implementation.

@cramertj
Contributor

IMO both std and futures should probably also offer a read_line_max or similar that allows specifying a maximum line size. I think it makes sense to offer this (or similar) functionality on the tokio side for now.

@carllerche added the E-help-wanted label (Call for participation: Help is requested to fix this issue.) on May 11, 2018
@jasondavies
Contributor

It sounds like the best way forward might be to add an optional max_line_length parameter to the lines combinator in futures-util? It seems that std will not add read_line_max (from the issue linked above), so I think Read::take could be used instead.

I'm not clear on why the same lines utility combinator exists in tokio-io and futures-util -- is the former going to be dropped?
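
In blocking-std terms the Read::take approach looks roughly like this (a sketch only; Take<R> implements BufRead when R: BufRead, so read_line stops once the limit is hit even if no newline ever arrives, and the caller has to notice the truncation):

use std::io::{BufRead, BufReader, Read};
use std::net::TcpStream;

fn read_bounded_line(stream: TcpStream) -> std::io::Result<String> {
    const MAX_LINE: u64 = 8 * 1024;

    // The Take adapter caps how many bytes read_line may pull in total.
    let mut limited = BufReader::new(stream).take(MAX_LINE);
    let mut line = String::new();
    let n = limited.read_line(&mut line)?;

    // If the limit was hit before a newline, the line comes back truncated.
    if n as u64 == MAX_LINE && !line.ends_with('\n') {
        return Err(std::io::Error::new(
            std::io::ErrorKind::InvalidData,
            "line too long",
        ));
    }
    Ok(line)
}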

@BijanVan
Author

This could be a possible solution:

use bytes::BytesMut;
use futures::{Async, Poll, Stream};
use std::io::{self, BufRead};

// Crate-root (2015 edition) import: this assumes the module lives inside tokio-io itself.
use AsyncRead;

/// Combinator created by `lines_with_max_length`, a stream over the lines of
/// text on an I/O object.
#[derive(Debug)]
pub struct Lines<A> {
    io: A,
    line: BytesMut,
    max_length: usize,
}

/// Creates a new stream from the I/O object given representing the lines of
/// input that are found on `A`.
///
/// This method takes an asynchronous I/O object, `a`, and returns a `Stream` of
/// lines that the object contains. The returned stream will reach its end once
/// `a` reaches EOF. If a line grows beyond `max_length` bytes before a line
/// ending is found, the stream yields an error.
pub fn lines_with_max_length<A>(a: A, max_length: usize) -> Lines<A>
where
    A: AsyncRead + BufRead,
{
    Lines {
        io: a,
        line: BytesMut::with_capacity(max_length),
        max_length: max_length,
    }
}

impl<A> Lines<A> {
    /// Returns the underlying I/O object.
    ///
    /// Note that this may lose data already read into internal buffers. It's
    /// recommended to only call this once the stream has reached its end.
    pub fn into_inner(self) -> A {
        self.io
    }
}

impl<A> Stream for Lines<A>
where
    A: AsyncRead + BufRead,
{
    type Item = String;
    type Error = io::Error;

    fn poll(&mut self) -> Poll<Option<String>, io::Error> {
        loop {
            if self.line.len() > self.max_length {
                return Err(io::Error::new(
                    io::ErrorKind::Other,
                    "Maximum length error.",
                ));
            }

            let n = try_ready!(self.io.read_buf(&mut self.line));
            if n == 0 {
                if self.line.is_empty() {
                    return Ok(Async::Ready(None));
                }
                // EOF with a partial line still buffered: yield it as the final
                // item instead of looping forever waiting for a "\r\n".
                let len = self.line.len();
                let line = self.line.split_to(len);
                return Ok(Async::Ready(Some(
                    String::from_utf8_lossy(&line).to_string(),
                )));
            }

            let pos = self
                .line
                .windows(2)
                .enumerate()
                .find(|&(_, bytes)| bytes == b"\r\n")
                .map(|(i, _)| i);

            if let Some(pos) = pos {
                let mut line = self.line.split_to(pos + 2);
                line.split_off(pos);
                return Ok(Async::Ready(Some(
                    String::from_utf8_lossy(&line).to_string(),
                )));
            }
        }
    }
}

@carllerche
Member

This should be possible to add to LinesCodec once #360 lands.

@tobz closed this as completed in #590 on Sep 12, 2018
hawkw added a commit that referenced this issue Sep 21, 2018
## Motivation

Currently, there is a potential denial of service vulnerability in the
`lines` codec. Since there is no bound on the buffer that holds data
before it is split into lines, an attacker could send an unbounded
amount of data without ever sending a `\n` character.

## Solution

This branch adds a `new_with_max_length` constructor for `LinesCodec`
that configures a limit on the maximum number of bytes per line. When
the limit is reached, the overly long line is discarded (in
`max_length`-sized increments) until a newline character or the end of the
buffer is reached. It was also necessary to add some special-case logic
to avoid creating an empty line when the length limit is reached at the
character immediately _before_ a `\n` character.

Additionally, this branch adds new tests for this function, including a
test for changing the line limit in-flight.

## Notes

This branch makes the following changes from my original PR with
this change (#590):

- The whole too-long line is discarded at once in the first call to `decode`
  that encounters it.
- Only one error is emitted per too-long line.
- Made all the changes requested by @carllerche in
  #590 (comment)

Fixes: #186 

Signed-off-by: Eliza Weisman <eliza@buoyant.io>
@Darksonn removed the E-help-wanted label (Call for participation: Help is requested to fix this issue.) on Sep 25, 2020
@spion

spion commented Dec 19, 2021

While there is now an easier way to read lines with a maximum length, this still doesn't address the issue that the easiest-to-use API results in unsafe behavior. Instead of a buffered reader, you need to know about LinesCodec and FramedRead. Unfortunately, as far as I can tell, nothing about using read_line from a buffered reader points you in the right direction (for example via compiler warnings).
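
For anyone landing here today, the bounded path looks roughly like this (a sketch against current tokio 1.x / tokio-util; the codec has moved crates since this issue was filed):

use futures::StreamExt;
use tokio::net::TcpStream;
use tokio_util::codec::{FramedRead, LinesCodec};

async fn handle(stream: TcpStream) {
    // Lines longer than 8 KiB produce an error instead of letting the
    // internal buffer grow without bound.
    let mut lines = FramedRead::new(stream, LinesCodec::new_with_max_length(8 * 1024));
    while let Some(result) = lines.next().await {
        match result {
            Ok(line) => println!("line: {}", line),
            Err(e) => {
                eprintln!("line error: {}", e);
                break;
            }
        }
    }
}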
