
Support returning a Stream #149

Open
sorin-davidoi opened this Issue Aug 29, 2018 · 4 comments


sorin-davidoi commented Aug 29, 2018

This would be an interesting direction to explore. The main use case would be streaming HTML to the browser without blocking on database calls.

let name = futures::future::ok("<p>Maud</p>");

let stream = html_stream! {
  h1 { "Greetings, " }
  (name)
  p { "How are you?" } 
};

lfairy added the enhancement label Aug 29, 2018

lfairy (Owner) commented Sep 5, 2018

What's your intended use case for this feature?

My first thought would be to apply pagination or use Ajax calls instead. Streaming raises some difficult design issues that I'd rather not deal with unless there's a strong need for it.

In particular, if the stream raises an error in the middle of rendering a page, then a naive solution would leave the page cut off. That's not a nice user experience IMO. There are ways to handle this problem (e.g. display an error message inline), but the options at that point are more constrained than if all the data was fetched in advance.

(Sorry for the lack of response. I should have asked this question before applying the label.)

sorin-davidoi referenced this issue Sep 5, 2018: Implement streaming #150 (Closed; 2 of 6 tasks complete)
sorin-davidoi commented Sep 5, 2018

> What's your intended use case for this feature?

The main use case is to improve performance: we don't want a blank page for two seconds while a database query is running. We'd also like to keep as much of the logic server-side as possible, so no Ajax calls and no state maintained on the client.

I think that if we require all the futures to have the type Future&lt;Item = Markup, Error = Markup&gt;, then it's up to the user to decide what to do when there is an error (render a message inline, or ignore it).
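To make that concrete, here is a minimal stdlib-only sketch of the idea (no futures crate; `Markup` is approximated as a plain `String`, and `resolve_fragment` is a hypothetical name): whichever way a fragment resolves, the caller gets markup back, so an error renders inline instead of truncating the page.

```rust
// Hypothetical sketch of the Future<Item = Markup, Error = Markup> idea,
// with the future replaced by a plain Result for illustration.
type Markup = String;

// Both arms produce markup, so a failure mid-stream renders as an
// inline error message rather than cutting the page off.
fn resolve_fragment(fragment: Result<Markup, Markup>) -> Markup {
    match fragment {
        Ok(html) => html,
        Err(error_html) => error_html,
    }
}

fn main() {
    let name: Result<Markup, Markup> = Ok("<p>Maud</p>".to_string());
    let failed: Result<Markup, Markup> =
        Err("<p>Could not load name</p>".to_string());

    println!("{}", resolve_fragment(name));
    println!("{}", resolve_fragment(failed));
}
```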

anxiousmodernman (Contributor) commented Sep 22, 2018

I have also encountered this use case with home-grown logging interfaces that render large tables of data as pure server-side HTML.

That exact use case might be something of an anti-pattern, but I'm sympathetic to this change, mostly because, in theory, it should be feasible to yield valid chunks of HTML tokens as a stream. The question is whether it can be done elegantly, and whether it can work nicely with the various frameworks.
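The "valid chunks" idea can be sketched with a plain iterator (stdlib only, all names hypothetical): each yielded item is a complete element, so any prefix of the concatenated output is well-formed HTML even if the stream stops early.

```rust
// Hypothetical sketch: yield markup one complete element at a time, so
// any prefix of the output is well-formed HTML even if the stream stops
// early. Stdlib only; a real implementation would feed these chunks
// into a framework's streaming-body type.
fn html_chunks() -> impl Iterator<Item = String> {
    let header = std::iter::once("<h1>Greetings</h1>".to_string());
    // In practice these rows would arrive asynchronously from a
    // database; here they are a plain vector for illustration.
    let rows = vec!["Maud", "Rust"]
        .into_iter()
        .map(|name| format!("<p>{}</p>", name));
    header.chain(rows)
}

fn main() {
    for chunk in html_chunks() {
        // Each chunk could be flushed to the client as soon as it is ready.
        println!("{}", chunk);
    }
}
```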

On the framework front, async support is uneven. Rocket uses its own interface for streaming responses, for instance. Which frameworks are you targeting?

sorin-davidoi commented Sep 22, 2018

I was looking to target hyper, which I more or less got working in #150, together with:

#[cfg(feature = "hyper")]
mod hyper_support {
    use std::mem;

    use PreEscaped;
    use hyper::body::Payload;
    use hyper::{Chunk, Error};
    use futures::Async;

    impl Payload for PreEscaped<String> {
        type Data = Chunk;
        type Error = Error;

        fn poll_data(&mut self) -> Result<Async<Option<Self::Data>>, Self::Error> {
            // Send the rendered markup as a single chunk, then end the
            // stream. (Returning `Some` on every poll would produce a
            // body that never terminates.)
            if self.0.is_empty() {
                Ok(Async::Ready(None))
            } else {
                let html = mem::replace(&mut self.0, String::new());
                Ok(Async::Ready(Some(Chunk::from(html))))
            }
        }
    }
}