
Stream support? #1150

Closed
stevenvachon opened this Issue Dec 12, 2015 · 7 comments

stevenvachon commented Dec 12, 2015

Would be nice if it could parse chunks.

kpdecker commented Dec 12, 2015

It's not exactly clear what this would entail or what the benefit would be. What is your use case?

stevenvachon commented Dec 12, 2015

Faster parsing via file/HTTP stream input. A use case could be build scripts and/or compatibility with gulp.
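
For context, here is roughly what the gulp use case looks like today (a sketch, not an official plugin; the `through2` transform and file naming are illustrative): the task has to buffer the whole file before handing it to `Handlebars.precompile`, and that buffering step is what a chunked/stream parser would remove.

```js
// Sketch of the non-streaming status quo in a gulp pipeline.
// The entire template ends up in memory as one string before parsing.
const gulp = require('gulp');
const through = require('through2');
const Handlebars = require('handlebars');

gulp.task('templates', () =>
  gulp.src('src/**/*.hbs')
    .pipe(through.obj((file, enc, cb) => {
      // file.contents holds the whole template -- no chunked parsing here.
      const source = file.contents.toString();
      const spec = Handlebars.precompile(source); // returns the template spec as a string
      file.contents = Buffer.from(spec);
      file.path = file.path.replace(/\.hbs$/, '.js');
      cb(null, file);
    }))
    .pipe(gulp.dest('dist'))
);
```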

kpdecker commented Dec 12, 2015

Reading the template is not the slow part of the compilation pipeline; the actual parsing is. And if you're putting in templates that are large enough that you have concerns about the I/O wait time or the cost of loading the template into memory prior to parsing, I'd start to wonder if there is too much data hard-coded in the template.

That, and our parser generator does not support a stream mode, so the cost/benefit here doesn't really work out in my head.
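
For anyone who wants a quick (unscientific) way to check that, something like the following compares the read time against the parse time; the file name is hypothetical and the numbers will obviously vary:

```js
// Rough timing sketch: I/O vs. the actual parse for one large template.
const fs = require('fs');
const Handlebars = require('handlebars');

const path = 'big-template.hbs'; // hypothetical large template

let t = process.hrtime();
const source = fs.readFileSync(path, 'utf8');
const [rs, rns] = process.hrtime(t);

t = process.hrtime();
const ast = Handlebars.parse(source); // the lexing/parsing step under discussion
const [ps, pns] = process.hrtime(t);

console.log('read :', (rs * 1e3 + rns / 1e6).toFixed(2), 'ms');
console.log('parse:', (ps * 1e3 + pns / 1e6).toFixed(2), 'ms');
```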

kpdecker closed this Dec 12, 2015

stevenvachon commented Dec 12, 2015

With a stream, many templates could be parsed concurrently with less memory consumption and garbage collection because the template string never ends up in memory.
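
To make the request concrete, something along these lines is what I have in mind. This is purely hypothetical; `createParseStream` and the `'template'` event don't exist in Handlebars, they're only illustrating the shape of a chunk-accepting interface:

```js
// Hypothetical API sketch -- Handlebars has no such interface today.
// A chunk-accepting parser exposed as a writable stream would mean the
// template source never needs to be concatenated into a single string.
const fs = require('fs');
const Handlebars = require('handlebars');

const parserStream = Handlebars.createParseStream(); // hypothetical factory

fs.createReadStream('template.hbs', 'utf8')
  .pipe(parserStream)
  .on('template', (template) => {       // hypothetical completion event
    console.log(template({ name: 'world' }));
  });
```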

kpdecker commented Dec 12, 2015

It's a micro-optimization at best. If you are pushing megabytes of templates into the parser, you're going to see more issues from being CPU-bound in the parser itself.

If someone from the community were to investigate, we might consider accepting it, but I fear that would basically be a full rewrite of the parser.


jakearchibald commented Feb 29, 2016

With streaming coming to the client & service worker, this may be worth revisiting? https://jakearchibald.com/2016/streams-ftw/#creating-one-stream-from-multiple-sources-to-supercharge-page-render-times

Also, dustjs was of real benefit when I built https://wiki-offline.jakearchibald.com/ (explanation: https://www.youtube.com/watch?v=d5_6yHixpsQ&feature=youtu.be&t=4m22s), since some of my data was slower to arrive and I wanted a fast first render.

Dustjs handles promises & streams as template values, but unfortunately I have nothing nice to say about dustjs's syntax.
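
For reference, the stream-stitching pattern from the linked article looks roughly like this (URLs and cache contents are illustrative; it assumes the header/footer are already cached): the response is assembled from several sources as they arrive, which is the kind of pipeline a streaming template engine could plug into.

```js
// Sketch of composing one response stream from multiple sources in a
// service worker, per the linked article. Illustrative URLs and caches.
self.addEventListener('fetch', (event) => {
  if (event.request.mode !== 'navigate') return;

  const stream = new ReadableStream({
    async start(controller) {
      const push = async (response) => {
        const reader = response.body.getReader();
        for (;;) {
          const { done, value } = await reader.read();
          if (done) break;
          controller.enqueue(value);
        }
      };
      await push(await caches.match('/shell-header.html'));
      await push(await fetch('/content-body'));
      await push(await caches.match('/shell-footer.html'));
      controller.close();
    }
  });

  event.respondWith(new Response(stream, {
    headers: { 'Content-Type': 'text/html' }
  }));
});
```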

mathiasbynens commented Apr 7, 2016

One more resource to add to the list: https://jakearchibald.com/2016/streaming-template-literals/ (it links to this issue)
