File watcher! #42

Merged 3 commits into Shazwazza:master on Sep 27, 2016



dazinator commented Jul 15, 2016 edited

Ok so have a quick look at this!

I have made some quick tweaks to demo an idea.

PreProcessManager now watches the input files, and a delegate is invoked whenever one is edited.

To demonstrate this, start the website running and put a breakpoint somewhere within PreProcessManager.ProcessFile():

    _fileSystemHelper.Watch(file, (f) =>
    {
        var x = f;
        var message = "some file changed..";
    });


With the site running, edit a file like test1.js - you should find the breakpoint is hit.

I think we can use this to:

  1. Re-execute the pipeline and update the cache / bundle files when things change?
  2. Invalidate the client side cache somehow, so the next request gets the new files?

Both those features would be big wins for development mode.

I need your input on this in order to work out whether it's truly viable, and how best to do it.

I also made some minor tweaks. Smidge often assumes that an input file (IWebFile) is located on the physical disk; for example, it would do File.Exists() checks. I changed this so that Smidge gets files through the IFileProvider, which returns an IFileInfo, and uses IFileInfo.Exists instead. The beauty of this is that paths provided to Smidge do not need to relate to the physical file system; they just need to be files accessible through the IFileProvider, so in theory they can come from anywhere, as this is effectively a virtual directory. The default IFileProvider is normally a PhysicalFileProvider of course, but in more complicated applications it could be a CompositeFileProvider, providing access to files from many different sources: Azure blob storage, embedded resources, etc.

Smidge still relies on the physical disk, of course, when it is creating the cache / bundle files, and that's absolutely fine.
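To illustrate the idea above, here is a minimal sketch of provider-based file access (the class and method names here are hypothetical, not Smidge's actual internals): ask the IFileProvider for an IFileInfo and read through its stream, so the file never needs a physical path.

```csharp
using System.IO;
using Microsoft.Extensions.FileProviders;

// Hypothetical helper showing the IFileProvider-based lookup described above:
// instead of mapping a request path to disk and calling File.Exists, ask the
// provider for an IFileInfo and use IFileInfo.Exists / CreateReadStream.
public static class FileProviderLookup
{
    public static byte[] ReadFileBytes(IFileProvider provider, string requestPath)
    {
        IFileInfo fileInfo = provider.GetFileInfo(requestPath);
        if (!fileInfo.Exists)
            throw new FileNotFoundException($"No file at '{requestPath}'");

        // IFileInfo.PhysicalPath may be null (e.g. embedded resources), so we
        // only ever read through the stream, never touch a disk path directly.
        using (var stream = fileInfo.CreateReadStream())
        using (var ms = new MemoryStream())
        {
            stream.CopyTo(ms);
            return ms.ToArray();
        }
    }
}
```

Because nothing here touches `PhysicalPath`, the same code works against a PhysicalFileProvider, an EmbeddedFileProvider, or a CompositeFileProvider.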


Sounds great!!! I'm on a business trip at the moment, so I will have a look as soon as I can.

@dazinator dazinator Simplified ReverseMapPath function a little bit, and fixed up tests.
Removed a couple of tests to do with MapPath as they no longer make sense: rather than mapping a request path to some physical path, we now use the request path to get an IFileInfo object from the IFileProvider. An IFileInfo object need not have a physical path; we only care whether it exists and how to get its bytes, not what its physical path may be.

No worries.
I have hit an odd issue: on my work PC (it worked fine at home over the weekend), I run the Smidge.Web project, and in the UglifyNodeMinifier:


It never gets past the await statement to NodeServices, i.e. the return statement is never hit. No idea why! I'll play around.

dazinator commented Jul 18, 2016 edited

Ah, OK. Putting in a try/catch explains it:


You will probably want to handle this if making uglifyjs the default!


Yes indeed :) I'll need to do more testing, config docs, etc. Probably won't make it the default; so far I'm just experimenting.


Just putting a brain dump here, feel free to ignore.

In Dev mode, Smidge doesn't do any pre-processing or bundling; it just serves the original files fed to it without any modification.

In which case, watching files in Dev mode is probably not very beneficial at the moment. But that also means you can't use Smidge in development as a solution for compiling TypeScript, LESS, or Sass, as its pre-processors aren't executed. So you have to have some other compilation process during development to compile these files, even though Smidge is technically capable. There is potential here for allowing Smidge to execute pre-processing during Dev mode.

So, that aside: in Release mode, Smidge iterates over each input file, processes it through the pipeline, and writes the resultant output for the file to a cache file (in the cache folder). It then produces the bundle file itself, in the gzip folder, which is a combination of the cache files joined together in order.
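The cache-then-combine flow described above can be sketched in a few lines (the `BundleWriter` class here is hypothetical, not Smidge's actual code): each input file's processed output lives as its own cache file, and the bundle is just those cache files concatenated in order.

```csharp
using System.IO;

// Hypothetical sketch of the flow described above: the bundle file is the
// per-file cached outputs concatenated, preserving the declared file order.
public static class BundleWriter
{
    public static void WriteBundle(string[] cacheFilePaths, string bundlePath)
    {
        using (var output = File.Create(bundlePath))
        {
            foreach (var cacheFile in cacheFilePaths) // order matters
            {
                using (var input = File.OpenRead(cacheFile))
                {
                    input.CopyTo(output);
                }
                // Separate files with a newline so statements don't run together.
                output.WriteByte((byte)'\n');
            }
        }
    }
}
```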

So far, I have found a way to detect changes to any input file, and re-execute the pipeline against that file, and to update the cached output.

But I have not yet found a way to:

  1. Regenerate the bundle file (the file in the gzip folder) with the changed contents.
  2. Invalidate the client side cache.

For 2) I think it might be as simple as appending something to the version property used in the generated URLs.
For 1) I think I need to get the bundle the file is part of, then regenerate the bundle file by combining all the "cache" files for each file in the bundle.
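For idea 2), a minimal sketch of the version-bump approach (the `BundleVersion` class and its members are hypothetical, invented for illustration): a watcher callback invalidates a version token, and the token is appended to generated URLs so clients re-fetch changed files.

```csharp
using System;

// Hypothetical sketch: bump the version token used in generated URLs
// whenever a watched input file changes, so the client cache is busted.
public class BundleVersion
{
    private string _version = DateTime.UtcNow.Ticks.ToString();

    public string Current => _version;

    // Called from the file-watcher callback when an input file changes.
    public void Invalidate() => _version = DateTime.UtcNow.Ticks.ToString();

    // Append the token to a generated bundle URL.
    public string AppendTo(string url) => url + "?v=" + _version;
}
```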

@dazinator dazinator Fixed a bug in FileSystemHelper
It was assuming the IFileInfos had a PhysicalPath when checking the extension of the file. This is not necessarily true, as this property can be null depending on the file provider. Fixed by changing it to check the file Name instead.
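A minimal sketch of the fix described in that commit (the helper name here is hypothetical): derive the extension from `IFileInfo.Name`, which is always populated, rather than `PhysicalPath`, which can be null.

```csharp
using System;
using System.IO;
using Microsoft.Extensions.FileProviders;

// Sketch of the fix: IFileInfo.PhysicalPath can be null for non-physical
// providers, but Name is always populated, so check the extension on Name.
public static class FileExtensionCheck
{
    public static bool HasExtension(IFileInfo file, string extension)
    {
        // Path.GetExtension works on any file name; no disk path required.
        var ext = Path.GetExtension(file.Name);
        return string.Equals(ext, extension, StringComparison.OrdinalIgnoreCase);
    }
}
```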

Thanks for all of this! I've been working through a bunch of this last night, and it looks like I can make this all possible. There's a bit of refactoring needed, but the end result will be really great. As you noted before, there is no current good way to handle things like TypeScript, etc., because the pre-processor pipeline is ignored in these cases and the original design only catered for normal JS/CSS requests. I'm updating the code now to allow:

  • Global Smidge options - these are already in place
  • Bundle options - this is half in place but will be extended. Each bundle can now have options assigned to it: file watching options, compression & combination options. These options can be specified for either Debug or Production mode; they are optional to set and fall back to the default options otherwise.

With that in place it will give far more granular control over what happens to a bundle and also be able to better handle file watching. I'll keep plugging away at this and let you know how I go.



Very nice! Looking forward to trying it out once it's ready. Let me know if I can help out with anything! In NetPack I have written some code to detect whether Node.js is installed, as well as code to detect whether npm modules are installed (and optionally install them), which you are welcome to pinch if you need to. I also have a basic TypeScript compiler there that passes TypeScript as strings to Node.js rather than file paths.


Fantastic! I might have to borrow some of that :) Hopefully I'll find some more time to work on this in the next day or two.


I've been trying to get some of these changes pushed and have ended up refactoring a few things, and I've noticed a little problem. Currently, each IPreProcessor has a request-scoped lifetime; that's because some of these processors need access to the current request so that the processor knows how to process things like URLs. This is an issue because the PreProcessPipelineFactory is a singleton which accepts a list of IPreProcessor instances.

There are a couple of ways to try to fix this:

  • Change IPreProcessor to not rely on the current request and make them singletons - but I'm unsure how to make this possible due to the reliance on processing relative/absolute URLs
  • Change IPreProcessor to singletons and add an HttpRequest parameter to the FileProcessContext, which gets passed around so that during processing they have access to the request
  • ... I'm sure there are a few other ways to fix this up


When performing file watching and wanting to re-process, this happens on a background thread which has no request. So the only way to get file watching really working is to make the pre-processors that need to know about the request (i.e. some of the CSS ones) not rely on it at all.

I've pushed changes to a new branch: this currently builds but does not run, you can see my commit comments here: 35b9abe

The changes from your PR have been put into this branch, but I now need to figure out what to do with these request-based pre-processors. It's been a long time since I looked at how the CSS URL stuff works, so I'm not sure how possible it will be to run them without knowing the request. It might be possible for them to work just by knowing the site's base URL, but to do that there will have to be a custom interface to return the site's base URL, with the value populated by custom middleware on the first request and/or configurable via options.
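The base-URL idea above could be sketched like this (all names here — `ISiteBaseUrlProvider`, `CaptureBaseUrlMiddleware` — are hypothetical, not actual Smidge types): a singleton holds the site's base URL, middleware captures it on the first request, and background re-processing reads the singleton instead of needing an HttpRequest.

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Hypothetical sketch: a singleton exposing the site's base URL, populated
// once by middleware on the first request (or set from options), so
// pre-processors no longer need access to the current HttpRequest.
public interface ISiteBaseUrlProvider
{
    string BaseUrl { get; set; }
}

public class SiteBaseUrlProvider : ISiteBaseUrlProvider
{
    public string BaseUrl { get; set; }
}

public class CaptureBaseUrlMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ISiteBaseUrlProvider _baseUrl;

    public CaptureBaseUrlMiddleware(RequestDelegate next, ISiteBaseUrlProvider baseUrl)
    {
        _next = next;
        _baseUrl = baseUrl;
    }

    public async Task Invoke(HttpContext context)
    {
        // Capture once; afterwards, background threads can read BaseUrl.
        if (_baseUrl.BaseUrl == null)
            _baseUrl.BaseUrl =
                $"{context.Request.Scheme}://{context.Request.Host}{context.Request.PathBase}";

        await _next(context);
    }
}
```

The trade-off is that nothing is known until the first request arrives, which is why an options-based override would also be needed for startup-time processing.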

dazinator commented Aug 7, 2016 edited

Interesting, I haven't got as far as dealing with CSS in NetPack yet. It does surprise me that it needs the current request to pre-process the files; I suspect what it really needs is some configuration options like a base path. Allowing a pre-processing step to take some configurable options is the path I have taken in NetPack. I have an "rjs" branch now with a working TypeScript demo page, and also an experimental rjs pre-processor for optimising AMD modules. I allow each pre-processing pipe to be configured as the pipeline is built, so that options a particular pipe needs, like the baseUrl and anything else, can be passed in ahead of time (i.e. before any request). The pipeline is then flushed (processed) on application startup so that all the pre-processing happens before the initial request. A watcher process (if you have watching enabled) then runs as a background task to re-flush pipelines whose inputs change.

It's funny, because I was about to add a Combine pipe when the thought occurred that, as these pre-processed files are visible via the IFileProvider, I could plug Smidge in to handle the bundling aspect of those pre-processed files. Somewhat of a hybrid solution, but it might make a fun experiment.

The only problem (in my mind) with Smidge's bundling capability for me at present is that it doesn't currently handle source maps. I have been looking at a C#/.NET solution to that problem and think I have found a way to handle bundling with source map support purely in C# (no calls out to Node.js). Let me know if you are interested in more on that topic.

P.S. When doing TypeScript pre-processing, I ended up having to write an npm package for an in-memory TypeScript compiler that compiles a set of TypeScript files together, rather than transpiling individual strings one at a time. I found this was the only way to do a proper compilation of a TypeScript project that does type checking and other things; transpiling individual strings loses some safety aspects. Because Smidge processes individual files and then combines them at the end, I am not sure how easily something like a TypeScript project compilation (i.e. multiple files in a single step) would currently fit into the Smidge architecture?

@Shazwazza Shazwazza added a commit that referenced this pull request Sep 27, 2016
@Shazwazza File watcher! #42
Feature Request: Add Options to Control Etag and Cache-Control headers. #48
This is a huge update, this adds fluent builders for defining debug vs production options, adds lots of new options for file watching and caching, gets file watching working including refreshing of persistent files.
@Shazwazza Shazwazza merged commit 3cb8378 into Shazwazza:master Sep 27, 2016

1 check failed

continuous-integration/appveyor/pr AppVeyor build failed

I've got this all working now :) Mostly in this rev: c89986f but also in others.

So file watching is working; it will refresh the disk-persisted files too, and depending on the cache settings you use, you'll see the changes straight away. For example: c89986f#diff-30fbbf9f392f23bc5a5b6d71102e6bd4R99

This change also fixes the debug/production problem, so now you can have whatever options you want for debug/production, including ensuring that composite processing is enabled for debug, meaning the pipeline will execute; so things like TypeScript, LESS, or whatever will work nicely in debug.

I haven't got around to looking at much more (i.e. source maps) but next up will be further defining the options such as in-memory (non persistent) #46 and also cache busting options, etc...

This will all be part of Smidge 2.0, now just need to find more time ;)

@Shazwazza Shazwazza added this to the 2.0 milestone Sep 27, 2016

Nice work, that's really cool stuff. I'll check it out.
Dealing with the existing source maps when combining already pre-processed files into a bundle file isn't actually too difficult, thanks to something called an index source map, which essentially lets you inline the existing .map files in a new .map file that you produce for the bundle file. I'd be willing to help with that for v2 if you need a hand, as it's something I eventually got working in a branch of NetPack.
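For reference, an index source map (from revision 3 of the source map format) wraps each input file's existing map in a "sections" array, where each section's "offset" records where that file starts in the bundle. An illustrative skeleton (file names, offsets, and the "mappings" strings are placeholders, not real output):

```json
{
  "version": 3,
  "file": "bundle.min.js",
  "sections": [
    {
      "offset": { "line": 0, "column": 0 },
      "map": { "version": 3, "file": "test1.min.js", "sources": ["test1.js"], "mappings": "AAAA" }
    },
    {
      "offset": { "line": 120, "column": 0 },
      "map": { "version": 3, "file": "test2.min.js", "sources": ["test2.js"], "mappings": "AAAA" }
    }
  ]
}
```

Because each section embeds an existing map unchanged, producing this for a bundle only requires tracking the line offset at which each cache file was appended.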


Sounds wonderful. I probably won't have time to look at the source map stuff for v2, so if it's something you could get in there, that'd be amazing.
