Support serving of pre-GZIP encoded files #2458
Comments
From @tugberkugurlu on Friday, July 25, 2014 1:24:16 PM 👍 for this one. |
From @nikmd23 on Wednesday, September 10, 2014 1:47:47 PM I'd also like to point out that this shouldn't be about "pre-gzipping", but rather pre-compressing in general, no matter what the encoding format is. For example, Microsoft and Google have both experimented with improved encoding schemes. What this means is that, in reality, I might want to pre-compress a file into multiple formats and have the proper one selected based on the Accept-Encoding header. I think of this as being very analogous to keeping images in multiple formats and serving the best one the client supports. |
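A hypothetical helper (not part of StaticFiles, names are mine) illustrating that selection idea: given the client's Accept-Encoding header and a file path, pick the best pre-compressed sibling to serve.

using System;
using System.IO;

static class PreCompressedVariant
{
    public static string Pick(string acceptEncoding, string basePath)
    {
        // Preference order is an assumption: Brotli first, then gzip, then the original.
        var candidates = new (string Encoding, string Extension)[]
        {
            ("br", ".br"),
            ("gzip", ".gz"),
        };

        foreach (var (encoding, extension) in candidates)
        {
            if (acceptEncoding.Contains(encoding, StringComparison.OrdinalIgnoreCase)
                && File.Exists(basePath + extension))
            {
                // The caller must also set the matching Content-Encoding response header.
                return basePath + extension;
            }
        }

        return basePath; // no acceptable pre-compressed copy; serve the original
    }
}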
From @resnyanskiy on Monday, January 18, 2016 6:37:35 PM I found a simple workaround based on URL redirects, which HTTP supports natively: class Startup
{
    // Assumes the usual using directives (Microsoft.AspNetCore.Builder, Microsoft.AspNetCore.Http,
    // Microsoft.Net.Http.Headers, System.IO, System.Net) and an IsDevelopment flag set elsewhere.
    private StaticFileOptions StaticFileOptions
    {
        get
        {
            return new StaticFileOptions
            {
                OnPrepareResponse = OnPrepareResponse
            };
        }
    }

    private void OnPrepareResponse(StaticFileResponseContext context)
    {
        var file = context.File;
        var request = context.Context.Request;
        var response = context.Context.Response;

        // Requests that already target a .gz file just need the correct Content-Encoding header.
        if (file.Name.EndsWith(".gz"))
        {
            response.Headers[HeaderNames.ContentEncoding] = "gzip";
            return;
        }

        if (file.Name.IndexOf(".min.", StringComparison.OrdinalIgnoreCase) != -1)
        {
            var requestPath = request.Path.Value;
            var filePath = file.PhysicalPath;

            if (IsDevelopment)
            {
                // In development, redirect to the unminified file when one exists.
                if (File.Exists(filePath.Replace(".min.", ".")))
                {
                    response.StatusCode = (int)HttpStatusCode.TemporaryRedirect;
                    response.Headers[HeaderNames.Location] = requestPath.Replace(".min.", ".");
                }
            }
            else
            {
                // In production, redirect to the pre-gzipped file when the client accepts gzip.
                var acceptEncoding = (string)request.Headers[HeaderNames.AcceptEncoding];

                if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) != -1)
                {
                    if (File.Exists(filePath + ".gz"))
                    {
                        response.StatusCode = (int)HttpStatusCode.MovedPermanently;
                        response.Headers[HeaderNames.Location] = requestPath + ".gz";
                    }
                }
            }
        }
    }

    public void Configure(IApplicationBuilder application)
    {
        application
            .UseDefaultFiles()
            .UseStaticFiles(StaticFileOptions);
    }
} I used Wikipedia as a reference. This approach also makes it possible to use non-minified files in the development environment without changing links on the client side. I found that the current implementation of 'aspnet/StaticFiles' lacks two features:
@davidfowl fyi. |
From @nikmd23 on Monday, January 18, 2016 7:30:14 PM In support of my comment above, both Firefox and Chrome will be shipping support for Brotli compression soon. |
From @jods4 on Monday, February 22, 2016 8:44:29 AM 👍 Actually, even a good story for serving gzipped static files (with a cache, of course, since it's static compression) is currently lacking. I'm saying this in the "conventional", "IIS-kind" of way, as opposed to the build-time gzip generation. For those coming here from Google with dynamic content compression needs, you might try this gist: |
From @JohannesRudolph on Tuesday, July 5, 2016 1:33:54 AM Would it be beyond the scope of this to suggest that StaticFiles should optionally allow cached "on-demand" compression of files? (I.e. gzip on first request, put the result in a file-system cache, and serve subsequent gzip requests from there.) |
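Something along those lines can be sketched as a small middleware registered before StaticFiles (hypothetical names; no locking or cache invalidation is attempted here). It only creates the cached .gz copy; serving it with the right Content-Encoding still relies on one of the approaches discussed elsewhere in this thread.

using System.IO;
using System.IO.Compression;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;

public static class OnDemandGzipExtensions
{
    // Register before UseStaticFiles so the .gz copy exists by the time the file is served.
    public static IApplicationBuilder UseOnDemandGzip(this IApplicationBuilder app, IHostingEnvironment env)
    {
        return app.Use(async (context, next) =>
        {
            var path = context.Request.Path.Value;
            if (path != null && (path.EndsWith(".js") || path.EndsWith(".css")))
            {
                var physical = Path.Combine(env.WebRootPath, path.TrimStart('/').Replace('/', Path.DirectorySeparatorChar));
                var gzPath = physical + ".gz";
                if (File.Exists(physical) && !File.Exists(gzPath))
                {
                    // Compress once and persist next to the original; later requests reuse the cached copy.
                    using (var source = File.OpenRead(physical))
                    using (var target = File.Create(gzPath))
                    using (var gzip = new GZipStream(target, CompressionLevel.Optimal))
                    {
                        await source.CopyToAsync(gzip);
                    }
                }
            }
            await next();
        });
    }
}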
From @twilliamsgsnetx on Friday, July 22, 2016 4:39:03 AM Really looking forward to being able to do this. I can go through all the trouble of bundling, minifying, tree shaking and gzipping to get things down to an absolute minimum size... but I can't serve those .js.gz files, and that's problematic. I'm also struggling to get IIS to gzip itself as well, but that's for another area entirely. |
From @neyromant on Friday, July 29, 2016 10:26:24 PM |
From @dfaivre on Wednesday, February 15, 2017 4:34:00 AM With Angular (2) creating *.gz files when doing --prod builds, it would be great to have this out of the box. |
From @RehanSaeed on Wednesday, February 15, 2017 4:36:34 AM Other related use cases are serving pre-brotli compressed (.br) files and also serving WebP (.webp) files instead of PNG/JPG/etc. |
From @Tratcher on Friday, February 17, 2017 5:54:44 AM @JohannesRudolph you should now be able to combine ResponseCaching, ResponseCompression, and StaticFiles to achieve dynamic compression and caching of static files. @JunTaoLuo this would be a good combo to test. |
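For anyone looking for the shape of that combination, here is a minimal sketch (assumed configuration, not the official sample): ResponseCaching wraps ResponseCompression, which wraps StaticFiles, so compressed static responses can be cached and replayed. The middleware order and the Cache-Control value are illustrative.

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddResponseCaching();
        services.AddResponseCompression();
    }

    public void Configure(IApplicationBuilder app)
    {
        // Caching outermost so it can store and replay the compressed response,
        // compression next, static files last.
        app.UseResponseCaching();
        app.UseResponseCompression();
        app.UseStaticFiles(new StaticFileOptions
        {
            OnPrepareResponse = ctx =>
            {
                // ResponseCaching only stores responses that are explicitly cacheable.
                ctx.Context.Response.Headers["Cache-Control"] = "public,max-age=600";
            }
        });
    }
}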
From @Tratcher on Friday, February 17, 2017 5:59:33 AM Can folks clarify if they expect the original url to contain the compression extension (e.g. ".gz")? Some of the above samples do and some don't. I assume content negotiation based on accept headers would be the more general case. |
From @JohannesRudolph on Friday, February 17, 2017 6:03:22 AM Great! Content negotiation is, from my experience, the far more often-used approach.
|
From @Tratcher on Thursday, March 23, 2017 1:29:33 PM @JunTaoLuo can you do a sample? |
From @firecube on Sunday, April 2, 2017 5:22:57 AM A small example would be really appreciated here if possible. Thanks in advance. |
From @JunTaoLuo on Sunday, April 2, 2017 3:59:35 PM You can take a look at the sample I have created at https://github.com/JunTaoLuo/MiddlewaresSample which uses ResponseCaching, ResponseCompression and StaticFiles to create, cache and serve different representations of the same resource. Here's the sample output where I made 6 requests to
|
From @Tratcher on Sunday, April 2, 2017 8:24:55 PM Hmm, no logs for the compression middleware... |
From @joeaudette on Sunday, October 22, 2017 1:55:08 PM I came across this because I've started using webpack to pre-gzip my js and css. The solution I came up with was implementing a custom IFileProvider based on the code from CompositeFileProvider. I set it up like this:
It uses a convention: if the requested file ends with .min.js or .min.css, it looks for the same file name with .gz on the end and, if found, returns that. Later I added logic to try to create the .gz file if it does not exist, returning it on success and falling back to the original file otherwise. It seems to be working well; I would appreciate any feedback on the implementation, found here: My solution uses standard .min.js and .min.css URLs, but the .gz file is served; I'm not using .gz in my URLs. One known issue is that this solution bypasses content negotiation and just gives you gzip whether you like it or not, but that's not really a big issue in my view for real browsers. |
From @Tratcher on Sunday, October 22, 2017 6:59:05 PM That approach sounds like it would mess up the content-length and etag headers. |
From @joeaudette on Monday, October 23, 2017 4:35:28 AM @Tratcher could you elaborate on how that would get messed up? The IFileProvider is passing up the IFileInfo about the gz file so it would have the correct content length of the gz file, isn't that what it should have? We still have an opportunity to tweak the headers in OnPrepareResponse if there is something messed up, but I'm trying to understand what would be messed up and why. |
From @Tratcher on Monday, October 23, 2017 6:48:18 AM Content-Length and gzip are rarely used together because the implications are really confusing. I'll need to verify, but I think the Content-Length is supposed to be the uncompressed length rather than the compressed length. StaticFiles also uses the length to calculate the ETag, so your pre-compressed file will have a different ETag than the uncompressed version, even if the contents are the same. StaticFiles also has built-in support for Range headers, which refer to offsets in the uncompressed file; this won't work with compressed files. Managing pre-compression in the file provider is inadequate; it needs to be built into StaticFiles to make the above scenarios work correctly (or at least be bypassed correctly). |
From @joeaudette on Monday, October 23, 2017 6:58:57 AM @Tratcher that makes sense for dynamic compression, but the issue here is pre-compressed static files, and I think we would want Content-Length for any static file, wouldn't we? And it should be used to calculate the ETag, I would think. In my scenario the .gz file is created by the webpack build process; while my FileProvider can be configured to generate the .gz file, that is a secondary concern. Mainly I am serving static files that are already gzipped. In my scenario, without content negotiation, the browser is only going to get the already-compressed static file; the uncompressed file would not be returned unless the compressed file does not exist and could not be created. If the source file is newer than the compressed file, I regenerate the compressed file. |
From @joeaudette on Monday, October 23, 2017 7:12:19 AM @Tratcher so are you saying that for my pre-gzipped files I should remove the Content-Length and Accept-Ranges headers? |
From @Tratcher on Tuesday, October 24, 2017 1:52:30 PM The edit on that answer is really telling.. |
From @joeaudette on Tuesday, October 24, 2017 2:08:35 PM Yes, I saw that. You'll notice I said it would be logical, not that that's how it is done. It sounds like in practice dynamic compression does it wrong; they should use Transfer-Encoding, but they don't because of browsers :-D Nevertheless, I think it will be OK to keep the Accept-Ranges header. |
From @herecydev on Monday, December 4, 2017 1:30:06 AM Adding my implementation into the mix. Use case: client calls

public class CompressionFileProvider : IFileProvider
{
    private readonly IFileProvider _fileProvider;
    private readonly IHttpContextAccessor _httpContextAccessor;
    private readonly string _root;

    public CompressionFileProvider(IHostingEnvironment hostingEnvironment, IHttpContextAccessor httpContextAccessor)
    {
        _fileProvider = hostingEnvironment.WebRootFileProvider;
        _httpContextAccessor = httpContextAccessor;
        _root = hostingEnvironment.WebRootPath;
    }

    public IDirectoryContents GetDirectoryContents(string subpath)
        => _fileProvider.GetDirectoryContents(subpath);

    public IFileInfo GetFileInfo(string subpath)
    {
        // Pick the pre-compressed variant (.br preferred over .gz) that matches the client's Accept-Encoding.
        if (_httpContextAccessor.HttpContext.Request.Headers.TryGetValue("Accept-Encoding", out var encodings))
        {
            if (encodings.Any(encoding => encoding.Contains("br")))
            {
                var compressedEncoding = _fileProvider.GetFileInfo(subpath + ".br");
                if (compressedEncoding.Exists)
                    return compressedEncoding;
            }
            if (encodings.Any(encoding => encoding.Contains("gzip")))
            {
                var compressedEncoding = _fileProvider.GetFileInfo(subpath + ".gz");
                if (compressedEncoding.Exists)
                    return compressedEncoding;
            }
        }
        return _fileProvider.GetFileInfo(subpath);
    }

    public IChangeToken Watch(string filter)
        => _fileProvider.Watch(filter);
}

public static class ApplicationBuilderExtensions
{
    public static IApplicationBuilder UseCompressedStaticFiles(this IApplicationBuilder applicationBuilder, IHostingEnvironment hostingEnvironment, IHttpContextAccessor httpContextAccessor)
    {
        return applicationBuilder.UseStaticFiles(new StaticFileOptions
        {
            FileProvider = new CompressionFileProvider(hostingEnvironment, httpContextAccessor),
            OnPrepareResponse = ctx =>
            {
                // The file provider swapped in the compressed file, so set the matching Content-Encoding header.
                var headers = ctx.Context.Response.Headers;
                if (ctx.File.Name.EndsWith(".br"))
                    headers.Add("Content-Encoding", "br");
                else if (ctx.File.Name.EndsWith(".gz"))
                    headers.Add("Content-Encoding", "gzip");
            }
        });
    }
} |
I'm trying to use the However, the |
StaticFiles doesn't set the Content-Encoding header; you have to do it yourself in the OnPrepareResponse event. ResponseCompression can't tell the response is already compressed unless you set that header. |
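A minimal sketch of that, assuming it sits in Configure with an IApplicationBuilder named app (the Vary header is my addition, useful when the same URL can yield different encodings):

app.UseStaticFiles(new StaticFileOptions
{
    OnPrepareResponse = ctx =>
    {
        var headers = ctx.Context.Response.Headers;
        if (ctx.File.Name.EndsWith(".gz"))
        {
            // Tell ResponseCompression (and the client) the body is already gzip-encoded,
            // and tell caches that the representation depends on Accept-Encoding.
            headers["Content-Encoding"] = "gzip";
            headers["Vary"] = "Accept-Encoding";
        }
    }
});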
As requested by @DamianEdwards, I am adding a note to remind him/others of the chain resulting from https://twitter.com/DamianEdwards/status/980814294756425728, which brings up the CRIME and BREACH exploits in HTTPS when using gzip compression on dynamic pages (any page containing user content). In the meantime: because of these security issues, the official stance is to disable compression for HTTPS by default. In my opinion, if you know you only put user-modified variables in text/html pages, you could use dynamic compression and exclude HTML pages based on MIME type. "As-is" rules apply...

services.AddResponseCompression(options => {
options.EnableForHttps = true;
/* Included defaults: "text/plain", "text/css", "application/javascript",
"text/html", "application/xml", "text/xml", "application/json", "text/json" */
options.MimeTypes = ResponseCompressionDefaults.MimeTypes
.Where(mime => !mime.Contains("text/html"))
.Concat(new[] {
"application/font-woff",
"font/woff2"
});
}); /cc @blowdart |
This could be enhanced by supporting compression in the static files middleware. |
@deedubb Are we not protected as the CSRF token is randomized? (see https://blog.qualys.com/ssllabs/2013/08/07/defending-against-the-breach-attack) |
I think the eventual fix here would be to introduce a new request feature that the static file middleware can set when serving files that indicates the response is from static content, thus allowing the compression middleware to safely compress responses only when the feature indicates it was static in nature. Of course it's still possible for another middleware in-between to change the content and introduce issues again. If we care enough about that we could have the feature actually poison the response headers and body (e.g. via wrapping) such that they can't be modified. Also, technically it isn't about what's static but what contains user manipulable content, so the feature should likely be built around that premise instead, allowing any response (including dynamic responses from things like MVC actions) to declare that they're free from user manipulation and thus safe to compress over HTTPS. |
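As a purely hypothetical illustration of that feature idea (no such type exists in ASP.NET Core today), it might look something like this, with the producing middleware setting it and the compression middleware checking it before compressing over HTTPS:

public interface IResponseCompressionSafetyFeature
{
    // True when the response body contains no user-manipulable content, so compressing it
    // over HTTPS does not create a BREACH/CRIME-style compression oracle.
    bool IsSafeToCompress { get; }
}

// e.g. the static file middleware (or any endpoint) would set the feature before writing the body:
// context.Features.Set<IResponseCompressionSafetyFeature>(someSafeMarker);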
I want to toss in my two cents here. From about half of the discussion that I read, people are trying to generate and cache the compressed files at application runtime and go from there. To me that is wasteful. Using Gulp, I generate my pre-compressed static files at build time and publish those, then serve the correct one using my PreCompressedStaticFiles middleware. This way I don't have to deal with server-side generation, caching, etc. That is how I serve the CSS and JS files on my site.

gulpfile.js
/// <binding AfterBuild="global:compile" Clean="global:clean-all" ProjectOpened="global:watch" />
const rootNode = "./node_modules",
rootResources = "./Resources",
rootWww = "./wwwroot",
sourcesCss = [
`${rootNode}/normalize.css/normalize.css`,
`${rootResources}/Styles/**/*.css`
],
sourcesJs = [
`${rootResources}/Scripts/**/*.js`
],
sourcesLess = [
`${rootResources}/Styles/**/*.less`
],
targetCss = "styles.min.css",
targetJs = "scripts.min.js";
const del = require("del"),
gulp = require("gulp"),
gulpBrotli = require("gulp-brotli"),
gulpCleanCss = require("gulp-clean-css"),
gulpConcat = require("gulp-concat"),
gulpGzip = require("gulp-gzip"),
gulpLess = require("gulp-less"),
gulpRename = require("gulp-rename"),
gulpTerser = require("gulp-terser-js"),
rollup = require("rollup");
// ========================================================================
// LESS to CSS
// ========================================================================
/**
* Delete CSS files. Excludes all Node CSS files.
*/
gulp.task("css:clean", () => del([
`${rootWww}/**/*.css`,
`!${rootNode}/**/*.css`
]));
/**
* Delete all CSS files. Excludes all Node CSS files.
*/
gulp.task("css:clean-all", () => del([
`./**/*.css`,
`!${rootNode}/**/*.css`
]));
/**
* Compress minified CSS with Brotli.
*/
gulp.task("css:compress-brotli", () => gulp.src(`${rootWww}/${targetCss}`)
.pipe(gulpBrotli())
.pipe(gulp.dest(rootWww)));
/**
* Compress minified CSS with GZIP.
*/
gulp.task("css:compress-gzip", () => gulp.src(`${rootWww}/${targetCss}`)
.pipe(gulpGzip({
gzipOptions: {
level: 9
}
}))
.pipe(gulp.dest(rootWww)));
/**
* Concatenate all CSS files into a single CSS file.
*/
gulp.task("css:concatenate", () => gulp.src(sourcesCss)
.pipe(gulpConcat("concatenated.css"))
.pipe(gulp.dest(rootWww)));
/**
* Minify the concatenated CSS file.
*/
gulp.task("css:minify", () => gulp.src(`${rootWww}/concatenated.css`)
.pipe(gulpCleanCss({
level: {
1: {
specialComments: 0
}
}
}))
.pipe(gulpRename(targetCss))
.pipe(gulp.dest(rootWww)));
/**
* Delete all CSS files after minification.
*/
gulp.task("css:minify-clean", () => del([
`${rootWww}/**/*.css`,
`!${rootWww}/${targetCss}`
]));
/**
* Compile all CSS files.
*/
gulp.task("css:compile", gulp.series(
"css:concatenate",
"css:minify",
"css:minify-clean",
"css:compress-brotli",
"css:compress-gzip"
));
// ========================================================================
// LESS
// ========================================================================
/**
* Compile all LESS files.
*/
gulp.task("less:compile", () => gulp.src(sourcesLess, { base: "./" })
.pipe(gulpLess())
.pipe(gulp.dest("./")));
// ========================================================================
// JavaScript
// ========================================================================
/**
* Delete JS files. Excludes this gulpfile.js and
* all Node JS files.
*/
gulp.task("js:clean", () => del([
`${rootWww}/**/*.js`,
`!${rootNode}/**/*.js`
]));
/**
* Delete all JS files. Excludes this gulpfile.js and
* all Node JS files.
*/
gulp.task("js:clean-all", () => del([
`./**/*.js`,
"!./gulpfile.js",
`!${rootNode}/**/*.js`
]));
/**
* Compress minified JS with Brotli.
*/
gulp.task("js:compress-brotli", () => gulp.src(`${rootWww}/${targetJs}`)
.pipe(gulpBrotli())
.pipe(gulp.dest(rootWww)));
/**
* Compress minified JS with GZIP.
*/
gulp.task("js:compress-gzip", () => gulp.src(`${rootWww}/${targetJs}`)
.pipe(gulpGzip({
gzipOptions: {
level: 9
}
}))
.pipe(gulp.dest(rootWww)));
/**
* Delete all JS files after minification.
*/
gulp.task("js:minify-clean", () => del([
`${rootWww}/**/*.js`,
`!${rootWww}/${targetJs}`
]));
/**
* Minify the rolledup JS file.
*/
gulp.task("js:minify", () => gulp.src(`${rootWww}/scripts.rolledup.js`)
.pipe(gulpTerser({
ecma: 9
}))
.pipe(gulpRename(targetJs))
.pipe(gulp.dest(rootWww)));
/**
* Rollup the JS files.
*/
gulp.task("js:rollup", () => rollup.rollup({
input: `${rootResources}/Scripts/Default.js`
}).then(bundle => bundle.write({
file: `${rootWww}/scripts.rolledup.js`,
format: "iife",
sourcemap: false
})));
/**
* Compile all JS files.
*/
gulp.task("js:compile", gulp.series(
"js:rollup",
"js:minify",
"js:minify-clean",
"js:compress-brotli",
"js:compress-gzip"
));
// ========================================================================
// Global
// ========================================================================
/**
* Delete all CSS and JS files, with exclusions.
*/
gulp.task("global:clean-all", gulp.parallel(
"css:clean-all",
"js:clean-all"
));
/**
* Compile all CSS and JS files.
*/
gulp.task("global:compile", gulp.parallel(
gulp.series(
"less:compile",
"css:clean",
"css:compile"
),
gulp.series(
"js:clean",
"js:compile"
)
));
/**
* Watch the file system for CSS, JS, or LESS file changes.
*/
gulp.task("global:watch", () => {
gulp.watch(sourcesCss, gulp.series(
"css:clean",
"css:compile"
));
gulp.watch(sourcesJs, gulp.series(
"js:clean",
"js:compile"
));
gulp.watch(sourcesLess, gulp.series(
"less:compile"
));
});

package.json
{
"devDependencies": {
"del": "6.0.0",
"gulp": "4.0.2",
"gulp-brotli": "3.0.0",
"gulp-clean-css": "4.3.0",
"gulp-concat": "2.6.1",
"gulp-gzip": "1.4.2",
"gulp-less": "5.0.0",
"gulp-rename": "2.0.0",
"gulp-terser-js": "5.2.2",
"normalize.css": "8.0.1",
"rollup": "2.53.2"
},
"name": "asp.net",
"private": true,
"version": "1.0.0"
}

Startup.cs
public sealed class Startup {
public void Configure(
IApplicationBuilder app) {
app.UsePreCompressedStaticFiles()
.UseStaticFiles();
}
}

web.config
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.webServer>
<httpCompression>
<staticTypes>
<add mimeType="text/css" enabled="false"/>
<add mimeType="text/javascript" enabled="false"/>
</staticTypes>
</httpCompression>
</system.webServer>
</configuration> |
@JunTaoLuo is a copy of https://github.com/JunTaoLuo/MiddlewaresSample still available somewhere? And are there any plans to support this natively in Kestrel? Having pre-compressed .br and .gz files and serving them, instead of recompressing the original file when a request with a matching Accept-Encoding header comes in, should not be an unusual use case. |
Natively in Kestrel? No. Via existing middleware in ASP.NET Core? Maybe in the future. |
I have created a NuGet package, https://github.com/AnderssonPeter/CompressedStaticFiles, that handles both gzip/brotli files and alternative image formats. It's based on neyromant's middleware! |
Fixed as part of #55558 |
Cool! Any chance you could give us a few hints on how to use the new functionality to serve pre-compressed content? |
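Assuming the fix being referenced is the .NET 9 static asset delivery endpoints (MapStaticAssets), which I'm inferring rather than quoting from the PR, the basic usage would be roughly:

// Program.cs sketch: MapStaticAssets serves the pre-compressed (gzip/brotli) copies produced
// at build/publish time, negotiating on Accept-Encoding and emitting matching ETag headers.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapStaticAssets();

app.Run();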
From @DamianEdwards on Friday, July 25, 2014 11:42:51 AM
Pre-GZIPping files is seemingly becoming more popular. This involves running a tool ahead of deployment that creates GZIPped copies of suitable files in the site, e.g. site.js => site.js.gzip. Then the file serving aspect of the web server will serve the GZIPped file when appropriate.
Copied from original issue: aspnet/StaticFiles#7
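For completeness, the "tool ahead of deployment" step can be as simple as a few lines (a sketch; the paths, file selection, and the .gz vs .gzip extension convention are assumptions):

using System.IO;
using System.IO.Compression;
using System.Linq;

// Create gzipped siblings of the site's JS and CSS files so the server can serve them as-is.
foreach (var file in Directory.EnumerateFiles("wwwroot", "*.*", SearchOption.AllDirectories)
                              .Where(f => f.EndsWith(".js") || f.EndsWith(".css")))
{
    using var source = File.OpenRead(file);
    using var target = File.Create(file + ".gz");
    using var gzip = new GZipStream(target, CompressionLevel.Optimal);
    source.CopyTo(gzip);
}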