
bin: Make `httpcompression` prevent caching

In some cases, intermediate proxies ignore the `Content-Encoding` header
altogether.

Sending `Cache-Control: no-cache` as a request header solves this problem
by forcing validation of the resource in the intermediate proxies,
thereby ensuring that every request is fetched from the origin server.

Ref. alrra/dotfiles#2. Closes #308.
1 parent 5cdfccf commit e20538715d3daf0e11ab1482abda2d0063833c99 @alrra alrra committed with Dec 7, 2013
Showing with 20 additions and 6 deletions.
  1. +20 −6 bin/httpcompression
@@ -2,22 +2,36 @@
# Test if HTTP compression (RFC 2616 + SDCH) is enabled for a given URL
-declare -r hUA="Mozilla/5.0 Gecko"
+# Useful Links:
+#
+# - HTTP/1.1 (RFC 2616) - Content Codings:
+# http://tools.ietf.org/html/rfc2616#section-3.5
+#
+# - SDCH Specification:
+# http://www.blogs.zeenor.com/wp-content/uploads/2011/01/Shared_Dictionary_Compression_over_HTTP.pdf
+
+# Usage:
+#
+# httpcompression URL
+
declare -r hAE="Accept-Encoding: gzip, deflate, sdch"
+declare -r hCC="Cache-Control: no-cache"
+declare -r hUA="Mozilla/5.0 Gecko"
declare -r maxConTime=15
declare -r maxTime=30
declare availDicts="" dict="" dictClientID="" dicts="" headers="" i="" \
indent="" url="" encoding="" urlHeaders=""
headers="$( curl --connect-timeout $maxConTime \
- -A "$hUA" `# Send a fake UA string for sites
- # that sniff it instead of using
- # the Accept-Encoding header` \
+ -A "$hUA" `# Send a fake UA string for sites that sniff it
+ # instead of using the Accept-Encoding header` \
-D - `# Get response headers` \
-H "$hAE" \
- -L `# If the page was moved to a different
- # location, redo the request` \
+ -H "$hCC" `# Prevent intermediate proxies from caching the
+ # response` \
+ -L `# If the page was moved to a different location,
+ # redo the request` \
-m $maxTime \
-s `# Don\'t show the progress meter` \
-S `# Show error messages` \
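The effect of the change above can be sketched outside the script: the same request headers, assembled into a single `curl` invocation. This is a minimal illustration, not the script itself; the URL `http://example.com` and the habit of printing the command rather than running it are assumptions made here so the sketch works without network access.

```shell
#!/bin/sh
# Headers matching the ones httpcompression now sends
hAE="Accept-Encoding: gzip, deflate, sdch"
hCC="Cache-Control: no-cache"   # forces revalidation in intermediate proxies
hUA="Mozilla/5.0 Gecko"

# Build the command string (printed, not executed, for illustration)
cmd="curl -A \"$hUA\" -H \"$hAE\" -H \"$hCC\" -D - -L -m 30 -sS http://example.com"
printf '%s\n' "$cmd"
```

Because `Cache-Control: no-cache` is sent on the request, a conforming proxy must revalidate with the origin server before serving a stored response, so the `Content-Encoding` header seen by the script reflects what the origin actually sends.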
