Feature Request: Compatibility with gzip_static of Nginx #236

Open
sous-studio opened this issue Jul 11, 2014 · 11 comments

@sous-studio

Hey, Raam! :-)

As promised, here is the full list of my modifications. They were done for personal use, so apologies if they're a bit informal.

I've enabled gzip_static and found it could be a perfect addition to my favorite plugin, Quick Cache Pro. Compress everything on the server, perhaps via cron (although I believe an inotify-based approach would be more elegant; I haven't investigated it yet), disable on-the-fly compression, and you'll reduce CPU load. A rough sketch of such a cron script follows.
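
For illustration, here is a rough, untested sketch of what such a cron-side pre-compressor could look like. The cache directory path is my assumption (adjust it to wherever Quick Cache keeps its files), and it assumes cached pages end in `.html`, as described above:

<?php
// Rough, untested sketch of a cron-side pre-compressor for gzip_static.
// The cache directory path below is an assumption; point it at your Quick Cache cache directory.
$cache_dir = '/var/www/html/wp-content/cache/quick-cache/cache';

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($cache_dir, FilesystemIterator::SKIP_DOTS)
);
foreach($files as $file) // Create/refresh a `.gz` copy next to every cached `.html` file.
{
    if(!$file->isFile() || substr($file->getFilename(), -5) !== '.html')
        continue; // Only compress cached HTML files.

    $gz = $file->getPathname().'.gz';
    if(is_file($gz) && filemtime($gz) >= $file->getMTime())
        continue; // The compressed copy is already up to date.

    file_put_contents($gz, gzencode(file_get_contents($file->getPathname()), 9));
}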

What were the caveats?
Naturally, if we gzip all of QC's cache and QC then caches something new (or purges the cache, for that matter), the *.html files get updated, but Nginx is not aware of that: it only knows there are *.gz files sitting next to the *.html files, which it picks up and serves to an unsuspecting user, so stale content is displayed until the next cron run.

Aha, I thought: I should simply analyze QC's code, find the place responsible for cache clearing, and add an unlink of the `[filename].gz` there. Here is the original code:

quick-cache-pro.inc: @since 140422 First documented version.

                if(($_dir_file->isFile() || $_dir_file->isLink()) && strpos($_dir_file->getSubPathname(), '/') !== FALSE)
                    // Don't delete files in the immediate directory; e.g. `qc-advanced-cache` or `.htaccess`, etc.
                    // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                    if(!unlink($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to wipe file: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));
                    else $counter++; // Increment counter for each file we wipe.
                else if($_dir_file->isDir()){ // Directories are last in the iteration.
                    if(!rmdir($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to wipe dir: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));

This beautiful piece of code (in the wipe_cache function) we replace with the following:

                if(($_dir_file->isFile() || $_dir_file->isLink()) && strpos($_dir_file->getSubPathname(), '/') !== FALSE){
                    // The check passed; it's a file, so we also append `.gz` here and purge the compressed copy.
                    @unlink($_dir_file->getPathname() . '.gz');
                    // Don't delete files in the immediate directory; e.g. `qc-advanced-cache` or `.htaccess`, etc.
                    // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                    if(!unlink($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to wipe file: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));
                    else $counter++; // Increment counter for each file we wipe.

                } else if($_dir_file->isDir()){ // Directories are last in the iteration.
                    if(!rmdir($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to wipe dir: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));
                }

essentially fixing the problem. I do agree that this fix is quick-and-dirty; a more logical solution would be to check the result and throw an exception, like this:

                    if(!unlink($_dir_file->getPathname() . '.gz')) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to wipe file: `%1$s`.gz.', $this->text_domain), $_dir_file->getPathname()));

and perhaps increment $counter, or better yet, introduce a second counter variable that would tell us how many cache files and how many of their gzip counterparts were purged during "Clear Cache", but that is out of scope here.

So, just in case, I grepped quick-cache-pro.inc for other possible cache-clearing functions. To my deep surprise, I found 13 in total; minus one, which unlinks the advanced_cache_file (i.e., not relevant here).
That seems a bit much to me, but more on that later.

So, we see another block marked "@var $_dir_file \RecursiveDirectoryIterator For IDEs.", which we change in exactly the same way as the previous one:

                if(($_dir_file->isFile() || $_dir_file->isLink())){
                    @unlink($_dir_file->getPathname() . '.gz');
                    if(!unlink($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to also wipe file: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));
                    else $counter++; // Increment counter for each file we wipe.
                }   else if($_dir_file->isDir()){
                    if(!rmdir($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to also wipe dir: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));
                }

The next function in question is wipe_htmlc_cache. It is responsible for wiping out the HTML Compressor's cache files. I personally don't use it, since it often breaks my layout, but if you do use it, you should apply this fix:

                    if(($_dir_file->isFile() || $_dir_file->isLink()) && strpos($_dir_file->getSubpathname(), '/') !== FALSE){
                        @unlink($_dir_file->getPathname() . '.gz');
                        // Don't delete files in the immediate directory; e.g. `.htaccess`, or anything else that's special.

Even easier: no need to add braces to the if statement here, just add the unlink call. Another function comes up, clear_cache, which clears the cache for the current blog. As before, we just add one line:

                if(($_dir_file->isFile() || $_dir_file->isLink()) && strpos($_dir_file->getSubpathname(), '/') !== FALSE){
                    @unlink($_dir_file->getPathname() . '.gz');
                    // Don't delete files in the immediate directory; e.g. `qc-advanced-cache` or `.htaccess`, etc.

clear_htmlc_cache, another near-repeat so to speak, gets modified like this:

                    if(($_dir_file->isFile() || $_dir_file->isLink()) && strpos($_dir_file->getSubpathname(), '/') !== FALSE){
                            @unlink($_dir_file->getPathname() . '.gz');
                        // Don't delete files in the immediate directory; e.g. `.htaccess`, or anything else that's special.

Next, purge_cache, for the current blog. Folks, do we really need to repeat this code in every single function?

                if($_file->getMTime() < $max_age && strpos($_file->getSubpathname(), '/') !== FALSE)
                {
                    @unlink($_file->getPathname() . '.gz');
                    // Don't delete files in the immediate directory; e.g. `qc-advanced-cache` or `.htaccess`, etc.

Now comes auto_purge_post_cache; as the name says, it gets a similar fix:

                // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                @unlink($_file->getPathname() . '.gz');

                if(!unlink($_file->getPathname())) // Throw exception if unable to delete.

And for the home page, auto_purge_home_page_cache:

                // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                @unlink($_file->getPathname() . '.gz');

                if(!unlink($_file->getPathname())) // Throw exception if unable to delete.

Oh, and of course, auto_purge_posts_page_cache:

                // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                @unlink($_file->getPathname() . '.gz');

                if(!unlink($_file->getPathname())) // Throw exception if unable to delete.

auto_purge_author_page_cache (the author's page gets cached too, you know):

                    // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                    @unlink($_file->getPathname() . '.gz');

                    if(!unlink($_file->getPathname())) // Throw exception if unable to delete.

So do post terms, in auto_purge_post_terms_cache:

                    // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                    @unlink($_file->getPathname() . '.gz');

                    if(!unlink($_file->getPathname())) // Throw exception if unable to delete.

And, finally, the per-user cache, in auto_purge_user_cache:

                // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                @unlink($_file->getPathname() . '.gz');

                if(!unlink($_file->getPathname())) // Throw exception if unable to delete.

Now, to simplify these changes, I would propose something unified; perhaps, after adding a gzip_static option, something like this:

..

public function real_unlink($filename, $flag = 0){
    if(!unlink($filename)) // Throw exception if unable to delete.
        throw new \exception(sprintf(__('Unable to also wipe file: `%1$s`.', $this->text_domain), $filename));
    if($this->options['gzip_static'] && !$flag && is_file($filename.'.gz')) // Check if the gzip_static option is set.
        $this->real_unlink($filename.'.gz', 1); // Also wipe the compressed copy; `$flag` prevents further recursion.
}

Now, this is just quick, untested code, but I believe it's a step in the right direction.

Anyway, the cache is cleared, life is great! :-) But hold on: what happens after QC wipes the cache and rebuilds it? New pages will not have a gzipped version, so we'd have to wait for the next cron run, send uncompressed data, or force compression somehow... why?!

The solution is in /wp-content/advanced-cache.php.

advanced-cache.php: @since 140422 First documented version.

        $cache_file_tmp = $this->cache_file.'.'.uniqid('', TRUE).'.tmp'; // Cache creation is atomic; e.g. tmp file w/ rename.
        /*
         * This is NOT a 404, or it is 404 and the 404 cache file doesn't yet exist (so we need to create it).
         */
        $gzdata = gzencode(serialize(headers_list()).'<!--headers-->'.$cache, 1);
        file_put_contents($this->cache_file.'.gz', $gzdata);
        unset($gzdata);
        if($this->is_404) // This is a 404; let's create 404 cache file and symlink to it.
        {
            if(file_put_contents($cache_file_tmp, serialize(headers_list()).'<!--headers-->'.$cache) && rename($cache_file_tmp, $this->cache_file_404))
                if(symlink($this->cache_file_404, $this->cache_file)) // If this fails an exception will be thrown down below.
                    return $cache; // Return the newly built cache; with possible debug information also.

        } // NOT a 404; let's write a new cache file.
        else if(file_put_contents($cache_file_tmp, serialize(headers_list()).'<!--headers-->'.$cache) && rename($cache_file_tmp, $this->cache_file))
            return $cache; // Return the newly built cache; with possible debug information also.

Or, to be exact, in these added lines:

$gzdata = gzencode(serialize(headers_list()).'<!--headers-->'.$cache, 1);
file_put_contents($this->cache_file.'.gz', $gzdata);

We use gzencode() to compress the existing data (which is about to become a file), append a '.gz' extension, and write the result side by side with the regular cache file!

Actually, we could add some error control here, just in case the file cannot be written for some reason; maybe something like this:

$gzdata = gzencode(serialize(headers_list()).'<!--headers-->'.$cache, 1); // Compress the data.
if(file_put_contents($this->cache_file.'.gz', $gzdata) === FALSE) // Write the new file, and handle any error.
    throw new \exception(sprintf(__('Unable to write file: `%1$s`.', $this->text_domain), $this->cache_file.'.gz'));
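
If the gzip_static option sketched further down gets added, the write itself could also be gated behind it, so .gz copies are only produced when the feature is on. A minimal, untested sketch; the QUICK_CACHE_GZIP_STATIC constant, and the idea that the option is exposed to advanced-cache.php that way, are my assumptions rather than existing Quick Cache behavior:

// Untested sketch: only produce the `.gz` copy when the (hypothetical) gzip_static option is enabled.
// `QUICK_CACHE_GZIP_STATIC` is an assumed constant, compiled into advanced-cache.php from the new option.
if(defined('QUICK_CACHE_GZIP_STATIC') && QUICK_CACHE_GZIP_STATIC)
{
    $gzdata = gzencode(serialize(headers_list()).'<!--headers-->'.$cache, 1); // Compress the data.
    if(file_put_contents($this->cache_file.'.gz', $gzdata) === FALSE) // Write the `.gz` copy; report failures.
        throw new \exception(sprintf(__('Unable to write file: `%1$s`.', $this->text_domain), $this->cache_file.'.gz'));
    unset($gzdata); // Free the compressed copy; the regular cache file is still written below.
}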

So, what do you think?

@sous-studio
Author

Also, for future reference, and in case anyone is still forced to use Apache:

https://gist.github.com/bhollis/2200790

The method above should work with that configuration too.

@jaswrks

jaswrks commented Jul 11, 2014

@sous-studio Thanks for sharing 👍

@sous-studio
Author

You're welcome! :) Actually, it feels kind of weird; this is the first time I've shared my code with the community. It is a great responsibility...

@sous-studio
Author

If you decide to add this feature, here is an example of the options code to add.

menu-pages.php:

        echo '<div class="plugin-menu-page-panel">'."\n";

        echo '   <div class="plugin-menu-page-panel-heading">'."\n";
        echo '      <i class="fa fa-gears"></i> '.__('GZIP Static Compression', plugin()->text_domain)."\n";
        echo '   </div>'."\n";

        echo '   <div class="plugin-menu-page-panel-body clearfix">'."\n";
        echo '      <i class="fa fa-gears fa-4x" style="float:right; margin: 0 0 0 25px;"></i>'."\n";
        echo '      <h3>'.__('Use server-side GZIP compression of static files?', plugin()->text_domain).'</h3>'."\n";
        echo '      <p>'.__('<strong>Tip:</strong> For the situation when you\'d like to have Quick Cache store compressed copies of cached files, next to original files, for later use with something like gzip_static (Nginx). In Nginx, for this to be effective, you should define <code>gzip off; gzip_static on;</code>.', plugin()->text_domain).'</p>'."\n";
        echo '      <p><select name="'.esc_attr(__NAMESPACE__).'[save_options][gzip_static]">'."\n";
        echo '            <option value="0"'.selected(plugin()->options['gzip_static'], '0', FALSE).'>'.__('NO, don\'t store compressed copies of cached files.', plugin()->text_domain).'</option>'."\n";
        echo '            <option value="1"'.selected(plugin()->options['gzip_static'], '1', FALSE).'>'.__('Yes, store compressed copies of cached files.', plugin()->text_domain).'</option>'."\n";
        echo '         </select></p>'."\n";
        echo '   </div>'."\n";

        echo '</div>'."\n";

quick-cache-pro.pot:

msgid "GZIP Static Compression"
msgstr ""

msgid "Use server-side GZIP compression of static files?"
msgstr ""

msgid "<strong>Tip:</strong> For the situation when you'd like to have Quick Cache store compressed copies of cached files, next to original files, for later use with something like gzip_static (Nginx). In Nginx, for this to be effective, you should define <code>gzip off; gzip_static on;</code>."
msgstr ""

msgid "NO, don't store compressed copies of cached files."
msgstr ""

msgid "Yes, store compressed copies of cached files."
msgstr ""

quick-cache-pro.inc:

'gzip_static'                         => '0', // `0|1`.

The rest depends on whether or not you added the real_unlink function from above. If you did, you should just replace the old unlink code everywhere with the new, simplified call. E.g.:

$this->real_unlink($_dir_file->getPathname());
$counter++; // I believe that if an exception gets thrown, execution ceases, so if we get here the unlink succeeded and the counter can simply be incremented. Please correct me if I'm wrong.

instead of

                        @unlink($_dir_file->getPathname() . '.gz');
                    // Don't delete files in the immediate directory; e.g. `qc-advanced-cache` or `.htaccess`, etc.
                    // Actual `http|https/...` cache files are nested. Files in the immediate directory are for other purposes.
                    if(!unlink($_dir_file->getPathname())) // Throw exception if unable to delete.
                        throw new \exception(sprintf(__('Unable to wipe file: `%1$s`.', $this->text_domain), $_dir_file->getPathname()));
                    else $counter++; // Increment counter for each file we wipe.

Otherwise, you should add something like:

if($this->options['gzip_static'])
    @unlink($_dir_file->getPathname() . '.gz');

or

if($this->options['gzip_static'])
    @unlink($_file->getPathname() . '.gz');

or with an exception thrown on failure, if you like; see the sketch below.
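
For completeness, a combined sketch with the option check plus an exception on failure might look like this (untested; the is_file() guard is my addition, so a missing .gz copy doesn't trigger an error):

if($this->options['gzip_static'] && is_file($_file->getPathname().'.gz'))
    if(!unlink($_file->getPathname().'.gz')) // Throw exception if unable to delete the compressed copy.
        throw new \exception(sprintf(__('Unable to wipe file: `%1$s`.gz.', $this->text_domain), $_file->getPathname()));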

@jaswrks

jaswrks commented Jul 11, 2014

@sous-studio Thanks! Hey, I'd like to invite you to contact me through this form, should you be interested in joining our team. Please see: http://www.websharks-inc.com/bizdev/

Any further details about yourself would be great; or even just some links to your work would be nice! Mostly though, just so we can connect privately :-)

@sous-studio
Author

Hey! :-)

OK, I sent my details there.


@jaswrks

jaswrks commented Jul 11, 2014

Great, thanks! I will be in touch shortly :-)

@raamdev
Contributor

raamdev commented Jul 12, 2014

Hey @sous-studio,

Thanks so much for the feature request and all the code! I'll tag this issue as enhancement and review it as part of a future release cycle. :)

@sous-studio
Author

Hey, Raam, :-)

You're always welcome. In fact, I believe we could optimize QC's code much further and add features like a database cache, object cache, .MO cache, and the like. The only limit is imagination.

@raamdev
Copy link
Contributor

raamdev commented Jul 13, 2014

@sous-studio Thanks! Yes, I have many feature requests open here. If you have a few ideas for new Quick Cache features that are not already in GitHub, I'd love to see you add them in here so that I can start tracking those ideas.

You can post new feature requests here.

Also, in case you're interested in testing beta releases of Quick Cache before the official version comes out, please sign up to be a beta tester here. :)

@ethanpil

+1
