Quick test of KeyCDN’s Cache Enabler

Cache Enabler – WordPress Cache is the new page caching kid on the WordPress plugin block, by the Switzerland-based KeyCDN. It’s based in part on Cachify (which has a strong user-base in Germany) but seems less complex/flexible. What makes it unique though, is that it allows one to serve pages with WebP images instead of JPEGs to browsers that support WebP (which Safari, MS IE/Edge and Firefox currently don’t). To be able to do that, you’ll also need to install Optimus, an image optimization plugin that plugs into a freemium service by KeyCDN (you’ll need a premium account to convert to WebP though).
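For the curious: serving WebP this way boils down to content negotiation on the cached page itself. A minimal sketch of the idea (my assumption of how the variant gets picked, not Cache Enabler’s actual code; the file paths are hypothetical), checking the Accept request header and serving a separate cached HTML file that references WebP images:

    <?php
    // Sketch (assumption, not Cache Enabler's actual code): pick the cached
    // HTML variant based on whether the browser advertises WebP support.
    $accept = isset( $_SERVER['HTTP_ACCEPT'] ) ? $_SERVER['HTTP_ACCEPT'] : '';
    if ( false !== strpos( $accept, 'image/webp' ) ) {
        // browser supports WebP; serve the cached page referencing .webp images
        readfile( '/path/to/cache/index-webp.html' ); // hypothetical path
    } else {
        // fall back to the cached page with JPEG/PNG images
        readfile( '/path/to/cache/index.html' ); // hypothetical path
    }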
I did some tests with Cache Enabler and it works great together with Autoptimize out of the box, especially since the latest release (1.1.0), which also hooks into AO’s autoptimize_action_cachepurged action to clear Cache Enabler’s cache when AO’s cache gets purged (to avoid having pages in cache that refer to deleted autoptimized CSS/JS files).
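For other cache plugin developers: hooking into that action is a one-liner. A small sketch (autoptimize_action_cachepurged is AO’s real action name; the callback is a hypothetical stand-in for your plugin’s own purge routine):

    <?php
    // Sketch: purge a page cache whenever Autoptimize's cache is purged, so
    // no stale cached HTML keeps referencing deleted autoptimized CSS/JS.
    add_action( 'autoptimize_action_cachepurged', 'my_clear_page_cache' );

    function my_clear_page_cache() {
        // hypothetical: call your cache plugin's own purge routine here
    }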
Just not sure I agree with this text on the plugin’s settings page:

Avoid […] concatenation of your assets to benefit from parallelism of HTTP/2.

because, based on previous tests by smarter people than me, concatenation of assets can still make (a lot of) sense, even on HTTP/2 🙂

HTTP/2 & JS/CSS optimization: eBay’s approach

Quick follow-up to my previous post about HTTP/2 and Autoptimize; I just read “Packaging for Performance”, an interesting article on Performance Calendar by eBay’s Senthil Padmanabhan. Well worth the read, but the summary: their research confirms that bundling JS/CSS still has clear performance benefits, but they stopped bluntly aggregating everything into one file in order to improve cache-ability. This leaves them with:

  • one optimized JS and one optimized CSS file for the core libraries, used throughout eBay (high cache-ratio & payload)
  • one optimized JS and one optimized CSS file for the “domain constants”, used on specific eBay segments (medium cache-ratio & payload)
  • one optimized JS and one optimized CSS file for the “domain variables”, containing fast-changing code for specific segments (lowest cache-ratio & payload)

So yeah, I see a bright future for Autoptimization in the coming age of HTTP/2! :-)

HTTP/2, CSS/JS concatenation and Autoptimize

The web performance world is abuzz with HTTP/2, which should (among other improvements) do away with the latency that each separate HTTP request introduces, thus rendering aggregation of e.g. CSS & JS an anti-pattern. But there’s at least one in-depth, facts-and-figures-based article that is not ready to dismiss “packaging” just yet. So: testing, testing, testing!
Autoptimize will very likely have a “don’t aggregate, just minify”-option in the not too distant future, but the proof of the pudding will always be in the testing: sometimes it will be better to aggregate and minify as we do now, sometimes only minifying will be the better approach. And maybe (often?) a combination of the two will make the most sense: suppose you have a site on which 90% of pages share 90% of their JS code. In that case it will likely (testing, testing, testing!) help performance to aggregate & minify that shared 90% of the JS while excluding all other JS from aggregation (and only minifying it). Sounds like the new whitelist-filters in Autoptimize’s API will come in handy, no? 😉
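To illustrate what that could look like: a hedged sketch using AO’s JS whitelist-filter. The filter name is the one from Autoptimize’s API; the comma-separated return format and the file names are my assumptions for this example, so test before relying on it:

    <?php
    // Hedged sketch: only aggregate the JS that is shared by ~90% of pages;
    // any script not matching the whitelist is left out of the bundle.
    // File names are hypothetical; the comma-separated format is an assumption.
    add_filter( 'autoptimize_filter_js_whitelist', function( $whitelist ) {
        return 'jquery.js,shared-sitewide.js';
    } );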