I use the free tier of Cloudflare, and all recommendations assume only free-tier features are available.
Why cache a statically generated blog?
My blog is a statically generated website served from GitHub Pages. As GitHub doesn’t make it easy to set up my own domain (at least on the free plan), I needed some kind of proxy that:
- can serve page from my domain,
- will provide valid certificate for HTTPS.
I know HTTPS for a static site may seem like overkill, but to keep it performant with HTTP/2 and HTTP/3 - you need it.
Obviously, a static site is pretty fast even with the default Cloudflare configuration, but as I’m doing it for fun… let’s squeeze it 😄
What can Cloudflare offer us?
Cloudflare offers many options that can impact both the security and the performance of a website. I will go through the options I usually configure. Some of them can be safely turned on for any website; others make more sense for a mostly static page.
Please don’t turn on everything at the same time.
Security goes first 😄
Always Use HTTPS
Turn it ON.
I want it, you want it. It ensures all unencrypted requests get redirected to HTTPS. The same could be achieved with a Page Rule, but as you can only have 3 rules on the free plan - this saves you one.
HTTPS is crucial for other optimization options to work.
HTTP Strict Transport Security (HSTS)
HTTP Strict Transport Security (HSTS, RFC 6797) is a header that allows a website to specify and enforce a security policy in client web browsers. It has no impact on performance, but improves security and earns you a better grade in the SSL Test 😄
One critical consideration when using HSTS on Cloudflare is that once HSTS is turned on, your website must continue to have a valid HTTPS configuration conforming with the HSTS header to avoid making the website inaccessible to users!
I have it enabled.
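For reference, a typical HSTS response header looks like the line below. The exact max-age and flags depend on what you pick in the Cloudflare UI; 6 months (15552000 seconds) with subdomains included is shown here only as an example:

```
Strict-Transport-Security: max-age=15552000; includeSubDomains
```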
Minimum TLS Version
The default is TLS 1.0, which provides the best compatibility. I wanted to boost my score in the SSL Test, so I bumped it to 1.2. Most modern browsers support it.
We already have a rule redirecting everything to HTTPS, so this one is a no-op, but it can be left enabled.
TLS 1.3
Turn it ON.
This one is important. TLS 1.3 provides a much faster initial connection and 0-RTT (more on that later).
Speed \ Optimization \ Content Optimization
Turn it ON. It does what it says.
It optimizes loading times for custom fonts. I don’t have any, so I don’t care. If you use custom fonts, you can try it, but measure whether it makes any difference.
Turn it ON.
Play with it :)
On one page it resulted in faster load times, but on my blog it actually made load times higher.
Play with it :)
It minifies text files by removing whitespace, etc. As it’s done on the fly, it can slow down the first page load; follow-up requests should be served from Cloudflare’s cache.
In my case, I minify all the files during content generation, so I just don’t need it.
Caching \ Configuration
On dynamically generated pages you can play with the two other options to avoid caching dynamic content.
For a static page, Standard is fine.
Browser Cache TTL
It’s the first place where we can enforce strong caching of our page in the user’s browser. I don’t like this option because it treats all file types the same.
It won’t impact your page performance, but it allows Cloudflare to share some of your page access statistics with Bing (which shares data with DuckDuckGo too).
M$ calls this feature IndexNow.
Caching \ Cache Rules
This is where the game starts!
Most recommendations advise configuring a Page Rule and playing around with the Cache Level or Browser Cache TTL options. The problem is that in the free version you can only have 3 rules, and I already use one for the redirection from HTTP to HTTPS.
Another con of Page Rules is the way wildcard matching works there. You can use a star sign (*) to match multiple URLs, but to create rules matching image files like JPG, JPEG, PNG and WEBP, I would need 4 rules. To handle CSS/JS and more, I would need even more. Dead end.
What I wanted to achieve: my page doesn’t change too often, GitHub Pages sets the cache TTL for images to only 10 days, and it doesn’t set any cache headers for HTML/JS/CSS files at all. GitHub is quite a reliable provider, but to improve worldwide load times it’s nice to spread the cache around the world and avoid requests to the origin at all costs. Pages should be cached in Cloudflare and served directly from there.
My target would be to:
- cache pages and static files in Cloudflare for 1 day,
- set browser cache for:
- HTML pages to 1 day
- CSS/JS to maybe 10-30 days (they don’t change frequently, and each build produces new files)
- images of all sorts to 3-6 months (they don’t change at all, but are relatively big, and it’s good to serve them from edge locations)
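For orientation, those targets map to Cache-Control max-age values in seconds. The conversion below is my own arithmetic (6 months is counted here as 180 days), not anything Cloudflare outputs:

```python
DAY = 86400  # seconds in a day

# Target browser-cache lifetimes from the list above, as max-age seconds
targets = {
    "html": 1 * DAY,      # 1 day
    "css_js": 30 * DAY,   # upper end of the 10-30 day range
    "images": 180 * DAY,  # ~6 months
}

for kind, seconds in targets.items():
    print(f"{kind}: Cache-Control: max-age={seconds}")
```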
Then I found the Cache Rules options, which allow better tuning of the way caching works in Cloudflare. Even better, I can add 10 rules for free, and within those rules I can use the OR operator to make a single rule match multiple patterns. Perfect!
Caching images rule
Let’s tackle image caching first.
- On the specific domain, go to Cache Rules and create a new rule.
- Under When incoming requests match… add a condition on the URI Path ending with .webp.
- Add another condition (joined with OR) and repeat for .png, .jpg, .jpeg, .ico and whatever more you need.
- Expression Preview should be showing:
(ends_with(http.request.uri.path, ".webp")) or (ends_with(http.request.uri.path, ".png")) or (ends_with(http.request.uri.path, ".jpg")) or (ends_with(http.request.uri.path, ".jpeg")) or (ends_with(http.request.uri.path, ".ico"))
- Under Then… block
- set Cache eligibility -> Eligible for cache
- set Edge TTL -> Ignore cache-control header and use this TTL
- under Input time-to-live (TTL) dropdown select 1 day (or more)
- set Browser TTL -> Override origin and use this TTL
- under Input time-to-live (TTL) dropdown select 6 months (that’s the minimum that stops https://pagespeed.web.dev/ from complaining)
- enable Serve stale content while revalidating
- Hit Deploy button
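The whole rule boils down to a suffix match on the request path. Here is a quick Python sketch of what the expression evaluates; note it matches exact lowercase suffixes, just like the ends_with clauses, so an upper-case /LOGO.PNG would fall through to the default behaviour:

```python
# Suffixes from the Cache Rule's expression preview
IMAGE_SUFFIXES = (".webp", ".png", ".jpg", ".jpeg", ".ico")

def matches_image_rule(uri_path: str) -> bool:
    """Mirror of the rule: true if any ends_with() clause matches."""
    return any(uri_path.endswith(s) for s in IMAGE_SUFFIXES)

print(matches_image_rule("/assets/photo.jpeg"))  # True
print(matches_image_rule("/index.html"))         # False
```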
Check my config below:
What does it do?
Matching URIs by extension should be clear, and then we make them eligible for caching, which we obviously want. Now comes the interesting part.
- Edge TTL - by setting this option, we force Cloudflare’s edge servers to ignore the headers returned from GitHub Pages and store matching responses for 1 day. I could probably go with 7 or 30 days, but this way, if I eventually change an image, I won’t need to purge the cache. Cloudflare will just keep the files for 1 day on its proxies and revalidate them the next day.
- Browser TTL - above we configured “the server side” cache; here we tell our users’ browsers how to treat files from our page. At least for images, it makes sense to set a pretty long caching time. Google’s PageSpeed test stopped complaining about too-short caching times after I set it to 6 months.
- Serve stale content while revalidating - my page is static. Content changes infrequently, and when it does, it’s not critical if the change becomes visible slightly later. With this option enabled, Cloudflare will serve whatever it has in its cache, and if the entry has expired, it will fetch a fresh copy and update the cache afterwards. What’s important: the user doesn’t need to wait. This changed the time to first byte from around 1-1.5 s to 0.1 s.
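To make the last option concrete, here is a toy model of that edge behaviour. All names are mine, and real Cloudflare revalidates asynchronously, while this sketch refreshes inline:

```python
import time

EDGE_TTL = 86400  # 1 day, matching the Edge TTL configured above

cache = {}  # path -> (body, stored_at)

def handle(path, fetch_origin, now=None):
    """Toy 'serve stale while revalidating': fresh entries come from cache,
    expired entries are served stale while the cache is refreshed, and only
    a full miss makes the user wait for the origin."""
    now = time.time() if now is None else now
    entry = cache.get(path)
    if entry is None:
        body = fetch_origin(path)  # miss: the user waits for the origin
        cache[path] = (body, now)
        return body
    body, stored_at = entry
    if now - stored_at > EDGE_TTL:
        cache[path] = (fetch_origin(path), now)  # refresh (async in reality)
    return body  # fresh or stale, served without waiting
```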
Caching HTML/CSS/JS files
Caching text files differs from caching images. Images you want to cache for a long time, as they don’t change. If they do, they change spectacularly, usually with a new URL.
Changes to text documents happen more often, especially on a blog - small update, fixing some typos, etc. You want them to be visible fast. Not necessarily immediately, but fast.
Let’s build another rule. With CSS and JS files it’s easy - we match the end of the URI Path against their extensions. Done.
With HTML files it’s harder, because not all page URLs end with an HTML extension. It’s also not possible to match a rule by Content Type (which is such a pity). But we can match a Request Header. We can use the Accept header as a hook, because when browsers request HTML files they usually do it like this:
Accept: text/html, application/xhtml+xml, application/xml;q=0.9, image/webp, */*;q=0.8
We can then match the Request Header if it contains text/html - that would quite probably be an HTML file 😄
I wouldn’t use this on any dynamically generated, production site without extensive testing. In my case - a static site - it’s safe to assume such requests are for HTML files and nothing else.
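The header check itself is trivial - a sketch of the condition the rule expresses, a plain substring test like Cloudflare’s contains operator:

```python
def looks_like_html_request(headers: dict) -> bool:
    """True if the Accept header contains "text/html" (the rule's condition)."""
    return "text/html" in headers.get("Accept", "")

print(looks_like_html_request({"Accept": "text/html,application/xhtml+xml"}))  # True
print(looks_like_html_request({"Accept": "image/avif,image/webp,*/*"}))        # False
```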
Let’s take a look at how the complete rule looks:
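The Expression Preview for this rule should end up looking roughly like the snippet below. The header clause is my reconstruction of Cloudflare’s ruleset syntax, so double-check it against what the rule editor actually generates:

```
(ends_with(http.request.uri.path, ".css")) or (ends_with(http.request.uri.path, ".js")) or (any(http.request.headers["accept"][*] contains "text/html"))
```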
We can deploy the rule now.
Those are all the options I use in Cloudflare for my static pages. There are many more variations on the topic, like Cloudflare Pages, which might be a nice alternative to GitHub Pages and could further improve loading speed.
It’s good to have a small site or blog to play around with before you turn it all ON.
Use services like https://pagespeed.web.dev/ to verify the real impact of your changes. Having RUM monitoring might be even better.
Good luck, and sub-zero loading times for your pages!