Reliable CDN Tactics That Actually Make Your Site Faster

Selecting a CDN Provider Isn’t Just About the Closest Node

This bit took me a while to learn the hard way. I was convinced that if I just picked the CDN with the most PoPs geographically close to my audience – say, Cloudflare for a mostly European and Southeast Asian traffic base – that would be the end of it. Load times would magically drop. Nope. Turns out, you can sit a data center on your user's lap and still get trash TTFB if their DNS resolver steers them through a slow peering partner.

What you actually need to look into (there's a quick TTFB check after this list):

  • How many ISPs in your key regions peer directly with your chosen CDN
  • Whether your CDN caches HTML responses or just static assets
  • What their default cache TTLs are (many default to a barely-adequate 5 minutes)
  • If they support origin shield or regional replication layers
  • How aggressively they revalidate expired assets
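
You can sanity-check the peering question yourself by timing first byte from the regions you care about (a cheap VPS or a friend's machine works). This is a plain curl invocation – example.com stands in for your own URL:

curl -o /dev/null -s -w 'DNS: %{time_namelookup}s  TTFB: %{time_starttransfer}s\n' https://example.com/

If TTFB looks ugly from a region where the CDN claims a nearby PoP, the bottleneck is usually routing or peering, not distance.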

When I moved one high-traffic AdSense site from StackPath to Bunny.net, I saw slightly worse metrics in Africa but vastly improved first-paint times from Indonesia, simply because Bunny wasn’t bouncing my DNS resolution across a relay in Qatar.

Using Cache Headers Correctly, or the Time I Wrecked My Own PerfScore

Google PageSpeed Insights chewed me out for months about uncacheable JS files, and I blamed the CDN. Turns out my cache-control headers were set to private, max-age=600 – solidly dumb when the whole point was letting the CDN persist that content for other users, since private tells shared caches not to store the response at all. No idea how that slipped in; probably copied from a logged-in admin response header at some point.

“CDNs don’t override your cache control unless you tell them to.”

I'd assumed CDNs like Fastly or Cloudflare would enforce their own caching rules over mine, but that's not true unless you've explicitly turned on override behavior. This is not well-documented in Cloudflare's UI – you have to write a Page Rule, or use Cache Rules in the newer interface, with weirdly ambiguous options like "Edge Cache TTL" and "Browser Cache TTL".
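
For reference, the Cache Rule I ended up with looks roughly like this – the /static/ path is specific to my setup, and the exact labels shift between dashboard versions, so treat it as a sketch:

Expression: (http.request.uri.path contains "/static/")
Settings: Eligible for cache, Edge Cache TTL: a long value, Browser Cache TTL: Respect existing headers

"Respect existing headers" is the one you want once your origin's Cache-Control is actually correct – which brings us to the origin side.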

The fix? I set:

Cache-Control: public, max-age=31536000, immutable

on long-lived static assets like fonts and versioned JSON config files. One caveat I only appreciated later: immutable plus a one-year max-age is only safe when the URL changes whenever the content does, so version or fingerprint anything you mark this way. Long story short, load speed improved, and I stopped getting those red flags in Search Console's Core Web Vitals report.

Second-Level Caching With Origin Shields Saved an Ad Load

If you're using a smaller CDN or any reverse proxy that goes to an origin on a different continent, get smart about using origin shields. Bunny.net has a feature where one designated shield node handles cache misses for every region, so the other edge nodes don't all hammer your origin at the same time. I hadn't realized those redundant origin fetches were enough to undercut some ad paint timings.

My AdSense revenue was weirdly low in Australia despite decent traffic. Turned out, when an edge node missed the cache in that region and fetched from my origin (in France), the delay was sometimes long enough to push Largest Contentful Paint past the AdSense load window. Set Germany as the shield region, and suddenly more ads started loading before the cutoff. I probably recovered 5-10% of revenue just from fixing that.

BEWARE: Googlebot Doesn’t Honor JavaScript-Modified Headers

This one's grim: if you're injecting cache directives (via meta http-equiv) or canonical values with client-side JavaScript, Googlebot does not reliably honor them. Learned this after spending an afternoon rewriting canonical tags with JS based on region assumptions. Result? The bot indexed the wrong versions anyway.

The same logic applies at the CDN layer: what counts is what's actually in the HTTP response by the time it leaves the edge. If neither your origin nor your edge logic emits the right headers, Googlebot won't see them. Use server-side rewrites or edge middleware (Cloudflare Workers, Cloudflare's newer rules layer, Fastly VCL) to generate headers at request time instead of patching things client-side.
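
If you go the Workers route, the pattern is just: fetch upstream, copy the response so its headers are mutable, set what you need. A minimal sketch – the URL and header value are placeholders, not my actual config:

export default {
  async fetch(request) {
    const upstream = await fetch(request);
    // Copying the response makes its headers mutable
    const response = new Response(upstream.body, upstream);
    // Placeholder canonical – compute this per-request in real code
    response.headers.set("Link", '<https://example.com/page>; rel="canonical"');
    return response;
  },
};

Because the Link header goes out as a real HTTP header, Googlebot sees it without having to render anything.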

Debugging tip: always inspect response headers with curl against a fresh, uncached URL. The browser might be hiding stuff behind local caches or extensions (looking at you, uBlock Origin, filtering weird prefetch behavior).
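
Something like this dumps just the headers without the body; tacking on a throwaway query string forces a cache MISS since it changes the cache key:

curl -s -D - -o /dev/null 'https://example.com/page?nocache=1'

On Cloudflare, watch the cf-cache-status value (HIT, MISS, EXPIRED, BYPASS) alongside age and cache-control.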

Serving Fonts and Icons via CDN Can Backfire If You Overcompress

I learned this ugly truth when some of my mobile users reported icons popping in late on my recipe site. The cause: I was gzipping font files both in Apache and at the CDN layer. Double compression isn't a win – it can introduce enough weirdness to break rendering on low-end Android devices, and Chrome even started throwing decode errors in the logs.

Stack Overflow didn't help – most of the answers say "just compress everything." Nope. MDN has a note buried deep in its font guide that WOFF2 is already compressed internally (with Brotli, in fact), so stacking Brotli or gzip on top barely saves anything and might delay parsing. Turned off the extra compression and suddenly things stopped glitching.
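
On Apache with mod_deflate, the fix is a single directive (mod_setenvif must be enabled; adjust the extension pattern to whatever your font stack actually serves):

SetEnvIfNoCase Request_URI "\.woff2?$" no-gzip

mod_deflate honors the no-gzip environment variable and skips those files, so at most one layer – the CDN, or nobody – touches font compression.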

Why Full Site Delivery Isn’t Worth It If You Have Logged-In Areas

CDNs love pushing full-site delivery — great for blogs or static sites, but if you’re running an AdSense-powered directory with account pages, it gets tricky. I once had logged-out users seeing private dashboards because I lazily cached full HTML at the edge for every GET request. Pretty disastrous.

Use cache key variation wisely. With Cloudflare, you can vary the cache key by cookie contents, query strings, or even user-agent, but it's a mess if you don't debug it fully. One obscure edge case I hit: disabling query-string-based caching broke the reporting filters in my dashboard, because those query-driven states were rendered server-side, not in JS. Whoops.
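
If you'd rather do the split in a Worker than in dashboard rules, a sketch of the cookie check looks like this – "session_id" is a stand-in for whatever your app's real login cookie is called:

export default {
  async fetch(request) {
    const cookies = request.headers.get("Cookie") || "";
    // Stand-in name: match your app's actual session cookie here
    if (cookies.includes("session_id=")) {
      // Logged-in traffic: fall through to default behavior
      // (Cloudflare doesn't edge-cache HTML by default)
      return fetch(request);
    }
    // Anonymous traffic: safe to cache the full HTML at the edge
    return fetch(request, { cf: { cacheEverything: true } });
  },
};

Note it matches one specific cookie, not the presence of any cookie at all – that's exactly the irrelevant-cookie trap from the paragraph above.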

If you do go for "static-seeming" personalization (e.g., geolocation-specific offers), use edge includes or ESI-style fragment injection. I've started using Cloudflare Workers to inject user-specific banners after the body loads to avoid caching conflicts. It's not perfect, but the dashboard hasn't leaked again.

Watch for CDN-Induced Latency Spikes During Traffic Surges

I find this hilarious and horrifying: some CDNs start rate-limiting your own site during spikes, especially when the surge comes from real visitors, not bots. During a flash-sale traffic hit from a Facebook share, my edge nodes flagged certain endpoints (like sitemap.xml and robots.txt!) as high risk and applied basic DDoS protections through a challenge page. Helpful against bots, but it also handed Googlebot a 403 just when I was trying to ride the wave.

Even though I'd unchecked every "protect all" type option in the dashboard, there's some undocumented behavior in Cloudflare's Bot Fight Mode that kicks in anyway if the traffic entropy looks bot-like. I now whitelist specific user-agents using Firewall Rules and monitor edge error logs in real time. Essential if you're fighting for milliseconds on LCP and ad paint metrics during your rare traffic spikes.
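
The rule itself is a one-liner in Cloudflare's expression language – cf.client.bot matches verified crawlers like Googlebot, which is safer than trusting a spoofable user-agent string alone:

(cf.client.bot) or (http.user_agent contains "Googlebot")

Set the action to Skip/Allow so it wins before the bot protections run; field names can shift between dashboard versions, so verify them in the rule builder.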

Quick CDN Terms That Shouldn't Trip You Up But Will

Some phrases in CDN configs are misleading enough to cause real-world pain. Here’s my cheat sheet of ones I’ve misread, misunderstood, or mentally skipped past until they bit me:

  • Browser Cache TTL ≠ edge cache (this only instructs the browser, not your CDN)
  • Edge Cache TTL ≠ permanent caching (still revalidates unless marked immutable)
  • No-cache ≠ do not cache (it means the response may be stored, but must be revalidated before reuse)
  • Stale-while-revalidate can still hit origin if the re-fetch fails hard (header example after this list)
  • Bypass cache on cookies → often breaks when cookies are present but irrelevant (think analytics cookies on every request)
  • Honor origin cache → useful only if your headers are correct to begin with
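
As a concrete anchor for the no-cache and stale-while-revalidate rows above, here's the header form – the numbers are illustrative, not a recommendation:

Cache-Control: public, max-age=600, stale-while-revalidate=120

For up to 120 seconds after the 600-second TTL expires, a cache may serve the stale copy while revalidating in the background; if that background fetch fails hard, the next request goes to origin anyway.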

I’ve seen enough broken sites due to misjudging these options that I now screenshot my own cache panel before touching anything. Saves me from gaslighting Future Me later.
