Tracking Customer Acquisition Costs When Attribution Is a Joke

Why CAC Gets Weird the Moment You Go Multichannel

At some point, the spreadsheet neatness collapses. First sign? Your organic TikTok blow-up doesn’t show up in any referral path. Zero attribution, just mysteriously spiking direct traffic that Google Analytics attributes to Mars. But your CRM says five new $200-a-month customers signed up that same night.

Customer acquisition cost (CAC) sounds like a math problem. But once you’re juggling channels — Google Ads, cold outreach, YouTube embeds, a giveaway on Reddit — that formula starts bleeding at the edges. You start asking yourself: Did that $100 Facebook post really drive those signups, or did people just read the top comment and Google our name later?

We had a week last year where we added a QR code landing page to our packaging insert, and BAM — 80 conversions by the end of the month. Google Analytics said 14 of them came from Email. Sure. Email.

Tagging URLs… Properly This Time

This seems basic, right? Add UTM parameters like a civilized person. Yeah, except if you’ve ever tried tracking actual humans across multi-touch journeys, you know what happens. Someone clicks a UTM-tagged link from your Instagram bio, then gets distracted, opens your homepage later in Safari with no tags, and signs up. Welcome to “Direct Acquisition.”

https://yourdomain.com/signup?utm_source=ig&utm_medium=linkinbio&utm_campaign=summer_sale

Cool. Works until Safari wipes cross-site browsing data, or until they open it on mobile first and finish on desktop later. Double points if an employee accidentally pastes a UTM link into a Slack channel and now all their coworkers are skewing your attribution.

Here’s what we had to change:

  • Strip referral junk from internal Slack shares so they don’t contaminate real data
  • Add per-channel landing pages that store referral info in localStorage and pass it into hidden form fields during signup (sketched right after this list)
  • Stop trusting Google Analytics as the single source
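
Here’s a minimal sketch of that localStorage trick, assuming a signup form with hidden inputs named after the UTM keys. The form id and field names are ours, purely illustrative:

// Capture UTM params on landing and persist them, so the attribution
// survives an untagged revisit in the same browser. Names are illustrative.
const UTM_KEYS = ["utm_source", "utm_medium", "utm_campaign"];

function persistUtmParams(): void {
  const params = new URLSearchParams(window.location.search);
  for (const key of UTM_KEYS) {
    const value = params.get(key);
    if (value) localStorage.setItem(key, value); // last tagged visit wins
  }
}

function fillHiddenFields(form: HTMLFormElement): void {
  for (const key of UTM_KEYS) {
    const stored = localStorage.getItem(key);
    const field = form.querySelector<HTMLInputElement>(`input[name="${key}"]`);
    if (stored && field) field.value = stored;
  }
}

persistUtmParams();
const signupForm = document.querySelector<HTMLFormElement>("#signup-form");
if (signupForm) {
  // Fill at submit time, after the browser's autofill has already run.
  signupForm.addEventListener("submit", () => fillHiddenFields(signupForm));
}

This only survives within one browser profile, so it rescues the easy cases, not the Safari wipes or device switches.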

Also — and I swear this isn’t documented anywhere obvious — if you’re syncing UTM params into your CRM, make sure the hidden field isn’t being overwritten by autofill. We had a streak of entries where the Lead Source was “Chrome Autofill.”
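
One guard worth trying, sketched under the same assumptions as above: make the hidden field read-only, since autofill skips read-only inputs, and treat autocomplete="off" as a hint rather than a guarantee, because Chrome has historically ignored it in some flows:

// Assumes the hidden utm_source input from the previous sketch.
const sourceField = document.querySelector<HTMLInputElement>('input[name="utm_source"]');
if (sourceField) {
  sourceField.readOnly = true; // scripts can still set .value; autofill stays out
  sourceField.setAttribute("autocomplete", "off"); // a hint, not a guarantee
}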

CRM and Ad Platform Data Don’t Line Up (And Never Will)

The first time you run a CAC report from your CRM and compare it with what Meta’s Ads Manager says, prepare to enter the gaslighting arena. According to Facebook, they generated 142 conversions last quarter. According to our CRM, that number was 67.

This isn’t just a pixel-vs-actual-signups argument. Ad platforms claim conversions based on modeled behavior. Sometimes they credit themselves when someone merely viewed the ad without clicking (a “view-through conversion”). Meanwhile, your CRM only logs them if they filled out a form, got confirmed, and entered your pipeline correctly.

We once had a phantom campaign show up in Google Ads with $600 in conversions on zero clicks. Turns out? An old display campaign was re-enabled under a shared budget — but landing page traffic had tracking blocked by a Content-Security-Policy misconfig. No cookies, no session persistence, so actual customers who signed up just looked like cold ghosts.

Actual CAC Calculations vs. Vibes-Adjusted Numbers

The accountant says your CAC is spend divided by new customers. Simple. But in practice, we often end up doing it three ways (sketched in code right after the list):

  1. Simple CAC = Total Ad Spend / # of New Customers
  2. Blended CAC = Total Marketing Costs (salaries, tools, etc.) / # of New Customers
  3. Adjusted CAC = Remove outliers, channel-specific breakdowns, cohort filtering
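
The difference between the three is just what goes into the numerator and who gets counted in the denominator. A minimal sketch, with invented field names:

// All field names here are invented for illustration.
interface Customer { email: string; isTestAccount: boolean; }

// 1. Simple: ad spend only.
function simpleCac(adSpend: number, customers: Customer[]): number {
  return adSpend / customers.length;
}

// 2. Blended: everything marketing costs you, not just the ads.
function blendedCac(adSpend: number, salaries: number, tools: number, customers: Customer[]): number {
  return (adSpend + salaries + tools) / customers.length;
}

// 3. Adjusted: drop the obvious junk before dividing.
function adjustedCac(adSpend: number, customers: Customer[]): number {
  return adSpend / customers.filter(c => !c.isTestAccount).length;
}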

The problem is when those three answers tell wildly different stories. Our paid CAC one month was under thirty bucks — but blended CAC, once we added video editing and the influencer fees, ballooned to something like two hundred per customer.

Weirdest moment? We noticed a sudden drop in adjusted CAC because a payment bug mis-flagged hundreds of test signups from an internal staging link as actual leads. I nearly published the dashboard before seeing the same Gmail used 122 times.
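
That one would have been caught by the dumbest possible check, something like this (the threshold is arbitrary):

// Count how many "customers" share a normalized email. Anything above the
// threshold is almost certainly test data, not acquisition.
function duplicateEmails(customers: { email: string }[], threshold = 5): Map<string, number> {
  const counts = new Map<string, number>();
  for (const { email } of customers) {
    const key = email.trim().toLowerCase();
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return new Map([...counts].filter(([, n]) => n >= threshold));
}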

Tracking Across Devices Is Just Broken

If someone starts on their phone and finishes on desktop, it’s over. Unless you have an auth-based system like Segment with identity stitching, most ad platforms won’t reconcile a user session that switches devices. I hate how often I have to say this to execs:

“Unless the user logs in on both devices or submits a form that passes identity tokens, we can’t track that switch. It just looks like two users.”

We tried to brute-force this with fingerprinting, but got blocked by GDPR compliance. Turns out pseudo-identifiers from Canvas API access triggered some very strong objections during our cookie audit.

I ended up returning to this idea: capture emails fast, store referral info early, and stitch later using backend ETL scripts. It’s annoying, yes. But when someone signs up, we check their email history across prior forms via hashed identifiers. It’s hacky and duct-taped, but way more honest than pretending Facebook knows what’s going on here.
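
The stitching step itself is small. Here’s a sketch of the idea in Node, with an invented schema; the one real rule is hash before you store:

import { createHash } from "node:crypto";

// Normalize and hash an email so the stitching table never holds raw addresses.
function emailKey(email: string): string {
  return createHash("sha256").update(email.trim().toLowerCase()).digest("hex");
}

// First-touch records captured by the landing pages. Schema is invented.
interface FirstTouch { utmSource?: string; capturedAt: Date; }

// On signup, look up any earlier touch filed under the same hashed email.
function stitchReferral(
  signupEmail: string,
  firstTouches: Map<string, FirstTouch>, // keyed by emailKey()
): FirstTouch | undefined {
  return firstTouches.get(emailKey(signupEmail));
}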

Lifetime Value Screws with Your CAC When Trial Users Lag

Here’s where CAC gets even fuzzier: slow trial cycles. We run a freemium model, and a ton of signup volume doesn’t convert for weeks. Sometimes months. So when we calculate cost per activated customer, the number is already outdated, because half the cohort is still floating in trial limbo.

One time, I re-ran a CAC report three months later and discovered we had underestimated the payback period by about 2x. Most of those trial users converted late, after we had already increased spend based on false confidence.

I now run all CAC reporting with dynamic time windows using something like:

-- Postgres: count August signups that converted within 60 days of signing up,
-- so late trial conversions still land in the cohort they came from.
SELECT COUNT(user_id) AS converted_users
FROM users
WHERE signup_date BETWEEN '2023-08-01' AND '2023-08-31'
  AND conversion_date > signup_date
  AND conversion_date < signup_date + interval '60 days';

It’s not precise. But it beats waiting six weeks to know whether that podcast ad was a flaming failure or a sleeper win.

Dark Social Is Real, and It Murders Your Attribution

I had to explain “dark social” to a VP once. Basically: stuff that happens in private messages, DMs, Slack threads, untracked email chains — where URL parameters die and no third-party cookies survive. Someone shares your link in a founder group chat, and suddenly you’re seeing a spike in traffic with no referer. Again.

We installed FullStory for a bit (with user consent, before anyone freaks out) and noticed a pattern: users from ‘direct’ traffic often tabbed in from WhatsApp Web or copied links from emails. But none of the origin data persists. They might as well have teleported in.

So what do we do? Honestly, not much. Except:

  • Use distinct landing pages for every intentional campaign so spikes are easier to tie to guessable origins
  • Move from generic URLs to seeded/offline pages when doing partnerships or meetups
  • Build feedback loops into onboarding flows — one question: “Where’d you first hear about us?”

Most people will say “A friend,” “Reddit,” or “Not sure.” But roughly one in ten gives you gold, like: “A screenshot of your dashboard went viral in a Discord I’m in.”

One Channel Always Lies More Than the Others

For us, it was LinkedIn. There was a three-month stretch where the LinkedIn Campaign Manager numbers looked pristine — low CPCs, nice conversion rates. But leads added to our CRM via that flow were statistically garbage. The logic flaw? Our signup flow allowed prospecting script pre-fills to mark a signup as ‘converted’ before the user ever verified their email or passed activation criteria.

So what we had was: high signup numbers credited to LinkedIn, but high immediate churn due to fake or half-completed accounts. Platform logic said: “LinkedIn works!” Actual revenue? None.

Bonus bug I learned from someone in a Slack group: if you use iframe-based signup embeds in carousel ads, LinkedIn sometimes double-records the conversion. Depending on how the pixel loads, the embedded page firing twice can inflate your numbers by 100% — and there’s no flag unless you compare against CRM confirmed users.
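
You can catch that class of bug from your own event log, without any help from the platform. A sketch, assuming you store raw conversion events with timestamps (the schema and the five-second window are invented):

// Two conversions for the same user and campaign within seconds is the
// signature of a pixel firing twice, not a person converting twice.
interface Conversion { userId: string; campaignId: string; at: number; } // at = epoch ms

function findDoubleFires(events: Conversion[], windowMs = 5_000): Conversion[] {
  const lastSeen = new Map<string, number>();
  const dupes: Conversion[] = [];
  for (const e of [...events].sort((a, b) => a.at - b.at)) {
    const key = `${e.userId}:${e.campaignId}`;
    const prev = lastSeen.get(key);
    if (prev !== undefined && e.at - prev < windowMs) dupes.push(e);
    lastSeen.set(key, e.at);
  }
  return dupes;
}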

Manual Spot Audits Save You When Dashboards Lie

I almost forgot to mention — some of the best CAC reality-checks I’ve had came from just manually vetting a random group of acquired users.

Take a group of 20 customers who signed up last week. Go look at their actual journeys. Did they come from the channel the dashboard says? Did they convert instantly or bounce? You’d be surprised how often you spot false attribution or just dead-weight leads being counted as wins.
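
The only code this needs is an honest random sample, so nobody cherry-picks the journeys that look clean. A sketch (Fisher-Yates shuffle, take the first n):

// Shuffle a copy of the customer list and hand back n for manual review.
function auditSample<T>(customers: T[], n = 20): T[] {
  const pool = [...customers];
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, n);
}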

Real example: We had a campaign tagged for Reddit that got credited with 30 new signups. But looking at the timestamps and IP cluster, 25 of them came from a student forum where one guy pasted the link. Totally unrelated to Reddit — just recycled link clout.

You can’t do this forever, but doing it monthly, even just once per campaign, gives you a sanity check that keeps you from scaling trash data up into trash dollars.
