Real Problems Building Schema Markup for Local Services
1. Structured data pitfalls on homepage versus service pages
There’s this sneaky little issue I ran face-first into: putting `LocalBusiness` schema on your homepage and only your homepage will not carry those benefits over to your service-specific pages. Even if your page structure is pristine and the internal linking hierarchy is clear, structured data is evaluated at the page level. Rich snippets aren’t going to follow you around the site.
Example: I set up full JSON-LD schema on a home landscaping business homepage. Their service pages (“tree removal,” “stump grinding”) had solid content but no separate schema. Google was only pulling hours and reviews for the home page URL—click further and they were black holes.
The fix was dumb-simple and should’ve been obvious: replicate customized schema on each service page, slightly tweaking the business category and `serviceDescription` fields. You can reuse the business info, NAP details, etc., but don’t expect inheritance. Schema does not cascade like CSS.
“If you want Google to show reviews for Tree Removal, Tree Removal better have its own JSON.”
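Concretely, the per-page markup I’m describing looks something like this. The business name, address, and URL below are invented for illustration; only the service-specific fields need to change from page to page:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Landscaping Co.",
  "url": "https://example.com/tree-removal",
  "telephone": "+1-555-010-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "VA",
    "postalCode": "22150"
  },
  "makesOffer": {
    "@type": "Offer",
    "itemOffered": {
      "@type": "Service",
      "name": "Tree Removal",
      "description": "Full tree removal including haul-away."
    }
  }
}
```

The shared NAP block stays identical everywhere; the `makesOffer` entry is what makes each service page its own entity in Google’s eyes.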
One really confusing moment: Google’s Rich Results Test validated everything on the homepage just fine. But when I dropped the service URLs into the same tool, it treated them like brand-new pages with no entity context. This is not documented clearly anywhere.
2. Google and Bing still disagree on schema category behavior
If you’re working with small businesses that straddle niche markets—like a plumbing contractor who also does HVAC—there’s a surprising difference in how engines interpret schema categories. I’ve noticed Bing tends to prioritize the `additionalType` property more heavily, while Google leans into the first listed `@type`.
In one setup, listing both `Plumber` and `HVACBusiness` in the `@type` array led Google to truncate the results entirely. It refused to render rich results, and Search Console acted like I’d provided conflicting identity claims. Bing, on the other hand, happily surfaced both sets of information.
The weird fix? I had to fork the schema implementation: use `HVACBusiness` as the primary `@type` on the homepage, and spin up a separate `Plumber`-focused service page with standalone markup. Never mix them inline, or Google’s validator just throws its hands up.
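Sketched out (business name and URL hypothetical), the homepage markup stays single-identity:

```json
{
  "@context": "https://schema.org",
  "@type": "HVACBusiness",
  "name": "Example Mechanical LLC",
  "url": "https://example.com/",
  "telephone": "+1-555-010-0100"
}
```

Then the plumbing service page gets its own standalone block with `"@type": "Plumber"` and the same NAP details—two clean identities on two URLs, instead of one conflicted identity on one.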
Bonus chaos: Microsoft Clarity recorded a drop in time-on-page that week, probably as Google’s pullback deranked those confused URLs. Schema quality = visibility, and schema messiness = dwell time death spiral.
3. How to make reviews work without triggering spam filters

Review aggregators are super sensitive turf. One time I had a site temporarily delisted from rich results after pulling in Google Places reviews via a plugin that embedded them into JSON. Looked perfect? Worked fine? Dead silent in the SERPs.
The undocumented no-go here: Google seems to track the source of the review data in the JSON-LD. If it’s clearly not native or first-party, they’ll ignore it—or worse, penalize. Reviews pulled from Google itself, even with attribution, are a trap. Scraping your own Maps widget? Still a trap.
What worked: implementing `Review` schema only for reviews that came in through the business’s own CMS submission form (even if it was just email copy+pasted). Tag those as `author: { "@type": "Person", "name": "First name" }` and add an exact `datePublished`. Avoid the temptation to over-format with stars unless they’re user-visible onscreen.
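A minimal sketch of that first-party shape—reviewer, business, and review text all made up for illustration:

```json
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "LocalBusiness",
    "name": "Example Pest Control"
  },
  "author": {
    "@type": "Person",
    "name": "Dana"
  },
  "datePublished": "2023-06-14",
  "reviewBody": "Fast, friendly, and the mice are gone."
}
```

Note there’s no rating markup at all here; it only goes in when the stars are actually rendered on the page.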
Here are some specific sanity-saving tips:
- Only mark up reviews for a single business per page
- Don’t mark up duplicate reviews used across multiple service pages
- Use plain text testimonials, not styled quote blocks
- Stick with 5-10 reviews per section to avoid looking unnatural
- Don’t use `aggregateRating` unless it’s calculated from the same data source
- Always put the company name exactly as listed in the `name` field
Honestly, manipulating review schema feels like the fastest way to walk into a Google Manual Action without realizing it until 3 weeks later.
4. Common validation errors that will not show up in Search Console
This one still drives me nuts. Schema shows as valid in Google’s official validator, but doesn’t render because the structured data fails a logic check—not a formatting one. Case in point: I had `openingHoursSpecification` with “Monday-Saturday, 9AM–5PM” but I wrote it using lowercase day names. Totally valid JSON. Total fail.
Search Console didn’t even mark it as a warning. The data just… didn’t exist to Google. Only caught it when browsing the page’s raw code and wondering why the robots.txt checker was showing no schema consumption.
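For reference, a shape that does render uses capitalized schema.org day names and 24-hour times (the business name here is a placeholder):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Landscaping Co.",
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": [
        "Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday"
      ],
      "opens": "09:00",
      "closes": "17:00"
    }
  ]
}
```

Lowercase `"monday"` or 12-hour strings like `"9AM"` are exactly the kind of thing the validator waves through and Google silently drops.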
Edge cases like these aren’t rare:
- Quoted number ratings (e.g. `"5"` instead of `5`) pass the test, fail in practice
- A missing `priceRange` doesn’t block validation… but blocks rich result appearance
- Whitespace in phone numbers causes parsing failures when paired with a country code
Your best shot here: use both search support tools and the in-browser inspector to cross-check what Googlebot sees. Or just throw it into ChatGPT with “What is wrong with this schema” and parse its hallucination into something useful.
5. Why service area businesses need separate schema logic
Here’s where I screwed up a multi-region pest control site. I replicated the same LocalBusiness info across all city landing pages—Fairfax, Arlington, Reston, etc. Within two weeks, half were ghosted in the local pack. Turns out: identical NAP data + different URLs without distinct `geo` or `areaServed` entries = spam signal.
Duplicate schema on location pages reads like doorway page SEO from 2010. Don’t do it.
The right move: instead of cloning the same entity, model each city page as its own service entry, connected to a shared `Organization`. Flip the schema context:
```json
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "Rodent Removal",
  "provider": {
    "@type": "Organization",
    "name": "BareHands Pest"
  },
  "areaServed": {
    "@type": "Place",
    "name": "Fairfax, VA"
  }
}
```
This let me place truly unique area pages with local-intent cues, without spamming Google with cloned LocalBusiness entries across 20 cities.
6. When schema markup conflicts with Google Business Profile
I thought I was being clever by updating structured data before my client got around to updating their Google Business Profile. Big mistake. The address in the schema reflected a move to a new office suite, while GBP was still showing the old one. Within 48 hours, rankings on brand-name keywords dipped hard.
Turns out, Google runs consistency checks between website schema and database entries in GBP. If the NAP details don’t match, their trust in either drops. Rich snippets disappeared, and customer-facing search results turned into confusion hell (“Is this the same place?”).
Even after syncing everything, it took weeks for rankings to bounce back. Here’s the general rule: treat GBP like the source of truth, and don’t let schema jump ahead. You can’t yank the rug out on Google’s internal address graph without consequence.
Side note: schema also shouldn’t try to override primary business hours if they’re misaligned with what’s in GBP. Even if it’s technically correct.
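For what it’s worth, the habit I’ve settled into is pasting the GBP address into the schema verbatim, suite number and all. Everything below is placeholder data; the point is that it should be a character-for-character copy of the listing:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Pest Control",
  "telephone": "+1-555-010-0199",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "450 Commerce Dr, Suite 210",
    "addressLocality": "Fairfax",
    "addressRegion": "VA",
    "postalCode": "22030"
  }
}
```

If the office moves, GBP gets updated first, then the schema follows once the listing change is live.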
7. Debugging structured data with browser extensions and server headers
Browser extensions can break schema rendering. I once spent a whole day trying to figure out why schema wasn’t injected into the DOM, and it turned out uBlock Origin was silently stripping inline JSON-LD scripts. Not blocking all JS—just ones with suspiciously structured data objects. There’s no alert, no console error—it just vanishes.
So if you’re trying to manually verify on-page schema with browser tools, always check:
- View source, not just DOM-inspected content
- Test with fresh browser profile, extensions disabled
- Use Lighthouse in incognito for schema visibility, not just for performance
Also, watch out for misaligned server headers. An old client’s site had `Content-Type: text/xml` on structured data pages, not HTML, due to a misconfigured NGINX location block. That alone suppressed indexing for two weeks. Rich results never had a shot.
Best debugging combo I’ve found: schema.org validator + Search Console + a local HAR file + Ctrl+U eyeballs.