Fixing Weird SEO Issues with Restaurant Menus and Maps

Google thinks your location doesn’t exist

Twice last month a client’s restaurant disappeared from Google Maps. Not in the business listing backend — that was alive and well. But their visibility on search had totally nosedived. Turns out it was location marker drift. Even though their pin looked normal in the UI, the actual coordinates were slightly off — enough that Google treated the place as “not well-known in this area.”

So yeah, if you ever notice your impressions or Map views tank without warning, dig into where exactly your pin is dropped. Sometimes updating the address (even with the same info) re-triggers a fresh crawl and resets the location relevancy signals. Add a landmark in your business description while you’re at it. Apparently “across from the post office” works better than you’d think.

Menu PDFs are dead weight for SEO

Honestly, if you still upload your menu as a PDF, Google sheds a silent tear. Search crawlers don’t index the contents properly, users bounce faster because mobile devices suck at previewing them, and worst of all — Apple devices sometimes straight-up fail to open them in Safari if you’re forcing downloads without the right Content-Type and Content-Disposition headers. (Ask me how I learned that one at 10pm.)

What works better:

  • Plain HTML menus, structured with schema.org’s Menu type (see the sketch after this list)
  • Include dish names, descriptions, and pricing as normal text — no images-as-text nonsense
  • Put the daily specials or seasonal menus in a dedicated URL path, not a weird popup
  • Use a regular heading structure — don’t cram category names into span tags styled like headers
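
Here’s a rough sketch of what that looks like in practice: dish name and price as real, visible text, plus a matching JSON-LD Menu block using the schema.org Menu/MenuSection/MenuItem vocabulary. The dish and price are invented.

<!-- Visible, crawlable text first -->
<section id="menu">
  <h2>Dinner Menu</h2>
  <h3>Mains</h3>
  <p>Rigatoni alla Norma: eggplant, basil, ricotta salata. $18</p>
</section>

<!-- Matching structured data -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Menu",
  "name": "Dinner Menu",
  "hasMenuSection": {
    "@type": "MenuSection",
    "name": "Mains",
    "hasMenuItem": {
      "@type": "MenuItem",
      "name": "Rigatoni alla Norma",
      "description": "Eggplant, basil, ricotta salata",
      "offers": {
        "@type": "Offer",
        "price": "18.00",
        "priceCurrency": "USD"
      }
    }
  }
}
</script>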

I once found a restaurant site where the menu was loaded via an inline Base64-encoded font within an iframe. The client paid an agency for “custom branding.” Cool font, zero indexability.

Restaurant schema: less is more, but only if it’s accurate

Google’s structured data guidelines for restaurants are as picky as a gluten-free toddler.

At minimum, the JSON-LD block you drop in needs the following (and none of it broken):

{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "La Lunetta",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "456 Olive Ave",
    "addressLocality": "Los Angeles",
    "addressRegion": "CA",
    "postalCode": "90210"
  },
  "telephone": "+1-310-123-4567",
  "url": "https://lalunetta.example"
}

Now here’s what blew my mind: if you also include menu or acceptsReservations fields, but they 404 or redirect weirdly, Google will flag the entire schema as invalid. You won’t always see this in Search Console either — sometimes it silently fails to parse without warning.

Avoid auto-generators that shove in every possible key. I once debugged a case where the friction point was "priceRange": "$$$" — the site’s output buffer was re-encoding the straight quotes, most likely into typographic ones, which made the JSON invalid for some parsers. Clean, tight schema or GTFO.
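
For reference, here’s what a sane version of those optional fields looks like: straight ASCII quotes only, and URLs you’ve personally confirmed return a clean 200. The paths below are placeholders, not real endpoints.

{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "La Lunetta",
  "url": "https://lalunetta.example",
  "menu": "https://lalunetta.example/menu",
  "acceptsReservations": "https://lalunetta.example/reservations",
  "priceRange": "$$$"
}

(Per schema.org, acceptsReservations takes a boolean or a URL where reservations can actually be made; a URL that 404s is worse than omitting the field.)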

Location pages are not the same as multi-location SEO

I’ve had this argument more times than I can count. If your restaurant has multiple locations, you need more than a template page with different addresses swapped in.

Each location needs unique markup, slightly varied descriptions (yes, even if they all serve the same cheese sandwich), and a distinct embedded map. You also want individual GMB listings that tie back to these pages — otherwise Google just assumes you’re a brand with multiple mentions, not a chain with real, physical locations.

Here’s what I’ve seen work well:

  • URL structure like /locations/santa-monica, with breadcrumbs
  • Actual text about the neighborhood or clientele (“Minutes from Loyola Marymount” wins)
  • Location-specific images — don’t reuse interior shots across all subdomains
  • Separate business hours per location in schema markup (example after this list)
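
For the per-location hours, a minimal sketch. The location name, URL, and hours here are invented:

{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "La Lunetta - Santa Monica",
  "url": "https://lalunetta.example/locations/santa-monica",
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Tuesday", "Wednesday", "Thursday"],
      "opens": "11:30",
      "closes": "22:00"
    },
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Friday", "Saturday"],
      "opens": "11:30",
      "closes": "23:00"
    }
  ]
}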

Weird thing I noticed: if you embed the same Google Street View iframe on three pages, Google sometimes collapses the map-rich snippets. Keep them visually distinct.

Crawlers hate interactive menus

Here’s the deal: sliders, accordions, tabbed interfaces — they all choke your SEO unless you’re implementing them with fallbacks. Menu items hidden behind display:none on load? Those usually still get crawled. What bots never see is content that only gets injected into the DOM after a click or a late script, because it isn’t there at parse time.

I once debugged a burger joint where their burgers literally didn’t exist on Google. The JavaScript-populated tab system showed the categories — but the content came in via deferred async script injection. Looked smooth af. Also completely invisible to bots.

If you’re going to do “fancy” menus, fine — just make sure fallback content is visible server-side or when CSS/JavaScript doesn’t load. Better yet, design for progressive enhancement and let the content load first. This isn’t an app, it’s lunch.
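
If you want the collapsed look with zero risk, native details/summary elements get you an accordion whose contents sit in the server-rendered HTML at parse time, no JavaScript required. A sketch with placeholder items:

<!-- Each section is in the initial HTML response;
     <details>/<summary> handles the collapse behavior natively. -->
<details open>
  <summary>Burgers</summary>
  <ul>
    <li>Smash Burger: double patty, pickles, house sauce. $12</li>
    <li>Veggie Burger: black bean patty, chipotle mayo. $11</li>
  </ul>
</details>
<details>
  <summary>Sides</summary>
  <ul>
    <li>Fries. $4</li>
  </ul>
</details>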

NAP consistency isn’t just for citations

Name, Address, Phone — get this wrong across four directories and kiss your map pack goodbye.

But here’s something sneakier: even small inconsistencies in punctuation or suite formatting mess with Google’s entity recognition. Example: “123 Elm St., Suite B” vs “123 Elm Street Ste B.”

Google might figure it out. Or it might think Suite B is a different location entirely. I’ve seen map overlays with two pins on the same building because Yelp used “Suite A” and the site said “Unit 1.”

My unverified hunch: Google scrapes address mentions from embedded map iframes’ aria-labels and any visible footer text, prioritizing what matches GMB. So make sure whatever’s in your <footer> isn’t stale or formatted differently.
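
If you buy that hunch (again, unverified), the cheap insurance is making the footer string match the GMB listing character for character. Something like:

<footer>
  <!-- Copied verbatim from the GMB listing; don't "improve" the formatting -->
  <address>
    La Lunetta<br>
    456 Olive Ave<br>
    Los Angeles, CA 90210<br>
    <a href="tel:+13101234567">+1-310-123-4567</a>
  </address>
</footer>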

Mobile-first indexing breaks when your menu swaps via JS

This isn’t well documented, but Googlebot for mobile doesn’t always execute the same late-deferred JS blocks as desktop.

One Indian place’s mobile visibility plummeted because their menu layout flipped completely client-side on viewport resize. It used a cheesy React hook to hide the desktop menu and show a condensed format on small screens — but that second format relied on post-load hydration. Googlebot never waited for it.

Translation: their entire menu existed in JS-generated divs, triggered by a breakpoint. No SSR, no fallback. Mobile-first indexing saw… nothing. I ended up patching their site with a dumb server-rendered <noscript> table for crawlers.
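
The patch was about as basic as it sounds. Roughly this (the dishes are placeholders, not the client’s actual menu):

<!-- Server-rendered fallback: lives in the initial HTML response,
     so it's visible even when hydration never happens -->
<noscript>
  <table>
    <caption>Menu</caption>
    <tr><th>Dish</th><th>Price</th></tr>
    <tr><td>Chicken Tikka Masala</td><td>$15</td></tr>
    <tr><td>Garlic Naan</td><td>$4</td></tr>
  </table>
</noscript>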

Map embeds count toward content uniqueness

This one tripped me up for a franchise client. They used identical copy on all location pages (bad enough), but their CMS also reused the same Google Maps iframe embed — not even changing pin coordinates.

Google absolutely detected that and started devaluing them in SERPs because the content was 90-something percent identical, including the interactive iframe. Once we swapped in custom iframe embeds per page (with adjusted zoom level and center coords), visibility started to normalize again.
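
With the Maps Embed API that’s just per-page URL parameters. Something like the following, with your own API key and each location’s real coordinates (the coordinates and queries here are invented):

<!-- Santa Monica page: its own query, center, and zoom -->
<iframe
  src="https://www.google.com/maps/embed/v1/place?key=YOUR_API_KEY&q=La+Lunetta+Santa+Monica&center=34.0195,-118.4912&zoom=16"
  width="600" height="400" loading="lazy"></iframe>

<!-- Downtown page: different query, center, and zoom -->
<iframe
  src="https://www.google.com/maps/embed/v1/place?key=YOUR_API_KEY&q=La+Lunetta+Downtown+LA&center=34.0407,-118.2468&zoom=15"
  width="600" height="400" loading="lazy"></iframe>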

So yeah: embeds aren’t “safe” or invisible from a content perspective. Treat them like any other page element that contributes to perceived uniqueness. Also adjust the surrounding text — don’t just paste “Find us here” above each one.

Favicon mismatch delays rich result eligibility

Okay this was super weird and only happened once, but it burned two days.

A client’s structured review markup wasn’t triggering review stars in rich snippets. Everything else validated — schema passed with green checks and the content existed in visible HTML.

Eventually, I noticed they were loading a different favicon (square vs circle) depending on device — and one of the favicon link tags pointed to a dead CDN asset. Once we fixed the links and simplified the <head> to use a single valid favicon.ico path, stars started showing within 48 hours.
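
The fix itself was one line in the head, replacing the pile of competing icon tags:

<!-- One icon declaration, pointing at an asset that actually resolves -->
<link rel="icon" href="/favicon.ico">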

I don’t know if it was the favicon or a side effect of header bloat confusing the crawler. But after that, I always audit favicon delivery in my structured data troubleshooting checklist.
