Retail marketing case study: UK multi-store at 9.5x ROAS

UK retail marketing case study: how a 15-store retailer hit 9.5x blended ROAS with postcode targeting, LIA, and POS-matched attribution.

Reviewed for accuracy by Lorenzo Bonari · April 2026

Most retail case studies on the UK SERP are paywalled fashion reports, round-ups of US ecommerce brands, or listicles on pop-up stores. None show a CMO what good looks like with hard numbers, a named methodology, and an honest note on what attribution cannot see.

This is one. A multi-store UK retailer, 15 locations, furniture category. Before we started, the digital team was reporting 1.5x to 2.0x ROAS and the board was cutting budgets. Eight months in, blended ROAS hit 9.5x, store visits were up 215% year on year, and cost per store visit was down 40%.

Here is the methodology, the numbers we can defend, the numbers we cannot, and the questions a CMO should ask any agency claiming similar results.

A 15-store UK furniture retailer was running 1.5x to 2.0x ROAS while seven of its Google Business Profiles sat unverified and in-store revenue was entirely absent from its measurement stack. BYLT rebuilt the paid media foundation across four pillars: drive-time postcode targeting; Local Inventory Ads (live stock messages that remove the friction between research and an in-store visit) fed into Performance Max for store goals; clean location data and Merchant Centre feed hygiene; and a dual measurement layer combining Store Visit Conversions with Offline Conversion Tracking matched against POS transactions. Eight months later, blended ROAS reached 9.5x, store visits were up 215% year on year, and cost per store visit fell 40%. The methodology holds across categories. The key constraint is always the measurement stack, not the channel mix.

The invisible revenue problem

Revenue that closes in-store is invisible to most retail measurement stacks, and that invisibility kills budget before it kills performance.

Every multi-store retailer I work with starts the same way. Online sales are measured to three decimal places. In-store revenue from the same customer, driven by the same ad, shows up as a vague feeling that "digital is doing something." Finance sees 1.5x ROAS and asks why the team is not cutting the budget.

The ROAS number is wrong. It measures the slice of the campaign that converts online, in a category where most revenue closes in-store. Across UK retail, only 28.3% of sales happened online as of December 2025, so roughly seven in ten pounds change hands in a shop (source: https://brc.org.uk/market-intelligence/retail-in-numbers/). For a considered-purchase category like furniture, the in-store share is higher still.
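
The arithmetic is worth making explicit. A minimal sketch, assuming purely for illustration that ad-driven revenue splits online versus in-store at the national average; the figures are hypothetical, not the client's:

```python
def blended_roas(online_revenue: float, in_store_revenue: float, spend: float) -> float:
    """Blended ROAS = (online + POS-matched in-store revenue) / ad spend."""
    return (online_revenue + in_store_revenue) / spend

# Illustrative only: £10k spend at a 1.5x online-only ROAS. If online is
# 28.3% of total attributable revenue (the UK retail average), the other
# 71.7% closes in-store and the online-only number hides most of it.
online = 15_000
in_store = online * (71.7 / 28.3)
print(blended_roas(online, in_store, 10_000))  # ~5.3x, from a reported 1.5x
```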

When this retailer brought us in, seven of fifteen Google Business Profiles were unverified or linked to the wrong Merchant Centre account. Store Visit Conversions (Google's probabilistic measurement of ad-driven shop visits, built from GPS, WiFi proximity, and opted-in location history, and dependent on verified Google Business Profiles) were off. Offline Conversion Tracking (the import of matched in-store transaction data from a retailer's POS system back into Google Ads as a conversion value) was not configured. Radius targeting sat at 15 miles, spending budget on postcodes separated from the store by toll bridges and competitor catchments.

The revenue was not invisible because digital was failing. It was invisible because nothing was measuring it.

The four-pillar methodology

The four pillars that fixed it are: strategy (postcode targeting), format (Local Inventory Ads and Performance Max), foundation (clean location data and feed hygiene), and measurement (Store Visit Conversions plus Offline Conversion Tracking). Every multi-store retail paid media brief we have delivered since has used the same shape, across furniture, pet supplies, automotive retail groups, and carpet retailers.

Strategy: postcode targeting and a local-national split

The first fix is replacing radius circles with drive-time postcode tiers that reflect physical reality.

Radius targeting draws a circle and ignores every physical reality inside it. Rivers, motorways, and catchment overlap with competitors all affect whether a postcode converts into a store visit. We replaced the 15-mile radius with drive-time postcode tiers, built with a tool combining Google Sheets and the Maps API, targeting each store's 20-minute catchment and excluding postcodes behind toll bridges or inside a competitor's dominant catchment.
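
A minimal sketch of that tiering logic, assuming the Google Maps Distance Matrix API. The 20-minute threshold and the exclusion list mirror the brief above; the function names, tier labels, and key handling are illustrative:

```python
import requests

MAPS_API_KEY = "YOUR_KEY"  # assumption: a key with the Distance Matrix API enabled

def drive_time_minutes(origin_postcode: str, store_postcode: str) -> float:
    """Driving time in minutes between two UK postcodes via the
    Google Maps Distance Matrix API."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/distancematrix/json",
        params={
            "origins": f"{origin_postcode}, UK",
            "destinations": f"{store_postcode}, UK",
            "mode": "driving",
            "key": MAPS_API_KEY,
        },
        timeout=10,
    )
    element = resp.json()["rows"][0]["elements"][0]
    if element["status"] != "OK":
        raise ValueError(f"No route for {origin_postcode}: {element['status']}")
    return element["duration"]["value"] / 60  # API returns seconds

def tier_postcode(postcode: str, store: str, manual_exclusions: set[str]) -> str:
    """Assign a postcode to a targeting tier. Manual exclusions carry the
    toll-bridge and competitor-catchment judgement calls the API cannot see."""
    if postcode in manual_exclusions:
        return "exclude"
    return "core" if drive_time_minutes(postcode, store) <= 20 else "exclude"
```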

Campaigns were then split into local and national. Local carried visit-today and click-and-collect messaging, bid against store-visit signals, and ran Local Inventory Ads (LIA) and Performance Max for store goals. National carried delivery messaging and standard Shopping and Search. The split let us budget each route to purchase on its own economics instead of optimising for neither.

Format: Local Inventory Ads and Performance Max for store goals

Local Inventory Ads turn a generic product ad into a live stock signal, which is the difference between a browsing click and an intent-to-purchase store visit.

Local Inventory Ads (LIA) are a Google Shopping format that shows a shopper whether a specific product is in stock at a nearby store, with price and in-store availability updated from the Merchant Centre feed. We implemented LIA across the full SKU catalogue, with automated rules suppressing any ad where the nearest store had fewer than two units in stock. Every click was for a product the customer could walk in and buy the same day.
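
The suppression rule is simple enough to sketch. Assuming the local inventory feed is a CSV with store_code, id, and quantity columns (check the names against your own Merchant Centre feed spec), a filter like the one below drops every row under the threshold before upload:

```python
import csv

MIN_UNITS = 2  # the rule from this brief: suppress below two units in stock

def filter_local_inventory(in_path: str, out_path: str) -> None:
    """Drop local inventory rows below the stock threshold so LIA never
    advertises a product the shopper cannot walk in and buy today."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if int(row["quantity"]) >= MIN_UNITS:
                writer.writerow(row)
```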

Performance Max for store goals (a Google Ads campaign type that serves across Search, Shopping, Maps, YouTube, and Display from a single asset group, with store-goal bidding signals) replaced legacy Local Campaigns and carried the same feed across all of that inventory. The Performance Max bidding algorithm needed a defensible store-visit value to optimise against, which is where pillar four came in.

Foundation: clean location data and a healthy feed

Merchant Centre (Google's product data platform, which feeds Shopping, Local Inventory Ads, and Performance Max with accurate product availability, pricing, and store location data) issues will ground the whole stack before any bidding algorithm can run.

Seven stores had Google Business Profile issues. One had the wrong address. Two had opening hours that contradicted the in-store reality. Three had no category set. All fifteen NAP (name, address, phone) records were standardised, every profile re-verified, and the Merchant Centre feed audited for disapproved items, missing GTINs, and price mismatches.
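
The feed audit can be partly automated. A hedged sketch, assuming the product feed arrives as a CSV and a separate export of live site prices serves as the source of truth; both file shapes are illustrative:

```python
import csv

def audit_feed(feed_path: str, site_prices: dict[str, str]) -> list[str]:
    """Flag feed rows with missing GTINs or prices that disagree with the
    live site. site_prices maps product id -> price string, exported from
    whatever system holds the canonical price."""
    issues = []
    with open(feed_path, newline="") as f:
        for row in csv.DictReader(f):
            pid = row["id"]
            if not row.get("gtin"):
                issues.append(f"{pid}: missing GTIN")
            site_price = site_prices.get(pid)
            if site_price is not None and site_price != row["price"]:
                issues.append(f"{pid}: feed price {row['price']} != site price {site_price}")
    return issues
```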

LIA depends on feed hygiene. Performance Max depends on LIA. Store Visit Conversions depend on verified profiles. None of this gets a slide in an agency pitch deck, and none of it can be skipped.

Measurement: Store Visit Conversions plus Offline Conversion Tracking

Directional and verified are different numbers. A sound measurement stack keeps them in separate columns.

Store Visit Conversions give directional data from Google's probabilistic signals: GPS, WiFi near the store, Google Maps usage, opted-in location history, and verified Google Business Profile (source: https://support.google.com/google-ads/answer/6100636?hl=en-GB). Directional, not audited POS revenue.

We layered the defensible number on top by importing the retailer's POS export through Offline Conversion Tracking. Transactions were matched back to the originating click via hashed email, and in-store revenue was fed to the bidding algorithm as a conversion value. Performance Max and Search Smart Bidding then had the signal to spend more in postcodes driving in-store revenue, not just online revenue.
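
A minimal sketch of the POS-to-upload transform, assuming a transaction export with customer_email, transaction_time, and total columns. The output columns are illustrative and should follow whatever import template the ad account expects:

```python
import csv
import hashlib

def normalise_and_hash(email: str) -> str:
    """SHA-256 of a trimmed, lower-cased email, the usual normalisation
    for hashed-identifier matching."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

def pos_export_to_upload(pos_csv: str, out_csv: str) -> None:
    """Turn a POS transaction export into hashed-email conversion rows so
    in-store revenue can be imported as conversion value."""
    with open(pos_csv, newline="") as src, open(out_csv, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["hashed_email", "conversion_time", "conversion_value", "currency"])
        for row in csv.DictReader(src):
            writer.writerow([
                normalise_and_hash(row["customer_email"]),
                row["transaction_time"],
                row["total"],
                "GBP",
            ])
```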

To use store visit data without over-claiming, keep the probabilistic estimates and the POS-matched revenue in separate columns.

The numbers

Once the measurement stack was live and the rebuild had eight weeks of data, the picture changed.

| Metric | Baseline | After rebuild (8 months) | Source |
| --- | --- | --- | --- |
| Blended ROAS | 1.5x (online-only) | 9.5x (online + POS-matched in-store) | POS-matched Offline Conversion Tracking |
| Store visits | Baseline | +215% year on year | Google Ads Store Visit Conversions (probabilistic) |
| Cost per store visit | Baseline | -40% | Google Ads Store Visit Conversions (probabilistic) |
| In-store revenue ratio | Not measured | £8 in-store per £1 online | POS-matched Offline Conversion Tracking |

BYLT client data, 2025: Blended ROAS reached 9.5x (online + POS-matched in-store revenue) after an eight-month rebuild, up from a 1.5x online-only baseline.

BYLT client data, 2025: Store visits increased 215% year on year (source: Google Ads Store Visit Conversions, probabilistic estimate).

BYLT client data, 2025: Cost per store visit fell 40% over the same period (source: Google Ads Store Visit Conversions, probabilistic estimate).

BYLT client data, 2025: The +£124k figure quoted elsewhere is composite incremental in-store revenue across the BYLT retail client portfolio, not this single retailer. Portfolio, not case. The single-retailer metric is the £8 in-store per £1 online ratio via POS-matched Offline Conversion Tracking.

What the data cannot tell you

The Google Store Visit figure is a modelled estimate. Know which number is which before you put either in a board presentation.

The Google Store Visit figure is a modelled estimate, useful for directional optimisation, not for a CFO conversation about incremental revenue. Anyone presenting a probabilistic store-visit number to finance as if it were audited revenue is over-claiming.

POS-matched Offline Conversion Tracking is the defensible number. It needs a POS that can export transaction-level data with a hashed customer identifier, plus legal sign-off on matching that identifier back to ad clicks. Around 60% of the multi-store retailers we speak to have this ready. The rest need a six-to-eight-week integration before the measurement layer can run.

How to evaluate a retail marketing agency before you sign

The four questions below separate a retail paid media specialist from a generalist. Ask them before you brief any agency.

  1. How do you handle attribution between online ad clicks and in-store revenue? The correct answer names Store Visit Conversions for directional data and Offline Conversion Tracking via POS integration for defensible revenue. If the answer stops at Store Visit Conversions, the agency is optimising against a modelled estimate only.
  2. Show me the postcode logic you would apply to our stores. If the response shows a radius map, ask about drive-time analysis and exclusions. Radius targeting is the default option in Google Ads. It is also the lazy option.
  3. How will you structure local and national campaigns? The split is the single most important structural decision in a multi-store account.
  4. What proof points can you share on hard store-visit metrics, not impressions or clicks? Ask for a named sector, store count, a before-and-after number, and the attribution method behind it.

Why this works across categories

The four-pillar methodology transfers across categories because the structural problem is the same: unmeasured in-store revenue creating the illusion of weak ROAS.

The same methodology has delivered comparable outcomes in pet supplies, automotive retail groups, and carpet retailers. Headline numbers move with the category, but the structural shape holds. Retailers assume their category is uniquely hard. The harder problem is almost always the measurement stack, not the channel mix.

For the full strategic view, the UK multi-store retailer paid media playbook covers the architecture end to end.

Sources

  1. Retail in Numbers, British Retail Consortium, https://brc.org.uk/market-intelligence/retail-in-numbers/ (accessed April 2026)
  2. About Store Visit Conversions, Google Ads Help, https://support.google.com/google-ads/answer/6100636?hl=en-GB (accessed April 2026)

Frequently asked questions

What counts as a retail marketing case study worth learning from?
A credible retail marketing case study names the client or sector, discloses the attribution method, and shows hard store-visit and in-store revenue numbers alongside how they were measured. A good case study distinguishes between probabilistic Google Store Visit estimates and verified POS-matched Offline Conversion Tracking data, and explains the campaign methodology clearly enough for a CMO to evaluate whether it would transfer to their own estate.
Can a retailer trust Google's Store Visit Conversion number?
For directional optimisation, yes. For a CFO conversation about incremental revenue, no. Store Visit Conversions are a probabilistic estimate built from GPS, WiFi signals near the store, opted-in location history, and verified Google Business Profile data. Useful for bidding signals and performance trends. POS-matched Offline Conversion Tracking is the defensible number for finance.
How long does a retail O2O rebuild take before results show?
Foundation work (Google Business Profile verification, Merchant Centre feed audit, NAP standardisation) takes two to four weeks. The Offline Conversion Tracking measurement layer takes four to six weeks depending on POS access and legal sign-off on hashed identifier matching. Bidding signals start to compound at week six to eight. Eight months is a reasonable window to judge blended ROAS improvement.
What blended ROAS is realistic for a multi-store UK retailer?
Category-dependent. As a shape: if online-only ROAS is 1.5x to 2x and in-store revenue is 70% of total, blended ROAS in the 6x to 10x range after a full rebuild is realistic. Below 4x blended after eight months usually means the measurement stack is still incomplete, not that the channel mix is wrong.
What metrics matter most in a multi-store retail case study?
For CMO-level evaluation: blended ROAS (online plus POS-matched in-store revenue divided by ad spend), cost per store visit, and the in-store-to-online revenue ratio. These three metrics together show whether the campaign is generating real in-store footfall and whether the attribution stack is sound enough to trust. Impressions and CTR are noise at this stage.
Is this methodology Meta or Google?
Predominantly Google Ads: Search, Shopping, Performance Max, and Local Inventory Ads, plus Google O2O measurement via Store Visit Conversions and Offline Conversion Tracking. Meta is a secondary layer added once the Google stack is producing clean signal and the measurement foundation is stable.