When a customer searches "plumber near me," Google does not show them a list of every plumber in the city. It shows them three. Those three slots, the local Map Pack, are where 70% of local search clicks go. Everything below the pack might as well be on page two.
Reviews are one of the strongest signals Google uses to decide which three businesses earn those slots. This post breaks down exactly how the algorithm uses reviews, what matters more than you think, and what matters less.
What is the Map Pack and why does it matter
The Map Pack is the boxed set of three local businesses Google shows above organic search results when the query has local intent. Most "plumber near me," "best dentist," "barbershop downtown" searches trigger it.
The pack is not a separate algorithm from organic search, but it weights local signals much more heavily. A business with strong local SEO can outrank a business with stronger overall SEO when the search has local intent.
Google's three local ranking factors
Google has openly stated the local ranking factors: relevance, distance, and prominence.
Relevance is how well your business categories and listing match what the user searched. Distance is exactly what it sounds like: how close your business is to the searcher. Prominence is the catch-all for "how well-known and trusted is this business?"
Reviews live almost entirely inside prominence, and prominence is where you have the most leverage. Distance is fixed. Relevance depends on getting your Google Business Profile categories right (a one-time setup). Prominence is the lever you pull every month.
How reviews feed into prominence
Google has confirmed publicly that reviews affect local rankings, but it has been intentionally vague about the exact weighting. Based on Google's own statements and the correlations the local SEO industry has observed across thousands of businesses, four review signals matter:
Review quantity. The total number of reviews your profile has. More is better, but with diminishing returns past a few hundred.
Review quality. Your average star rating. The difference between 4.6 and 4.8 is real, but the difference between 4.0 and 4.6 is much larger. Below 4.0, the algorithm appears to treat your profile as a risky result, and visibility drops off sharply.
Review recency. How recently your last few reviews were posted. A business with 200 reviews from 2 years ago will lose to a business with 60 reviews from the last 3 months.
Owner response rate. How consistently you reply to reviews. This signals that the business is active and engaged.
The factor most owners underrate: review velocity
Review velocity is the rate at which new reviews come in over time. It is the single most underrated signal because most business owners think of reviews as a one-time campaign instead of a steady stream.
A business that gets 5 to 10 new reviews per month, every month, will usually outrank a business that got 100 reviews in a single month and nothing since. Google's algorithm reads a steady stream as evidence of an active, healthy business. A spike followed by silence looks suspicious, sometimes literally: sudden review bursts can trigger Google's fake-review filters.
The actionable takeaway: aim for consistency, not volume. Five reviews a month for two years beats sixty reviews in one month and nothing after.
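To make "consistency, not volume" concrete, here is a minimal Python sketch that buckets review dates by month and applies a steadiness check. The thresholds (six months, three reviews per month) are illustrative assumptions for this sketch, not values Google has published.

```python
from datetime import date
from collections import Counter

def monthly_velocity(review_dates):
    """Count reviews per (year, month). review_dates is a list of date objects."""
    return Counter((d.year, d.month) for d in review_dates)

def is_steady(review_dates, months=6, minimum=3):
    """Rough steadiness check: reviews arrived in at least `months` distinct
    months, with at least `minimum` in each of the most recent ones.
    Thresholds are illustrative, not values Google has published."""
    counts = monthly_velocity(review_dates)
    recent = sorted(counts)[-months:]
    return len(recent) == months and all(counts[m] >= minimum for m in recent)
```

Run against a profile with five reviews a month for six months and a profile with a hundred reviews in one month, the check passes the first and fails the second.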
Why your average star rating matters less than you think (above 4.0)
Once you are above 4.0 stars, the gap between 4.4 and 4.7 has a small effect on rankings. The gap between 3.8 and 4.2 has a large one. Below 4.0, your visibility drops sharply.
That means the highest-leverage move for most businesses is not chasing 4.9 — it is making sure no individual one-star or two-star review goes unanswered. Owner responses on bad reviews can soften the impact for both the algorithm and human readers.
Geographic spread of reviews
This is a more advanced factor, but worth knowing about. Reviews from customers across the geographic area you serve carry more weight than reviews concentrated in a single neighborhood. Google interprets geographic spread as evidence that you serve a broader market.
For service businesses with techs going to customer addresses, this happens naturally. For brick-and-mortar businesses with a fixed location, it is harder, and you should not stress about it. Just keep collecting reviews and the geographic patterns will sort themselves out.
Owner responses: an underused signal
Replying to reviews, especially negative ones, is the cheapest local SEO move available. Google's own local-ranking guidance tells businesses to manage and respond to their reviews. The mechanism is straightforward: a business that replies to reviews is engaged, which makes it a higher-quality search result.
Reply to every review you get. Positive ones can be short ("Thanks, Sarah, glad we could help"). Negative ones should be calm and offer to take the conversation offline.
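A lightweight way to make "reply to every review" stick is a pair of templates keyed off the star rating. The wording and the `draft_reply` helper below are hypothetical; adapt them to your own voice.

```python
# Hypothetical reply templates -- tune the wording to your brand voice.
TEMPLATES = {
    "positive": "Thanks, {name} -- glad we could help!",
    "negative": ("Hi {name}, we're sorry this visit fell short. "
                 "Please call us at {phone} so we can make it right."),
}

def draft_reply(name, rating, phone="555-0100"):
    """Pick a template by rating: 4-5 stars gets the short thank-you,
    anything lower gets the calm take-it-offline reply."""
    kind = "positive" if rating >= 4 else "negative"
    return TEMPLATES[kind].format(name=name, phone=phone)
```

Templates are a starting point, not the whole reply: personalize the negative ones before posting.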
Putting it together: a sustainable review pipeline
Translating the algorithm into practice gives you four habits:
Send a review request after every completed job. Not "after most" — after every. The single biggest reason businesses have low review counts is that asking is inconsistent.
Reply to every review within 24 to 48 hours. Use templates so you do not have to think about it.
Aim for 5 to 10 new reviews per month, sustained. That is a more useful goal than "300 reviews total."
Watch for reviews that mention specific services or keywords. A review that says "great kitchen sink repair" reinforces relevance for that exact search term.
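The first two habits can be checked with a simple audit script. The sketch below assumes a hypothetical data shape (jobs carrying a `request_sent` flag, reviews carrying a `posted` timestamp and `replied` flag); swap in whatever your job-tracking and review tools actually export.

```python
from datetime import datetime, timedelta

REPLY_SLA = timedelta(hours=48)  # the 24-to-48-hour reply window from habit two

def audit(jobs, reviews, now):
    """Return (job ids missing a review request, review authors whose
    reviews have gone unanswered past the reply window).
    `jobs` and `reviews` are lists of dicts in a hypothetical shape."""
    missing_requests = [j["id"] for j in jobs if not j["request_sent"]]
    overdue_replies = [r["author"] for r in reviews
                       if not r["replied"] and now - r["posted"] > REPLY_SLA]
    return missing_requests, overdue_replies
```

Run weekly, the two lists become your to-do list: send the missing requests, clear the overdue replies.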
Reviews are not a one-time SEO project. They are the local equivalent of compound interest: small consistent inputs, surprising long-term outputs.
Nudge automates the after-job request and follow-up so the consistency happens without you remembering. The algorithm does the rest.