Why Most Property "Scores" Are Useless
Type a postcode into almost any UK property site and you'll get a number out of ten. It'll feel authoritative. It usually isn't. The formulas are black boxes, the inputs are undocumented, and — as we recently discovered in our own product — the number you see on the webpage sometimes disagrees with the number on the downloadable report.
We fixed it. Here's exactly how our investment score works, what goes in, and what we've done to make the web page and the PDF report tell you the same story every time.
The Ten Inputs
Our investment score combines ten independently-weighted data sources. Every input is pulled from an official or industry-recognised UK register — no mystery "proprietary data."
Any input we can't retrieve is simply dropped and the remaining weights are rescaled. So a rural postcode with no TfL data isn't punished; its score reflects the categories we actually have evidence for. We then publish a data-completeness figure alongside the score so you know how much evidence sits behind the number.
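The drop-and-rescale rule fits in a few lines. This is an illustrative sketch, not our actual module: the input names and weights below are invented for the example.

```typescript
// Illustrative sketch of the drop-and-rescale rule. Input names and
// weights are invented for the example, not our real configuration.
type InputScore = { weight: number; score: number | null }; // 1-10, null = unavailable

function combine(inputs: Record<string, InputScore>): { score: number; completeness: number } {
  const all = Object.values(inputs);
  const present = all.filter((i) => i.score !== null);
  const weightPresent = present.reduce((s, i) => s + i.weight, 0);
  const weightTotal = all.reduce((s, i) => s + i.weight, 0);
  // Missing inputs are dropped; the remaining weights renormalise.
  const score = present.reduce((s, i) => s + i.weight * (i.score as number), 0) / weightPresent;
  return { score, completeness: weightPresent / weightTotal };
}

// A rural postcode with no transport data isn't punished for the gap:
const rural = combine({
  rentalYield: { weight: 0.3, score: 8 },
  priceGrowth: { weight: 0.3, score: 6 },
  safety:      { weight: 0.2, score: 7 },
  transport:   { weight: 0.2, score: null }, // dropped, not scored as 0
});
// rural.score ≈ 7.0, rural.completeness ≈ 0.8
```

The key property: a missing input never drags the average toward zero, it just narrows the evidence base, and the completeness figure tells you by how much.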
Safety Is Not a Count
The single biggest trap in property-score design is treating crime as a raw count. "500 incidents" means one thing in a rural village and something entirely different in central London. A count without a benchmark is noise.
We use ONS percentile bands, each postcode's crime rate expressed as a percentage of the regional average, to bucket it into Low / Below Average / Average / Above Average / High. Those are the same buckets the Home Office uses in its own crime-statistics releases.
That categorical level then maps directly to a 1-10 safety score.
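As a sketch of the idea (the numeric values here are guesses, not our published mapping), the safety score is a pure lookup on the crime band:

```typescript
// The numbers below are illustrative guesses, not the published mapping.
// The invariant that matters: the safety score is DERIVED from the crime
// band, so the badge and the breakdown cannot disagree by construction.
type CrimeBand = "LOW" | "BELOW AVERAGE" | "AVERAGE" | "ABOVE AVERAGE" | "HIGH";

const SAFETY_BY_BAND: Record<CrimeBand, number> = {
  "LOW": 10,
  "BELOW AVERAGE": 8,
  "AVERAGE": 6,
  "ABOVE AVERAGE": 4,
  "HIGH": 2,
};

function safetyScore(band: CrimeBand): number {
  return SAFETY_BY_BAND[band];
}
```

Because the score is derived from the band rather than computed by a second formula, a "BELOW AVERAGE" badge sitting next to a 1/10 safety score is structurally impossible.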
If you see "BELOW AVERAGE" on the crime-level badge, the safety number in the investment breakdown now has to agree — because it's derived from that same badge. Until recently that wasn't true in our product, and a well-meaning bug produced Safety scores of 1/10 in postcodes that were in fact below-average for crime. If you downloaded a report in the last couple of months and saw a disagreement, it's been fixed. Re-generate the report and the two will line up.
Why Yield and Growth Are Weighted Highest
Because those are the two numbers an investor actually cares about, and they're the two things people most often get wrong when eyeballing a property.
For rental yield, we take the median sold price in the postcode over the last 24 months and divide annualised median rent by it. The UK gross-yield average sits around 5%. Anything under 2% scores a 1, anything above 8% scores a 10 — a two-percentage-point gap in yield is genuinely enormous over a ten-year hold.
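As a sketch: the formula and the 2%/8% anchors are from the description above, but the linear ramp between the anchors is our assumption for illustration.

```typescript
// Sketch of the yield step. The 2%/8% anchors and the rent-over-price
// formula are as described; the linear ramp in between is assumed.
function grossYield(annualMedianRent: number, medianSoldPrice: number): number {
  return annualMedianRent / medianSoldPrice;
}

// Under 2% scores 1, above 8% scores 10, linear in between (assumed).
function yieldScore(y: number): number {
  if (y <= 0.02) return 1;
  if (y >= 0.08) return 10;
  return 1 + ((y - 0.02) / 0.06) * 9;
}

// A postcode near the ~5% UK average lands mid-scale:
// yieldScore(grossYield(12_000, 240_000)) ≈ 5.5
```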
For price growth, we annualise the 5-year change from UK HPI data if we have it, and fall back to the 1-year change if we don't. Five-year annualised growth smooths out post-stamp-duty-holiday noise and covid dips. A rule of thumb: 3-5% pa is typical; sub-zero is a yellow flag; double-digit is usually mean-reverting.
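The annualise-with-fallback step can be sketched as follows; the function and parameter names are ours, not the real module's.

```typescript
// Sketch of the growth step: annualise the 5-year HPI change when we
// have it, fall back to the 1-year change when we don't. Names assumed.
function annualisedGrowth(
  priceNow: number,
  price5yAgo: number | null,
  price1yAgo: number,
): number {
  if (price5yAgo !== null) {
    // Compound annual growth rate over the five-year window.
    return Math.pow(priceNow / price5yAgo, 1 / 5) - 1;
  }
  return priceNow / price1yAgo - 1;
}

// 200k -> 250k over five years is roughly 4.6% pa: typical, not a flag.
```

Note why the compounded form matters: 25% over five years is not 5% a year, because each year's growth builds on the last.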
Why the Report and the Web Page Now Agree
Until last week, we had three copies of the scoring logic: one behind the web page, one behind the API, one behind the PDF generator. Each copy was slightly different. That's how drift starts — and why the crime mismatch slipped through.
We've consolidated everything into a single module (`property-metrics.ts`) that takes the raw data from HM Land Registry, Police.uk, Ofsted, and the rest, and produces a single normalised object. The web page renders it. The PDF renders it. The API returns it. There's now exactly one place where the formulas live, and exactly one source of truth.
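The shape of that normalised object can be sketched like this. The interface and field names are illustrative; only the module name `property-metrics.ts` is real.

```typescript
// Illustrative shape only: the interface and field names are a sketch,
// not the real property-metrics.ts exports.
interface AreaMetrics {
  postcode: string;
  investmentScore: number;           // headline 1-10 figure
  completeness: number;              // share of inputs we actually had
  breakdown: Record<string, number>; // e.g. { yield: 9, safety: 3 }
}

// Every surface renders from the same object, so a formula change in
// the one scoring module reaches the page, the PDF, and the API at once.
function renderHeadline(m: AreaMetrics): string {
  return `${m.postcode}: ${m.investmentScore.toFixed(1)}/10`;
}
```

The design choice is the usual one for killing drift: compute once, render many times, and never let a renderer own a formula.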
What this means for you: the score on the page, the score in the PDF, and the score from the API are always the same calculation on the same data; a formula fix lands everywhere at once instead of drifting between copies; and regenerating an older report will now match the web page.
What's Not in the Score (And Why)
A few things people often ask us to include, and why we don't:
Planning applications. Too noisy. A single large application two streets away can move a count by 50 without telling you anything about the property. We publish them on the report but don't feed them into the score.
Council tax band. Reflects historical valuations from 1991, not current desirability. We show the band so you can budget; we don't pretend it's a quality signal.
"Buzz" or sentiment. Some sites feed Twitter/Reddit mentions into their scores. That rewards areas with lots of Londoners complaining about them and punishes quiet places where things just work. Hard pass.
Subjective category weightings by "investor persona". Yield-first, capital-first, yield-only, BTL, HMO... we tested personas and found they mostly just re-rank the same top-quartile areas by a few places. We prefer to show you the breakdown and let you decide what matters, rather than pretending our weights for a "young professional" buyer are somehow authoritative.
How to Read a Score
A few practical notes when you're looking at one of ours: check the data-completeness figure before trusting the headline, and whatever the number says, look at the breakdown. A 6.0 with Yield 9 / Safety 3 is a very different property from a 6.0 with Yield 5 / Safety 6.
Try It
Pull a free area report at postcodeproperty.ai. The score on the page and the score in the PDF are now literally the same calculation — we stake the product on it.
If you ever spot a disagreement, email us the postcode. We want to know.