CRE brokerage listings data quality – area test

This post is about listings data quality – i.e. what quality of service your commercial real estate (CRE) brokerage’s tenants are getting.

If you’ve read our post “How does your CRE brokerage’s data quality score?”, you are probably thinking “Okay, so how does this compare with my brokerage?”

You would not be the first to ask. We take pleasure in answering that question below.

We took a brokerage’s data for a specific area, and mapped it against all the available vacant stock (thousands of units, and millions of square meters) sitting in the Gmaven data feed.

Step 1

We chose an area together with a brokerage:

Trial area for vacancy data quality in SA

Step 2

We ran the tests.

Here are the results:

CRE data quality area POC

We focused on one aspect of the data: units (aka vacancies or listings).

Here the stakes are high (poor listings data quality = egg-on-face risk), and the velocity and volume of data changes are at their greatest.

Below is another way of showing both the findings and outcomes available to the customer:

CRE data quality area assessment

Walking you through the above

  • 162 units came back within the “net thrown” – see step 1 above

Of these…

  • 22 had the wrong rental (e.g. asking R150 / sqm, when the rental is actually R130 / sqm)
  • 20 units had been double captured – (e.g. pockets of space duplicated)
  • 52 simply had out-of-date / wrong information (e.g. 55 sqm of space listed vacant when 550 sqm is actually vacant, or the space is no longer public but is still sitting there as vacant)
  • 34 units did not match the public vacancies – they may be private stock
  • 6 units didn’t have enough info to validate (appeared to be incompletely captured)

This left 28 of the original 162 listings that were reliable.

We also identified, alarmingly:

  • 76 units that were vacant, but hadn’t been captured. I.e. they would never have been introduced to tenants, could not feed to websites (assuming desired by brokerage), and would never display in a property brochure

Bottom line, for that specific area, excluding possible “private stock” (including shadow space), there were actually 104 vacant units.
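For the numerically minded, the breakdown above can be sanity-checked with a few lines of arithmetic. This is purely illustrative – the variable names are ours, and the figures are the ones reported for this test area:

```python
# Illustrative check of the area-test figures quoted above.
captured = 162          # units returned within the "net thrown" (step 1)

# Problem categories found among the captured units:
wrong_rental = 22       # incorrect asking rental
double_captured = 20    # duplicated pockets of space
out_of_date = 52        # stale or wrong information
no_public_match = 34    # no match to public vacancies (possibly private stock)
incomplete = 6          # too little information to validate

unreliable = wrong_rental + double_captured + out_of_date + no_public_match + incomplete
reliable = captured - unreliable      # 162 - 134 = 28 reliable listings

missed = 76             # vacant units that were never captured
actual_vacant = reliable + missed     # 28 + 76 = 104 vacant units

print(reliable, actual_vacant)
```

In other words, only 28 of the 162 captured listings held up, while the true vacant count (excluding possible private stock) was 104.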

In summary

CRE is one of the world’s most data-complex industries.

As such, without powerful data tools and specialist processes, it is incredibly difficult for humans to maintain the high volume of CRE data by hand.

So, there is no shame in the findings. Listings / units / vacancies data quality is notoriously hard to manage.

And the good news… efficiency solutions exist.
