CRE brokerage listings data quality – area test

A (confidential) brokerage could not believe our published findings on a major CRE data overhaul project. The best way to establish the truth was to run a similar process on this brokerage’s “clean data”. Check our fairly startling findings below…
Commercial property vacancies data quality

This post is about listings data quality – i.e. what quality of service the tenants of my commercial real estate (CRE) brokerage are getting.

If you’ve seen the post “How does your CRE brokerage’s data quality score?”, you are probably thinking: “Okay, so how does this compare with my brokerage?”

You would not be the first to ask this! We take pleasure in answering your question below.

We took a brokerage’s data for a specific area, and mapped it against all the available vacant stock (thousands of units, and millions of square meters) sitting in the Gmaven data feed.

Step 1

We chose an area together with a brokerage:

Trial area for vacancy data quality in SA

Step 2

We ran the tests.

Here are the results:

CRE data quality area POC

We focused on one aspect of the data: units (aka vacancies or listings).

Here the stakes are high (poor listings data quality = egg-on-face risk), and the velocity and volume of data changes are at their highest.

Below is another way of showing both the findings and outcomes available to the customer:

CRE data quality area assessment

Walking you through the above

  • 162 units came back within the “net thrown” – see step 1 above

Of these…

  • 22 had the wrong rental (e.g. asking R150 / sqm when the rental is actually R130 / sqm)
  • 20 units had been double captured (e.g. pockets of space duplicated)
  • 52 simply had out-of-date or wrong information (e.g. 55 sqm of space listed as vacant when 550 sqm is actually vacant, or the space is no longer public but still shows as vacant)
  • 34 units did not match the public vacancies – they may be private stock
  • 6 units didn’t have enough information to validate (they appeared to be incompletely captured)

This left just 28 of the original 162 listings that were reliable.

We also identified, alarmingly:

  • 76 units that were vacant but hadn’t been captured – i.e. they would never have been introduced to tenants, could not feed to websites (assuming the brokerage wanted that), and would never appear in a property brochure

Bottom line, for that specific area, excluding possible “private stock” (including shadow space), there were actually 104 vacant units.
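The arithmetic in the walkthrough above can be checked with a short sketch (all category counts are taken directly from the findings above):

```python
# Tally of the area-test findings, using the counts from the walkthrough above.
captured = 162  # units that came back within the "net thrown" (step 1)

problems = {
    "wrong rental": 22,
    "double captured": 20,
    "out of date / wrong information": 52,
    "no public match (possible private stock)": 34,
    "insufficient information to validate": 6,
}

reliable = captured - sum(problems.values())
print(reliable)  # 28 reliable listings remain

uncaptured = 76  # vacant units that were never captured at all
actual_vacant = reliable + uncaptured  # excluding possible private stock
print(actual_vacant)  # 104 actually vacant units
```

Note that the 34 unmatched units are excluded from the final total, since they may be legitimate private stock rather than data errors.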

In summary

CRE is one of the world’s most data-complex industries.

As such, it is extremely difficult for humans to maintain high volumes of CRE data by hand, without powerful data tools and specialist processes.

So there is no shame in these findings. Listings / units / vacancies data quality is notoriously hard to manage.

And the good news… efficiency solutions exist.
