CRE brokerage listings data quality – area test


This post is about listings data quality – i.e. what quality of service a commercial real estate (CRE) brokerage's tenants are getting.

If you’ve seen our post “How does your CRE brokerage’s data quality score?”, you are probably thinking: “Okay, so how does this compare with my brokerage?”

We’re pleased to answer that question below.

We took a brokerage’s data, and mapped it against all the available vacant stock (thousands of units, and millions of square meters) sitting in the Gmaven data feed.
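The mapping step can be sketched in code. The snippet below is a minimal, hypothetical illustration of comparing a brokerage’s listings against a reference feed of vacant units – the field names (`unit_id`, `gla_sqm`) and the matching logic are illustrative assumptions, not Gmaven’s actual method or data model.

```python
def compare_listings(brokerage, reference):
    """Classify each reference unit as matched, missing, or stale.

    Hypothetical sketch: a unit is 'missing' if the brokerage does not
    list it at all, and 'stale' if it is listed but the details differ
    from the reference feed.
    """
    by_id = {u["unit_id"]: u for u in brokerage}
    matched, missing, stale = [], [], []
    for unit in reference:
        ours = by_id.get(unit["unit_id"])
        if ours is None:
            missing.append(unit)       # vacancy not listed at all
        elif ours["gla_sqm"] != unit["gla_sqm"]:
            stale.append(unit)         # listed, but details out of date
        else:
            matched.append(unit)
    return matched, missing, stale


# Toy data, for illustration only.
reference = [
    {"unit_id": "A1", "gla_sqm": 250},
    {"unit_id": "B2", "gla_sqm": 1200},
    {"unit_id": "C3", "gla_sqm": 480},
]
brokerage = [
    {"unit_id": "A1", "gla_sqm": 250},
    {"unit_id": "B2", "gla_sqm": 900},  # out-of-date unit size
]

matched, missing, stale = compare_listings(brokerage, reference)
print(len(matched), len(missing), len(stale))  # prints: 1 1 1
```

In practice the matching is far harder than an ID lookup – units get renamed, split, and merged – which is part of why this data is so difficult to keep clean.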

Step 1

We chose an area together with a brokerage:

[Image: Trial area for vacancy data quality in SA]

Step 2

We ran the tests.

Here are the results:

[Image: CRE data quality area POC]

We focused on one aspect of the data: units (aka vacancies or listings).

Here the stakes are big (poor listings data quality = egg-on-face risk), and the velocity and volume of data changes are highest.

Below is another way of showing both the findings and outcomes available to the customer:

[Image: CRE data quality area assessment]

In summary

CRE is one of the world’s most data-complex industries.

As such, without powerful data tools and specialist processes, it is incredibly difficult for humans to maintain the high volume of CRE data by hand.

So, there is no shame in the findings. Listings / units / vacancies data quality is notoriously hard to manage.

And the good news: efficiency solutions exist.
