This post is about listings data quality – i.e. the quality of service your commercial real estate (CRE) brokerage's tenants are getting.
If you’ve read the post “How does your CRE brokerage’s data quality score?”, you’re probably thinking: “Okay, so how does this compare with my brokerage?”
We take pleasure in answering that question below.
We took a brokerage’s data, and mapped it against all the available vacant stock (thousands of units, and millions of square meters) sitting in the Gmaven data feed.
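At its core, this kind of comparison is a reconciliation exercise: for each unit the brokerage lists, check whether it still appears in the market feed, and for each vacant unit in the feed, check whether the brokerage is listing it at all. A minimal sketch of that idea is below – the field names (`unit_id`) and the flat list-of-dicts shape are illustrative assumptions, not Gmaven's actual schema or matching logic.

```python
# Hypothetical sketch: reconciling a brokerage's listings against a market feed.
# "unit_id" as the matching key is an assumption for illustration only.

def reconcile(brokerage_units, feed_units):
    """Compare a brokerage's listed units against the full vacant-stock feed."""
    brokerage_ids = {u["unit_id"] for u in brokerage_units}
    feed_ids = {u["unit_id"] for u in feed_units}

    return {
        "matched": brokerage_ids & feed_ids,  # listed and still vacant
        "stale": brokerage_ids - feed_ids,    # listed, but no longer in the feed
        "missed": feed_ids - brokerage_ids,   # vacant stock the brokerage isn't listing
    }

# Toy example with made-up unit IDs:
brokerage = [{"unit_id": "A1"}, {"unit_id": "B2"}]
feed = [{"unit_id": "A1"}, {"unit_id": "C3"}]
print(reconcile(brokerage, feed))
```

The "stale" bucket is where the egg-on-face risk lives (marketing space that is no longer available), while "missed" represents mandate opportunities the brokerage isn't seeing.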
Together with a brokerage, we chose an area and ran the tests. Here are the results:
We focused on one aspect of the data: units (aka vacancies or listings).
Here the stakes are high (poor listings data quality = egg-on-face risk), and the velocity and volume of data changes are at their greatest.
Below is another way of showing both the findings and outcomes available to the customer:
CRE is one of the world’s most data-complex industries.
As such, it is incredibly difficult for humans to maintain the high volume of CRE data by hand, without powerful data tools and specialist processes.
So, there is no shame in the findings. Listings / units / vacancies data quality is notoriously hard to manage.
And the good news… efficiency solutions exist.