Because geolocation data fluctuates over time, we've developed the normalization methodology outlined below.
Why is normalization necessary?
All location data providers see fluctuations in the volume and composition of devices and apps. People purchase new smartphones and retire old ones every day. New apps are created, while others fall out of favor. This adds variability to the data from any location data provider. Normalization is the statistical process of removing this variability by “anchoring” the data against a known data source and generating weights to create a more consistent, accurate, and reliable data signal.
How does Zartico normalize location data?
We perform a two-pass normalization process: first, we normalize national device counts against the national population; second, we perform a client-specific normalization against the resident population within a given destination.
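To make the idea concrete, here is a minimal sketch of how a two-pass anchoring weight could be computed. All numbers, variable names, and the multiplicative scaling at the end are illustrative assumptions, not Zartico's actual (proprietary) methodology.

```python
# Illustrative two-pass normalization sketch. Every value below is made up.

def normalization_weight(benchmark: float, observed: float) -> float:
    """Weight that rescales an observed count to a known benchmark."""
    return benchmark / observed

# Pass 1: anchor national device counts against the national population.
national_population = 330_000_000        # assumed benchmark
observed_national_devices = 30_000_000   # assumed size of the device panel
national_weight = normalization_weight(national_population,
                                       observed_national_devices)

# Pass 2: anchor the destination's observed devices against its
# resident population (the client-specific pass).
resident_population = 500_000            # assumed destination benchmark
observed_resident_devices = 40_000       # assumed devices resident in destination
destination_weight = normalization_weight(resident_population,
                                          observed_resident_devices)

# One simple (assumed) way a raw device count could then be scaled:
raw_visitor_devices = 1_250
estimated_visitors = raw_visitor_devices * destination_weight
```

In this sketch each weight is just benchmark ÷ observed, so a panel that captures 8% of a destination's residents would scale raw counts up by 12.5×; a production pipeline would compute such weights per time period and per segment rather than as single constants.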
How do we know it works?
Our validation process measures the correlation between visitor counts to specific POIs, or groups of POIs, and another trusted data source, such as STR hotel demand or occupancy. Confirming the location data signal against a trusted external benchmark, and quantifying our accuracy with external sources, gives us greater confidence in the reliability and consistency of the data you see in your modules and dynamic visualizations.
Need more help? Talk to a real human.