Let us say this first: we love Google Analytics (GA), we trust GA more than any alternative, and we hope that many of you are using GA.
Did you catch that? Yes? Then brace for the “bad news”:
Your Google Analytics is off, no matter how perfectly you set it up.
The news isn't as bad as it might seem (hence the quote marks above). Yes, digital businesses rely on data like drivers rely on windshields, so being told your GA is wrong might feel like someone swapping your windshield for a funhouse mirror while you're driving. But we're not talking about total system failure here; the level of wrongness we're describing is closer to someone tinting that windshield.
In Parts II and III, we'll break down the factors out of your control and the factors you CAN control (and fix). For now, here are two bite-size explanations of the “analytic accuracy problem” and why it happens naturally:
First: Google Analytics is an automated system which lacks human insight. This should be obvious, but there are still plenty of moments where (explicitly or implicitly) we expect perfection from software, especially when the name “Google” is stamped on it. But any programmer could summarize this whole problem in four letters: GIGO.
Garbage In, Garbage Out, meaning that imperfect data and/or imperfect programming will inevitably yield imperfect results (like the data-science equivalent of “play stupid games, win stupid prizes”). Yes, the people who built GA certainly know WTF they're doing, but the programming of GA isn't 100% perfect and, frankly, neither is the data being fed into GA from other sources (more on that in Part II).
The only alternative to programmatic GIGO is hiring a massive team of diligent, discerning accountants as GA account reps and data managers, which would still be GIGO (humans make mistakes too) and would also defeat the point of GA as software.
Second: some data points will be inaccurate, but this is “OK” because of GA's overall precision. Let's clarify the critical detail here: the difference between accuracy and precision.
Say you have a (perfect) 100g weight and a kitchen scale. If the scale is perfectly accurate, it will read 100g; the further the scale's readout deviates from 100g, the less accurate it is. But precision is different and totally independent of accuracy; if the scale is perfectly precise, it will always give the same readout (accurate or not) when weighing the same object. So if the scale says 100g every time, it's perfectly accurate AND perfectly precise; if the scale says 95g every time, it's only 95% accurate but it's still perfectly precise.
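To make that distinction concrete, here's a minimal Python sketch with made-up readings from two hypothetical scales: accuracy shows up as bias (how far the average readout sits from the true value), while precision shows up as spread (how consistent repeated readouts are).

```python
import statistics

TRUE_WEIGHT = 100.0  # grams; our (perfect) reference weight

# Hypothetical readouts from weighing the same 100g weight five times.
accurate_but_imprecise = [98.0, 103.0, 97.0, 102.0, 100.0]  # centers on 100g, but scattered
precise_but_inaccurate = [95.0, 95.0, 95.1, 94.9, 95.0]     # tight cluster, but biased low

for name, readings in [("accurate but imprecise", accurate_but_imprecise),
                       ("precise but inaccurate", precise_but_inaccurate)]:
    mean = statistics.mean(readings)     # accuracy: how close the average is to 100g
    spread = statistics.stdev(readings)  # precision: how tightly readings cluster
    bias = mean - TRUE_WEIGHT
    print(f"{name}: mean={mean:.1f}g (bias {bias:+.1f}g), spread={spread:.2f}g")
```

Run it and the first scale averages out to 100g but wobbles by a couple of grams per reading, while the second sits almost exactly 5g low every single time.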
As we'll explain further in Parts II and III, there are plenty of factors that interfere with the accuracy of GA's compiled data, and many of those factors are also beyond GA's control. But the usefulness of Google Analytics rests on one piece of statistical wisdom: if perfectly accurate data isn't possible, you can still trust precise measurement tools to yield meaningful nuggets of insight. (Like the milder data-science equivalent of “you can trust a liar to lie.”)
Go back to the kitchen scale: if you place your 100g weight on it and the scale reads 95g, it's inaccurate. But if it always reads 95g, at least you can trust the scale to be wrong in a consistent way (which you can then correct in your head as needed). That is, in a nutshell, the same reason we still trust Google Analytics.
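That mental correction can even be written down. Here's a toy sketch, assuming (as above) a scale whose bias stays perfectly consistent: calibrate once against a known weight, then rescale every later reading. The numbers and the `corrected` helper are purely illustrative, not anything GA actually does.

```python
KNOWN_WEIGHT = 100.0  # grams; the reference weight used for calibration
observed = 95.0       # what the consistently-wrong scale reports for it

# One-time calibration factor; only valid while the bias stays consistent.
correction = KNOWN_WEIGHT / observed

def corrected(reading: float) -> float:
    """Rescale a raw readout from the biased-but-precise scale."""
    return reading * correction

print(corrected(95.0))   # -> 100.0
print(corrected(190.0))  # -> 200.0
```

The same logic is why precise-but-inaccurate analytics remain useful: a consistent skew can be reasoned around, while a random one cannot.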