Optimisation within marketing channels, conversion rate optimisation, attribution modelling: these practices have become possible, at least in their current form, thanks to the huge and rapid growth in the data points available to us over the years.
There is a popular saying within data fields, though: “Garbage in, garbage out.” In other words, the output is only as reliable as the input. The saying is never truer than when that garbage forms the foundation of the decisions you make about your marketing spend.
In digital marketing, we often talk about being data-driven or data-informed. That’s all well and good. However, considering how common it is for analytics data to be inaccurate or misleading, I often wonder whether every business truly benefits from using its analytics data as the foundation for budget-related decisions. John Wanamaker (1838-1922) is famously credited with the line:
Half the money I spend on advertising is wasted; the trouble is I don’t know which half.
Are we working under a pretence that what Wanamaker said is less relevant now, thanks to the many data points we have access to and the shiny dashboards we cast onto screens around the office? A positive trend is all that matters, right? Again, I have to wonder whether, and say it quietly, we wouldn’t sometimes be better off just trusting our hunches.
Let me elaborate. If your measurement practices aren’t up to scratch (and let me assure you, a lot aren’t!), then guess what? Yep, garbage in… What do I mean by measurement practices? Well, unless you’ve absolutely nailed the practices below at the point of data collection, everything built on that data is, at best, masquerading as useful insight. At worst, it’s tightening the noose around your business’s neck: decisions about budget allocation keep being made on poor-quality data, allowing your competitors to steal a march on that all-important market share.
A simple way to remember how to ensure good data quality is to follow these steps: Do Audits, Track Anomalies. You read that right: it conveniently abbreviates to D.A.T.A.!
I genuinely can’t remember the last time I opened a Google Analytics property and saw a flawless setup. In an ideal world you’d automate this, utilising the Google Analytics Reporting API, although a manual audit can sometimes uncover things an automated check would miss. So, how do you audit a Google Analytics property or view? Let me explain how I tend to structure an audit, though note that more important than how you do an audit is that you do one. An audit can often be quite in-depth, so I’m just going to cover what, in my opinion, are the key aspects under each heading:
- Here, the key is to cover all things related to settings. Does your hierarchy of account > property > view make sense? Google has solid documentation on how best to structure this but, generally speaking, the account should be the brand or business, a property should be a site, an app (or now both in one!) and a view a subset of that data. You also can’t go wrong following Google’s own guidelines on views: at a minimum, have one master view for reporting and insights, one test view for testing changes before releasing them to the master view and, finally, one completely raw view which remains unfiltered and serves as a backup in the event something goes wrong with the master view. There are plenty of reasons your view setup could be more complex than this, but it’s good to have a minimum requirement, which often helps when starting a new property.
- Have you correctly linked GA to your ads account?
- Are your referral exclusion lists up to date? If, say, you run an ecommerce store and the user flow from entrance to purchase can pass through a payment provider (such as PayPal), you need to add that provider to your referral exclusion list. Otherwise, it will soon become one of your biggest referral sources instead of the channel which actually brought the visitor to your site, which is really frustrating when you’re trying to evaluate channel performance.
- An often ignored and poorly maintained part of the settings section is user access. From a data governance point of view, it’s essential that only the necessary people have access to GA, and at the required level. When giving access to external parties, consider whether it’s enough to grant access to, for example, a single view which contains all the information they need but no more.
- Is Ecommerce and, if relevant, Enhanced Ecommerce enabled on all necessary views? A mistake I’ve learned from in the past is not enabling it in a test or raw view, which leaves you no way to troubleshoot ecommerce data when you suspect a filter or setting in the main view is messing with your transactional data.
- Have you enabled site search? Visitors are literally telling you what they want to buy from you in that magical search box. If you’re not collecting and analysing that data, you’re missing a golden opportunity to learn about your visitors’ intentions, as well as an indicator of how easy your site is to navigate.
- Filters. I won’t go into great detail about the various filters you may want to set up, but you would often, for example, exclude internal traffic from your main view so it doesn’t skew your data. Another example is ensuring mixed casing doesn’t split your data, so that a search for “Blue jeans” is treated the same as one for “blue jeans”.
- Ensure your goals are recording data and, more importantly, correct data. Thorough testing should be done with goals to ensure a correct setup.
- A thorough review of your channel setup should be done here too, ensuring channels are defined as per your business needs.
- Is your share of new users particularly high or increasing over time? Are you seeing a high portion of new users on a particular browser? This could be a result of recent changes in browser technologies, particularly prevalent in newer versions of Safari, where ITP has started to take visible effect in the data coming into GA. It could then be time to look at mitigating the effects of ITP, for example by setting cookies server-side, thus avoiding the 24-hour/7-day cap on impacted marketing cookies.
- Are you seeing high amounts of referral spam in your reports? Check Acquisition > All Traffic > Referrals and look out for any strange-looking sites.
- I’m not particularly a fan of the bounce rate metric in GA, at least not as it comes out of the box. It can be semi-useful if you have an effective way to track user engagement beyond page navigation. With that said, bounce rate is, in my opinion, a great way to spot tracking errors. Too high a bounce rate (>90%) can indicate issues with cookie persistence, whereas too low a bounce rate (<10%) more often than not indicates duplicated tracking, a very common issue. Another cause of an unrealistically low bounce rate is misunderstanding the nonInteraction field when setting up event tracking. For example, say you want to track an auto-play video on a landing page: if you don’t set the nonInteraction field to “true”, every visit which triggers that event counts as engaged, giving the page a 0% bounce rate.
- Collecting anything deemed PII* (personally identifiable information) in GA is strictly against the Google Analytics terms of service and can result in having your account closed down. See Google’s advice on this to ensure you’re not doing anything untoward in this area.
- Duplicate transactions. I mentioned the phrase “Garbage in, garbage out” earlier. Duplicate transactions, like bounce rate, are one of my favourite data quality garbage detectors: this is one of my go-to reports when doing a quick scan of a property, as the presence of duplicates tends to indicate further data quality issues. Out of the box, GA doesn’t offer a standard report for this, so head over to custom reports and create one with transaction ID as the dimension and transactions as the metric. In usual circumstances you’d expect to see one transaction per transaction ID. There are valid cases where multiple transactions can share an ID, such as where a user can add items to an existing transaction at a later date, but these are rare. So, if you’re seeing a lot of transaction IDs with more than one transaction, you have a problem which needs resolving. There are various ways to resolve it, but my advice is to speak to your developers and have them fix the issue at source, ensuring a transaction can only be sent once regardless of, for example, how many times a thank-you page is loaded.
*(Per Google) PII includes, but is not limited to, information such as email addresses, personal mobile numbers, and social security numbers
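To make the mixed-casing point above concrete, here is a minimal sketch of what happens to site search data without a lowercase filter. The data and function names are illustrative; in GA itself you would apply the built-in Lowercase filter to the search term field at collection time rather than fixing it at reporting time.

```javascript
// Without normalisation, "Blue jeans" and "blue jeans" appear as separate
// rows in the site search report, fragmenting your counts.
const rawSearches = ['Blue jeans', 'blue jeans', 'BLUE JEANS', 'red dress'];

// Reproduce what a lowercase filter achieves: aggregate terms case-insensitively.
function aggregateSearchTerms(terms) {
  const counts = {};
  for (const term of terms) {
    const key = term.toLowerCase();
    counts[key] = (counts[key] || 0) + 1;
  }
  return counts;
}

const counts = aggregateSearchTerms(rawSearches);
console.log(counts); // { 'blue jeans': 3, 'red dress': 1 }
```

Three searches which are really the same intent collapse into one row, which is exactly what you want when judging what visitors are looking for.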
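The nonInteraction point above is easiest to see in code. This is a minimal sketch of a dataLayer push for an auto-play video; `window.dataLayer` is stubbed here so the snippet is self-contained (on a real page the GTM container snippet creates it), and the event and field names are illustrative placeholders for whatever your GTM setup expects.

```javascript
// Stub of the browser global so this runs standalone; GTM normally provides it.
const window = { dataLayer: [] };

// Fire a GA event for an auto-played video. Because the user did nothing,
// nonInteraction must be true; otherwise every landing-page visit with an
// auto-play video registers as engaged and the page shows a 0% bounce rate.
function trackAutoplayVideo(videoId) {
  window.dataLayer.push({
    event: 'videoAutoplay',   // hypothetical GTM trigger name
    eventCategory: 'Video',
    eventAction: 'autoplay',
    eventLabel: videoId,
    nonInteraction: true      // the crucial flag: do not count as an interaction
  });
}

trackAutoplayVideo('hero-banner-video');
console.log(window.dataLayer[0].nonInteraction); // true
```

Flip that flag to `false` (or omit it) and you have manufactured the suspiciously low bounce rate described above.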
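On duplicate transactions: the right fix, as noted above, is at source with your developers, but a client-side guard is a common stopgap. This sketch records sent transaction IDs so a thank-you page reload doesn’t resend the hit; `storage` stands in for `window.localStorage` and `sendTransactionHit` is a hypothetical wrapper around your actual GA ecommerce call.

```javascript
// Stand-ins so the sketch runs standalone: a Map instead of localStorage,
// and an array capturing what would be GA ecommerce hits.
const storage = new Map();
const sentHits = [];

function sendTransactionHit(transactionId, revenue) {
  sentHits.push({ transactionId, revenue }); // in reality: a GA ecommerce hit
}

// Only send the transaction if this browser has not already sent that ID,
// e.g. when the thank-you page is reloaded or revisited from history.
function trackTransactionOnce(transactionId, revenue) {
  const key = 'ga_sent_' + transactionId;
  if (storage.has(key)) return false; // already sent: suppress the duplicate
  storage.set(key, '1');
  sendTransactionHit(transactionId, revenue);
  return true;
}

trackTransactionOnce('T1001', 49.99); // first load: hit is sent
trackTransactionOnce('T1001', 49.99); // reload: duplicate is suppressed
console.log(sentHits.length); // 1
```

Note the guard only protects a single browser; it won’t catch duplicates sent from different devices or from server-side integrations, which is why fixing it at source remains the real answer.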
Now, there are many more things worth checking in a GA audit, particularly around the Ecommerce reports, but in the interest of preventing you from contributing negatively to my own bounce rate, I’ll spare you the nitty-gritty for now and stick to the key factors above.
Auditing your Google Analytics account gets you so far, but you’ll also need to ensure the data coming into Google Analytics is solid at the point of collection. An audit merely tells you where your data quality stands at that point in time; it doesn’t raise the alarm when things go out of sync at a later date.
There are various ways to cover these bases, too. For starters, GA has a built-in custom alerts feature which you can set up to notify you when certain changes (at thresholds you define) appear in your data. I’m also really excited about the new custom insights feature in the new App & Web property which, in my opinion, is even better and more flexible than custom alerts.
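The kind of check a custom alert performs is simple to express in code. This is a minimal sketch, with illustrative numbers and a hypothetical function name, of flagging a day whose sessions deviate from the trailing average by more than a percentage you define.

```javascript
// Flag the most recent day if it deviates from the trailing average of all
// earlier days by more than thresholdPct percent, in either direction.
function detectAnomaly(dailySessions, thresholdPct) {
  const history = dailySessions.slice(0, -1);
  const latest = dailySessions[dailySessions.length - 1];
  const avg = history.reduce((a, b) => a + b, 0) / history.length;
  const deviationPct = Math.abs(latest - avg) / avg * 100;
  return { latest, avg, deviationPct, anomaly: deviationPct > thresholdPct };
}

// Seven steady days of history, then a suspicious drop, e.g. a broken tag.
const result = detectAnomaly([1000, 980, 1020, 990, 1010, 1005, 995, 400], 25);
console.log(result.anomaly); // true: the latest day is 60% below the average
```

GA’s custom alerts do this for you against the dimensions and thresholds you configure; the value of sketching it yourself is that the same logic can be pointed at any metric exported from any tool.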
There are also third-party tools which validate and monitor on-site tagging and tracking and notify you of anomalies, enabling you to take action before your data becomes too skewed. It’s also possible, of course, to build your own tag-monitoring solution in Google Tag Manager, and there are many guides out there showing you how to do just that. Attaching your own notification solution to anomalies in tag behaviour lets you track anomalies across all your martech tools, beyond just Google Analytics.
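At its core, a home-grown tag monitor of the kind those guides describe boils down to comparing the tags which fired against the tags you expected. This is a minimal sketch under that assumption; the tag names are illustrative placeholders, and in GTM the `onTagFired` hook would be wired up via tag sequencing or a custom task rather than called directly.

```javascript
// The tags you expect on a given page load, and a record of what actually fired.
const expectedTags = ['GA Pageview', 'FB Pixel', 'Ads Remarketing'];
const firedTags = [];

// In GTM this would be a tag-sequencing or custom-task callback; it is a
// plain function here so the sketch is self-contained.
function onTagFired(tagName) {
  firedTags.push(tagName);
}

// Anything expected but not fired is what your monitor should alert on.
function missingTags() {
  return expectedTags.filter(tag => !firedTags.includes(tag));
}

onTagFired('GA Pageview');
onTagFired('Ads Remarketing');
console.log(missingTags()); // ['FB Pixel']
```

Feed the output of `missingTags()` into whatever notification channel you use (email, Slack, a logging endpoint) and you have the skeleton of a monitor covering every martech tag in the container, not just GA.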
To summarise, I can’t stress enough how important data quality is, regardless of the purpose of your data collection and irrespective of your brand’s digital maturity. In short, sort out your measurement practices before you start the fancy stuff if you want to maximise the impact of your work. Remember to D.A.T.A.!