While working through some of our retailers’ trading reports this week, I was reminded of an ongoing process that retailers (or their analytics agencies) should be running: filtering out bot traffic.
This is important because bot traffic typically shows a one-second average visit time, a 100% bounce rate, no revenue and one page per visit. A significant volume of this traffic will drag down your averages across all of these metrics, hiding your ‘true’ figures.
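To see how much a chunk of bot sessions can move the blended averages, here is a quick sketch with hypothetical numbers (all the session counts and rates below are made up for illustration):

```python
# Hypothetical figures: how bot sessions skew blended averages.
human_sessions = 10_000
human_avg_duration = 120.0   # seconds
human_bounce_rate = 0.45

bot_sessions = 4_000
bot_avg_duration = 1.0       # bots hit one page and leave
bot_bounce_rate = 1.0

total = human_sessions + bot_sessions

# Session-weighted averages, the way GA blends them in a report.
blended_duration = (human_sessions * human_avg_duration
                    + bot_sessions * bot_avg_duration) / total
blended_bounce = (human_sessions * human_bounce_rate
                  + bot_sessions * bot_bounce_rate) / total

print(f"Blended avg duration: {blended_duration:.0f}s vs true {human_avg_duration:.0f}s")
print(f"Blended bounce rate: {blended_bounce:.0%} vs true {human_bounce_rate:.0%}")
```

With these numbers the reported average duration drops from 120s to 86s and the bounce rate jumps from 45% to roughly 61%, even though human behaviour hasn’t changed at all.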
Now then, Google Analytics does have a ‘magic’ checkbox in the settings for whatever view you’re looking at — ‘Exclude all hits from known bots and spiders’:
This checkbox, right at the bottom of the screenshot above, is not a catch-all solution, as I found recently while digging around in some reports.
So as an exercise it’s worth doing the following:
Go into your normal analytics view where you pull your reports
Go to Audience > Technology > Network
Go to ‘advanced’
Add an advanced filter to show only sources with more than 1,000 sessions. The reason for the threshold is diminishing returns: it’s not worth filtering out a source with 20 sessions, as it isn’t skewing the numbers enough to matter
Next, click on bounce rate to sort the table for the highest bounce rate
This gives us a list of candidates that could be filtered out. Anything above an 85% bounce rate is worth looking at, and if it also has few pages per session, no transactions and so on, it’s likely to be a bot
Go to your view settings and create a filter for these network names
Rinse and repeat this exercise every few months to keep your numbers as accurate as they can be.
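The manual review above boils down to a few simple rules, which you could sketch in code against an export of the Network report. The column layout, provider names and exact thresholds here are assumptions for illustration — tune them to your own data:

```python
# A minimal sketch of the bot-spotting heuristic described above, run
# against rows exported from Audience > Technology > Network.
# (Provider names and figures below are invented examples.)
rows = [
    # (service provider, sessions, bounce rate, pages/session, transactions)
    ("example isp ltd",      52_000, 0.42, 4.1, 310),
    ("acme crawler hosting",  8_400, 0.99, 1.0,   0),
    ("tiny network co",          20, 1.00, 1.0,   0),
]

MIN_SESSIONS = 1_000      # below this, filtering isn't worth the effort
BOUNCE_THRESHOLD = 0.85   # anything above this is worth a look

def likely_bots(rows):
    """Flag providers that look like bots: high volume, high bounce,
    barely more than one page per session, and no transactions."""
    flagged = []
    for provider, sessions, bounce, pages, transactions in rows:
        if (sessions > MIN_SESSIONS
                and bounce > BOUNCE_THRESHOLD
                and pages <= 1.5
                and transactions == 0):
            flagged.append(provider)
    return flagged

print(likely_bots(rows))  # → ['acme crawler hosting']
```

Note that the small network is skipped despite its 100% bounce rate — that’s the diminishing-returns threshold doing its job.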
You may think this doesn’t change much, but let me give you an example. You start working with a new partner who, as part of their service, scans your site to check the up-to-date stock and price position of your products. The way they do that is with a bot. That provider may not be big enough to fall under Google’s catch-all list of known ‘bots’, so you need to filter them out manually.
Happy filtering, happy accurate metrics.