Google Analytics Bot & Spider Filtering – Should You Enable This Filter?

At the end of July, Google Analytics added a new feature to help improve the overall accuracy of your analytics – "Bot Filtering".  It was announced on July 30, with the promise that it would be completely rolled out by July 31.  Whether you hadn't heard of it until now, saw a blog post about it earlier, or just noticed it while perusing your Google Analytics settings, you may be wondering just what it is, and whether you should turn it on.  I hope this post can clear up some of those questions.

Google Analytics Bot Filtering Feature

Improve the accuracy of your analytics by enabling the new Bots and Spiders filter.

What Is Google Analytics Bot Filtering?

Essentially, this is a new feature that attempts to improve the accuracy of your metrics by filtering out recorded visits from known web crawlers and robots.  In other words, it tries to remove the non-human traffic from your analytics.

How does it do this?  By using the IAB Spiders & Bots List to recognize known non-human traffic.  This is actually pretty cool, because unless you're a member of the IAB (Interactive Advertising Bureau), it would cost you $14,000 per year to access this list.  It isn't cheap for members, either!  But Google is a subscriber and can use that list to filter out traffic from bots & spiders that would previously have been counted in your Analytics (if you so choose, see below).

Spiders, Bots, & JavaScript (Oh My!)

So, many of you techies out there are probably thinking, "Big deal, Google Analytics uses JavaScript to record the visit, and bots don't read JavaScript."  It's true that many bots do not read or process JavaScript.  But plenty do.  And filtering them out will give you more accurate data.  Your current metrics very likely include traffic from bots and/or spiders that do process JavaScript.

Will This Stop Referral Spam Then?

Over the past few months, most of our clients have seen a fairly significant amount of traffic listed as a Referral from various versions of Semalt.com and Kambasoft.com.  These sessions have all the earmarks of robots: they have all been one-page visits, mostly with the language set to something other than English.  They are clearly referral spam.

So the question is, will these visits be filtered out?  Some of them will be.  When the feature first came out, we noticed sharp decreases in referral spam for most of our clients.  But plenty still got through!  It seems that the IAB list is not kept current enough to eliminate referral spam.

See our related blog post for a thorough explanation of Referral Spam with instructions on how to filter it out of your analytics.

So Should I Turn the Filter On or Not?

In almost all situations, I think you should.  Including traffic from bots & spiders artificially skews your data and makes it less accurate.  For some people, inflated numbers might seem like a good thing, but only if they care about metrics that are not bottom-line focused.  For example, an SEO firm focused purely on traffic (instead of conversions) might like to report that you have more visits than you really do (because traffic from bots & spiders is included).

But what you really want is an accurate picture of what’s happening, not artificially inflated numbers.

And there are many numbers that will improve by filtering out the non-human traffic.  Since this filter does not affect the number of conversions, but typically decreases the total number of sessions, your conversion rate will go up.  Note that that's not quite the same as "improving".  The new, higher conversion rate is just more accurate.  It has actually been a little higher all along; now you're just able to report it accurately.
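To make that arithmetic concrete, here's a quick sketch with made-up numbers showing why the rate rises when bot sessions are removed:

```python
# Hypothetical numbers, just to illustrate the effect of removing bot sessions.
total_sessions = 10_000   # includes bot/spider sessions
bot_sessions = 800        # non-human sessions that recorded no conversions
conversions = 250         # conversions come from humans, so this number doesn't change

rate_with_bots = conversions / total_sessions
rate_without_bots = conversions / (total_sessions - bot_sessions)

print(f"Conversion rate with bots:    {rate_with_bots:.2%}")    # 2.50%
print(f"Conversion rate without bots: {rate_without_bots:.2%}")  # 2.72%
```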

Your site's bounce rate will likely decrease as well, because for the most part, each page view from a bot is its own session.  So the bounce rate from just the bots & spiders is typically 100%.  Removing them gives you a more accurate (and lower) bounce rate for your site as a whole.
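The same kind of back-of-the-envelope math (again with made-up numbers) shows how single-page bot sessions drag the blended bounce rate up:

```python
# Hypothetical numbers, just to illustrate the effect on bounce rate.
human_sessions = 9_200
human_bounces = 4_600   # a 50% bounce rate for real visitors
bot_sessions = 800      # single-page hits, so effectively a 100% bounce rate

blended = (human_bounces + bot_sessions) / (human_sessions + bot_sessions)
human_only = human_bounces / human_sessions

print(f"Bounce rate including bots: {blended:.1%}")     # 54.0%
print(f"Bounce rate, humans only:   {human_only:.1%}")  # 50.0%
```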

So Do I Have to Do Anything to Enable This?

Yes!  By default, this filter is turned off, so if you want it on, you must enable it yourself.  And you must do it for each View in each Property you have.  The filter cannot be applied retroactively to past data, but turning it on means you start gathering more accurate metrics from that day forward.

To enable the filter, click the Admin link at the top center of your Google Analytics screen.  This shows you all of your Accounts, Properties, and Views.  The filter is at the View level, so for each View in your Property, click View Settings and check the box labeled "Exclude all hits from known bots and spiders" toward the bottom of the page (as of now, right above the Site Search Settings section).  Then click "Save" at the bottom of the page.

Google Analytics Bot & Spider Filtering

To enable Bot & Spider Filtering, on your View Settings page, scroll down to the “Bot Filtering” section and check the box next to “Exclude all hits from known bots and spiders”.

Now repeat this for every View you have.  Well, almost every View!  Hopefully, you have one View that is just pure raw data (no filters, no parameters excluded, etc.) as a backup.  You want to keep this View filter-free, so do not turn on the Bots Filter for it.  But I recommend turning it on for every other View.

Now if you have more than one Property, go to the next Property and enable the filter for each of its Views (except your backup View, as described above).
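If you manage a lot of Views, clicking through each one gets tedious.  The Google Analytics Management API exposes this same setting as a botFilteringEnabled flag on each View (called a "profile" in the API), so something like the sketch below could flip it on everywhere at once.  Treat this as a rough outline rather than a tested recipe: it assumes you have the google-api-python-client library, a service-account key with edit access to your accounts, and that your unfiltered backup View is named "Raw Data" (a placeholder, use whatever yours is called).

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account  # assumes a service-account key with edit access

SCOPES = ["https://www.googleapis.com/auth/analytics.edit"]
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
service = build("analytics", "v3", credentials=creds)

BACKUP_VIEW_NAME = "Raw Data"  # placeholder: whatever your unfiltered backup View is called

# Walk every Account -> Property -> View and turn on bot filtering.
accounts = service.management().accounts().list().execute()
for account in accounts.get("items", []):
    properties = service.management().webproperties().list(
        accountId=account["id"]).execute()
    for prop in properties.get("items", []):
        views = service.management().profiles().list(
            accountId=account["id"], webPropertyId=prop["id"]).execute()
        for view in views.get("items", []):
            if view["name"] == BACKUP_VIEW_NAME:
                continue  # leave the raw backup View untouched
            service.management().profiles().patch(
                accountId=account["id"],
                webPropertyId=prop["id"],
                profileId=view["id"],
                body={"botFilteringEnabled": True},
            ).execute()
            print(f"Enabled bot filtering on {prop['name']} / {view['name']}")
```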

Option: Create a Duplicate View with Bots & Spiders Filter Turned Off

If you are providing information to a client, and want to be able to show them an apples-to-apples comparison to the previous month or year, or you just want to gauge the effect that bots and spiders have had on your numbers up to now, you could consider creating a new View to do that.

Create the new View to be identical to the View you use for reporting purposes.  Go to that "main" View and, from the View Settings, scroll down to the bottom and click "Copy view".  You will be prompted to give this new View a name, which could be something like "Web Site Data With Bots".  Then go into the View Settings for the new View and uncheck the "Exclude all hits from known bots and spiders" box (since you had already checked it in the original View).

Copy A View in Google Analytics

To create a duplicate View in Google Analytics, from the View Settings page, scroll to the bottom and click “Copy view”.

Now you'll have two Views that are identical in every way but one: the bots filter.  After a while, you can compare the two to see how much of your supposed traffic is actually coming from bots & spiders.  You can use this to estimate what your previous periods' numbers would have been if they had been reported more accurately, with the bots filtered out.
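If you'd rather quantify the gap than eyeball it in the interface, here's a rough sketch using the Core Reporting API (v3) to pull sessions from both Views and work out what share of your recorded traffic looks like bots.  The two View (profile) IDs are placeholders you'd replace with your own, and it reuses the same kind of authorized service object as the sketch above.

```python
# Sketch: compare sessions between the filtered View and the "With Bots" copy.
# Assumes `service` is an authorized googleapiclient client for "analytics", "v3".

FILTERED_VIEW_ID = "ga:11111111"   # placeholder: View with bot filtering turned on
WITH_BOTS_VIEW_ID = "ga:22222222"  # placeholder: duplicate View with the filter off

def sessions(view_id, start, end):
    result = service.data().ga().get(
        ids=view_id, start_date=start, end_date=end, metrics="ga:sessions"
    ).execute()
    return int(result["totalsForAllResults"]["ga:sessions"])

human = sessions(FILTERED_VIEW_ID, "30daysAgo", "yesterday")
everything = sessions(WITH_BOTS_VIEW_ID, "30daysAgo", "yesterday")

bot_share = (everything - human) / everything if everything else 0
print(f"Roughly {bot_share:.1%} of recorded sessions appear to be bots & spiders.")
```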

Just make sure that over time, you look less and less at the “with bots” View.  Remember—it’s not accurate.  Report the accurate numbers, even if they make some of your numbers look worse for a month or two.  It’s easy to explain the reason, and it’s far better to report numbers that are as accurate as possible.
