Google Analytics Launches Bot and Spider Filtering

It has been a long-established (yet misunderstood) given that Google Analytics filters all of the bot and spider traffic out of our precious reports. In fact, there’s a “Traffic from search engine robots” page in the GA help section that flat out says “robot traffic is not counted in Analytics when using a JavaScript tracking method”. So most users have gone on convinced that GA was excluding all bot and spider traffic, despite the fact that that very same page states, “If the search engine that crawls your site does activate JavaScript… you will receive search engine robot data in your reports.”

As of yesterday, Google has introduced a new option that brings bot and spider filtering more in line with what users already thought was happening!

For the longest time, so little bot traffic was actually getting tracked that it went unnoticed by the majority of GA users. But over the last year or so, we’ve been catching unexpected bot traffic spikes in our clients’ accounts more and more often. And when those spikes occur, it’s often multiple client sites getting hit at the same time.

Those of us obsessed with data cleanliness have been fighting an endless filter-driven battle with bot tracking, digging through reports to find traffic spikes with 100% bounce rates from single browser versions in specific locations, so we could add them to our epic bot filtering list. But many Google Analytics users wouldn’t even know where to go to find the signs of bot traffic in their reports, and finding a trustworthy, comprehensive, up-to-date filter list online isn’t easy.
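If you’d rather automate that hunt than eyeball reports, the same bot signature can be pulled out of the Core Reporting API. Here’s a minimal sketch in Python, assuming you’ve already built an authorized `service` object with the google-api-python-client library; the view ID and thresholds are placeholders you’d tune to your own data.

```python
# Sketch: surface bot-like traffic (high sessions, ~100% bounce rate)
# broken out by browser version via the Core Reporting API (v3).
# Assumes `service` is an authorized Analytics service object.

VIEW_ID = 'ga:12345678'  # placeholder view (profile) ID

def find_suspicious_browsers(service, min_sessions=100, min_bounce=99.0):
    """Return (browser, version, sessions) combos that look like bots."""
    response = service.data().ga().get(
        ids=VIEW_ID,
        start_date='30daysAgo',
        end_date='today',
        metrics='ga:sessions,ga:bounceRate',
        dimensions='ga:browser,ga:browserVersion',
        sort='-ga:sessions',
        max_results=1000,
    ).execute()

    suspects = []
    # Rows come back as strings in dimension+metric order.
    for browser, version, sessions, bounce_rate in response.get('rows', []):
        if int(sessions) >= min_sessions and float(bounce_rate) >= min_bounce:
            suspects.append((browser, version, int(sessions)))
    return suspects
```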

Luckily for all of us, Google is rolling out a new feature that will handle much of this problem. The new “Bot and Spider Filtering,” announced July 30th on Google Analytics’ Google+ page, promises to “exclude all hits that come from bots and spiders on the IAB know[n] bots and spiders list.” According to Google, the filtering feature will detect all hits that match the User Agents named in the list, the same way a profile filter would. The new feature will help keep your traffic reporting nice and clean by including only real visits to your site.
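Under the hood, this kind of filtering boils down to simple user-agent string matching. The sketch below shows the general idea; note that the real IAB/ABC International Spiders & Bots List is licensed and far more extensive, so the tokens here are illustrative stand-ins only.

```python
# Illustrative user-agent matching, roughly how a bot filter works.
# BOT_TOKENS is a stand-in; the actual IAB list is licensed and much larger.

BOT_TOKENS = ('googlebot', 'bingbot', 'ahrefsbot', 'crawler', 'spider')

def is_known_bot(user_agent: str) -> bool:
    """True if the User-Agent contains any known bot token."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def filter_hits(hits):
    """Keep only hits whose User-Agent doesn't match a known bot."""
    return [hit for hit in hits if not is_known_bot(hit.get('userAgent', ''))]
```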

The best part about this new feature is how easy it is to implement. In the Admin section under “View Settings” for your current view, just scroll down to the “Bot Filtering” section and check the box next to “Exclude all hits from known bots and spiders”. That’s it! You can see the new option in the fascinating and exciting screenshot below.

[Screenshot: Google Analytics Bot Filtering settings]
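If you manage a pile of views, checking that box one view at a time gets old fast. The Management API’s profile resource exposes a botFilteringEnabled field, so a sketch like this could flip the setting in bulk; it again assumes an authorized `service` object, and all of the IDs are placeholders.

```python
# Sketch: enable bot filtering on a view programmatically via the
# Management API (v3) profiles.patch method. All IDs are placeholders.

def enable_bot_filtering(service, account_id, property_id, view_id):
    """Set botFilteringEnabled on a single view (profile)."""
    return service.management().profiles().patch(
        accountId=account_id,
        webPropertyId=property_id,
        profileId=view_id,
        body={'botFilteringEnabled': True},
    ).execute()

# Example: enable_bot_filtering(service, '123456', 'UA-123456-1', '7654321')
```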

It’s worth noting that the filtering isn’t retroactive. It won’t change any of your old data. But once you check that box, your data will be bot- and spider-free from here on out.

More than likely, this option is already available in your account, as Google has promised the rollout will be complete by the end of the day on July 31st.

It’s not entirely clear what kind of impact bots are having on tracked traffic for most sites, so I’m going to be testing the new feature across a sample of our clients and comparing the before-and-after numbers to get a sense of the average change across accounts. Stay tuned for the results of those tests in a future blog post.
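If you want to run the same comparison on your own accounts, the math is simple: percent change in sessions per view, averaged across views. A toy sketch with made-up numbers (real values would come from reporting queries like the one earlier in this post):

```python
# Toy before/after comparison; the session counts are invented.

before = {'client_a': 48210, 'client_b': 12044, 'client_c': 90312}
after = {'client_a': 46120, 'client_b': 11987, 'client_c': 84100}

changes = {
    name: (after[name] - before[name]) / before[name] * 100
    for name in before
}
average_change = sum(changes.values()) / len(changes)

for name, pct in sorted(changes.items()):
    print(f'{name}: {pct:+.1f}% sessions')
print(f'Average change: {average_change:+.1f}%')
```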

Happy reporting, everyone!

Cliff Karklin

About Cliff Karklin

Cliff Karklin is Manager of Web Analytics at Fathom, with over 10 years of online marketing experience focused on analytics and technical SEO. In addition to his work across the company, he heads up analytics and technical initiatives for the Consumer Brands and Ecommerce Team. When Cliff isn't at the office developing creative strategies and digging far too deeply into analytics, you can usually find him still connected to the online world, poring over the latest industry news and testing new tools and processes. Follow Cliff on Twitter @CliffyKOnline.
