In the discourse of ad measurement, plenty has been written about IVT (invalid traffic). As a starting point, IVT is generally defined as traffic or associated media activity (e.g., an advertising impression) that is generated by bots or any other form of nonhuman traffic, or that otherwise does not represent legitimate traffic that should be included in measurement counts (i.e., metrics associated with an ad, including impression counts, ad viewability, clicks, ad engagement, and other outcomes). See the IAB's Ad Impression Measurement Guidelines for more detailed reading in this area.
One step in arriving at correct counts for ad events is filtering out known sources of traffic that serve a legitimate purpose but must not be counted for advertising measurement. One category of such traffic comes from automated processes known as spiders and bots. A spider, also referred to as a web crawler, is an Internet bot that systematically browses the World Wide Web; it is typically operated by a search engine for the purpose of web indexing. Beyond search engine indexing, there are other valid reasons for web crawlers to operate. For example, the IAB Tech Lab operates a web crawler that reads and aggregates ads.txt files. This is a valid operation; however, when it comes to ad campaign measurement or website analytics, these crawlers represent traffic we don't want counted, because the "opportunity to see" the advertiser's message has not been achieved.
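To make the crawler side of this concrete, here is a minimal sketch (not the IAB Tech Lab's actual crawler) of a well-behaved bot fetching an ads.txt file. The User-Agent string and URL are hypothetical; the point is that a legitimate crawler declares itself in its User-Agent header, which is what allows measurement systems to recognize and exclude its traffic:

```python
# Minimal sketch of a well-behaved ads.txt crawler. The self-identifying
# User-Agent header is what lets measurement systems filter this traffic.
import urllib.request

# Hypothetical User-Agent token; real crawlers publish their own.
CRAWLER_UA = "ExampleAdsTxtCrawler/1.0 (+https://example.com/crawler-info)"

def fetch_ads_txt(domain: str) -> str:
    """Fetch https://<domain>/ads.txt, declaring ourselves as a bot."""
    req = urllib.request.Request(
        f"https://{domain}/ads.txt",
        headers={"User-Agent": CRAWLER_UA},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Placeholder domain; substitute a publisher that serves ads.txt.
    print(fetch_ads_txt("example.com")[:200])
```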
The IAB Tech Lab publishes a comprehensive list of such spiders and robots that helps companies identify automated traffic, such as search engine crawlers, monitoring tools, and other non-human traffic, that they don't want included in their analytics and billable counts.
The IAB Tech Lab Spiders and Robots list serves two main purposes for the industry. First, the list consists of two text files: one for valid browsers (user agents) and one for known robots. These files are intended to be used together to comply with the "dual pass" approach to filtering defined in the IAB's Ad Impression Measurement Guidelines: identify valid transactions using the valid browser list, then filter out invalid transactions using the known robots list. Second, the list supports the MRC's General Invalid Traffic Detection and Filtration Standard by providing a common industry resource for facilitating IVT detection and filtration.
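To illustrate the dual-pass logic, here is a simplified sketch. It assumes each list file contains one pattern per line, matched case-insensitively as a substring of the User-Agent, and the filenames are hypothetical; the real IAB/ABC list format carries additional fields (such as match flags and exceptions), so treat this only as an illustration of the two-pass order of operations, not as a compliant implementation:

```python
# Simplified "dual pass" filter: pass 1 keeps only transactions whose
# User-Agent matches the valid-browser list; pass 2 removes transactions
# whose User-Agent matches the known-robots list.

def load_patterns(path: str) -> list[str]:
    """Load one lowercase pattern per non-empty, non-comment line."""
    with open(path, encoding="utf-8") as f:
        return [line.strip().lower() for line in f
                if line.strip() and not line.startswith("#")]

valid_browsers = load_patterns("valid_browsers.txt")  # hypothetical filename
known_robots = load_patterns("known_robots.txt")      # hypothetical filename

def is_countable(user_agent: str) -> bool:
    ua = user_agent.lower()
    # Pass 1: identify valid transactions via the valid browser list.
    if not any(p in ua for p in valid_browsers):
        return False
    # Pass 2: filter/remove transactions matching the known robots list.
    if any(p in ua for p in known_robots):
        return False
    return True
```

Note that the two passes are not redundant: a User-Agent can match a valid browser pattern and still be a robot that spoofs or embeds a browser token, which is exactly what the second pass catches.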
The IAB Tech Lab Spiders and Robots list is available under an annual subscription model – you can find more information here: https://iabtechlab.com/software/iababc-international-spiders-and-bots-list/
Best practices for implementing these two lists can be found here: IAB/ABC International Spiders & Bots List
The list is updated monthly to reflect changes brought to the attention of one of our partners (AAM and ABC UK), the IAB Policy Board, or the IAB Tech Lab.
We encourage those conducting ad measurement to use the list, as it is one of the core requirements for compliance with the MRC's Invalid Traffic Detection and Filtration process. For any questions or comments on how to use the list, email spiders.bots@auditedmedia.com or spiders@iabtechlab.com.