If you see something, say something.

If you are working on stress testing a specification or analyzing adoption data of a Tech Lab standard, we’d love to hear from you!

We appreciated the considerable interest in our recent explainer about the use of ads.txt (Authorized Digital Sellers) and other complementary standards. 

The explainer was published in the context of independent research about the state of ads.txt files and their maintenance by publishers. We took the opportunity to dive deep into the examples and stress test how ads.txt (including erroneous entries) and complementary standards like sellers.json apply within a programmatic transaction, to understand how a media buyer could or should act when checking ads.txt files.
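To make that buyer-side check concrete, here is a minimal sketch in Python of how a buyer might fetch a publisher's ads.txt, parse its data records, and cross-check each entry's account ID against the corresponding ad system's sellers.json. This is an illustrative example only, not a Tech Lab reference implementation; example-publisher.com is a hypothetical placeholder, and a real validation would need more care (caching, redirect handling, confidential seller entries, ads.txt variables, and so on).

```python
"""Illustrative sketch: cross-check ads.txt entries against sellers.json.

Not an official Tech Lab tool; the publisher domain below is hypothetical.
"""

import json
import urllib.request


def fetch_text(url: str) -> str:
    """Fetch a URL and return its body as text (no retries or crawl policy)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def parse_ads_txt(content: str):
    """Parse ads.txt data records into tuples of
    (ad system domain, seller account ID, relationship, optional cert authority ID),
    skipping comments, blank lines, and variable declarations (e.g. CONTACT=)."""
    records = []
    for line in content.splitlines():
        line = line.split("#", 1)[0].strip()        # drop comments
        if not line or "=" in line.split(",")[0]:   # skip blanks and variables
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:
            records.append(tuple(fields[:4]))
    return records


def seller_ids_from_sellers_json(ad_system_domain: str) -> set:
    """Collect seller_id values from the ad system's sellers.json, if published."""
    data = json.loads(fetch_text(f"https://{ad_system_domain}/sellers.json"))
    return {str(s.get("seller_id")) for s in data.get("sellers", [])}


if __name__ == "__main__":
    publisher = "example-publisher.com"  # hypothetical publisher domain
    for record in parse_ads_txt(fetch_text(f"https://{publisher}/ads.txt")):
        domain, account_id, relationship = record[0], record[1], record[2].upper()
        try:
            known_ids = seller_ids_from_sellers_json(domain)
        except Exception:
            print(f"{domain}: no sellers.json available to cross-check")
            continue
        status = "listed" if account_id in known_ids else "NOT listed"
        print(f"{domain} / {account_id} ({relationship}): {status} in sellers.json")
```

Even a simple check like this surfaces the kinds of maintenance issues the research highlighted, such as account IDs that no longer appear in the ad system's sellers.json.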

The research shed light on the state of ads.txt adoption. What we learned is that ads.txt is widely adopted but may not be properly maintained by all publishers that have adopted the standard. Our standards are created to solve industry challenges systematically and at scale, and it is critical that they are implemented properly to ensure integrity in the digital media supply chain and to yield the expected results for brands and publishers.

We have been increasingly focused on wide and accurate adoption of our standards, which requires understanding their usage: intended and unintended, correct and incorrect. Over the last two years, we have developed various methods to encourage and enhance the quality of implementations through beta testing periods and compliance programs, and our current product roadmap builds on that work to achieve wider and deeper coverage.

We are eager to learn from the feedback we receive from our members, the general public, the media, and specialist researchers, for several reasons:

  • Scale: Given the scale of the industry supply chain, we realize that other organizations and engaged individuals can help support our goal of broad, compliant, and dependable adoption of critical standards. Therefore, we always appreciate input, feedback, and findings from those with specific, relevant expertise.
  • Complexity: The digital supply chain today entails a set of complex relationships among ecosystem partners, and the business use cases and applications of technology evolve rapidly. This makes independent investigation and reporting critical to identifying issues and acting quickly to make revisions.
  • Voluntary nature: Deployment of our standards and recommendations is typically voluntary. This limits enforcement and requires us to constantly check, validate, and improve the standards and their adoption within the ecosystem. Industry peers collaborating on timely deployment and spurring their business partners to adhere to the specifications therefore provide the checks and balances needed for everyone to benefit from standardization.

To support our efforts, we depend on insights into the deployment and application of standards that can be shared with the industry and potentially incorporated into tools and services that facilitate adoption.

So, the bottom line is that if you are working on stress testing a specification or analyzing adoption data, we would love to hear from you. “If you see something, say something.” It’s the best way to promote changes and to keep improving our standards and the industry’s best practices.

It’s as easy as dropping a line to support@iabtechlab.com about your work, and we will be happy to collaborate with you.