Web Analytics 101: 5 steps to more accurate website data

Publishers work hard to create meaningful digital content, but because nearly 40% of all website traffic is automated, it can be hard to tell which articles and promotions are engaging real audiences.

In a new blog series, “Web Analytics 101: The Digital Publisher’s Guide to Better Data,” the Alliance for Audited Media shares how invalid traffic affects publishers’ websites and why removing this traffic from reports improves audience insights.

Here we look at specific steps publishers can take to identify invalid traffic and gain a better understanding of how human audiences engage with their content.

1 - Use automatic bot filtering tools

Many analytics platforms have built-in tools to filter bot traffic. While these filters might not catch every bot, they are a good starting point for cleaner data. These tools often must be activated by the website’s administrator before they begin filtering bots. Google Analytics, Adobe Analytics and many ad services and fraud detection companies identify bots using the IAB/ABC International Spiders and Bots List, which is managed and maintained by AAM.
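
For publishers who also work with raw server logs, the same idea can be applied by hand. The sketch below is a minimal illustration of matching user agents against a known-bot list; the patterns shown are stand-ins for illustration, not the licensed IAB/ABC list itself, which has its own format and matching rules.

```python
# Minimal sketch of user-agent filtering against a known-bot list.
# The stand-in patterns below are illustrative; platforms like Google
# Analytics apply the licensed IAB/ABC list for you once bot filtering
# is switched on.

BOT_PATTERNS = ["googlebot", "bingbot", "crawler", "spider"]  # stand-in list

def is_bot(user_agent: str) -> bool:
    """Flag a hit as a bot if any known pattern appears in its user agent."""
    ua = user_agent.lower()
    return any(p in ua for p in BOT_PATTERNS)

hits = [
    {"page": "/article-1", "ua": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"page": "/article-1", "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
]
human_hits = [h for h in hits if not is_bot(h["ua"])]
print(f"{len(human_hits)} of {len(hits)} hits kept after bot filtering")
```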

2 - Identify traffic sources and spikes

Web analytics platforms provide performance data such as the number of visitors and pages viewed, as well as where those visitors came from. This information is usually found under the "acquisition" category, which shows whether traffic originated from organic search, email, a referring website or another source.

It also helps to identify the sources of traffic spikes, which occur when an unusually large number of visitors hits a page. By tracing a spike to its source, publishers can determine whether it was caused by legitimate activity, such as an article being shared on another website or social channel, or by bots.
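
As a rough illustration, a spike can be flagged by comparing each day's pageviews to a trailing average, as in the sketch below. The 7-day window and 3x threshold are illustrative assumptions, not industry standards; tune them to your site's normal traffic pattern.

```python
# Minimal sketch: flagging daily pageview spikes against a trailing average.
# The window and threshold are illustrative assumptions, not standards.

daily_pageviews = [1200, 1150, 1300, 1250, 1100, 1280, 1220, 9800, 1240]

WINDOW = 7       # days of history to average over
THRESHOLD = 3.0  # flag days more than 3x the trailing average

for day in range(WINDOW, len(daily_pageviews)):
    baseline = sum(daily_pageviews[day - WINDOW:day]) / WINDOW
    if daily_pageviews[day] > THRESHOLD * baseline:
        print(f"Day {day}: {daily_pageviews[day]} views vs. baseline "
              f"{baseline:.0f} -- check acquisition reports for the source")
```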

3 - Create custom bot filters

Once bots are detected, publishers can create custom filters to remove this traffic from their data. Bot visits often stand out through characteristics such as very short time on page, high bounce rates and visits at unusual times of day. Filtering bot traffic gives publishers a more accurate look at how humans interact with their pages, which leads to better decisions about content and promotions.
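
The sketch below shows how such a filter might combine those signals. The field names, cutoffs and two-of-three rule are illustrative assumptions; each analytics platform exposes its own dimensions and its own segment or filter builder for this purpose.

```python
# Minimal sketch: scoring sessions against the bot-like traits named above.
# Field names and cutoffs are illustrative assumptions, not platform APIs.

sessions = [
    {"time_on_page_s": 0.4, "bounced": True,  "hour": 3},
    {"time_on_page_s": 95,  "bounced": False, "hour": 14},
]

def looks_like_bot(session: dict) -> bool:
    signals = [
        session["time_on_page_s"] < 1,   # left faster than a human could read
        session["bounced"],              # viewed a single page and left
        session["hour"] in range(1, 5),  # hit during low-traffic overnight hours
    ]
    return sum(signals) >= 2             # require two of three signals

humans = [s for s in sessions if not looks_like_bot(s)]
print(f"{len(humans)} of {len(sessions)} sessions kept")
```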

4 - Check website tags

Analytics providers require publishers to install a tag on their website so they can collect data. If a tag is inadvertently installed more than once, metrics will be inflated; if it is installed incorrectly, data may be incomplete. Publishers should check their tag containers to ensure there is only one tag per tracking suite so traffic isn’t counted twice in reports.
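
One quick way to spot a duplicated tag is to count how many times the tracking loader appears in a page's HTML source. The sketch below does this for the Google Analytics gtag.js loader; the URL pattern matches the standard GA4 snippet, https://example.com is a placeholder for your own page, and other tracking suites would need their own patterns.

```python
# Minimal sketch: counting analytics tag loaders in a page's HTML source.
# Requires the third-party "requests" package (pip install requests).
import re
import requests

def count_ga_loaders(url: str) -> int:
    """Count occurrences of the GA4 gtag.js loader in the page source."""
    html = requests.get(url, timeout=10).text
    return len(re.findall(r"googletagmanager\.com/gtag/js", html))

count = count_ga_loaders("https://example.com")  # replace with your own page
if count > 1:
    print(f"Found {count} GA loaders: pageviews may be double-counted")
elif count == 0:
    print("No GA loader found: data may be incomplete")
else:
    print("Exactly one GA loader found")
```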

5 - Participate in a third-party website audit

If your team needs assistance with any of the above steps, a third-party audit can help. The AAM Digital Publisher Audit analyzes publishers’ websites to ensure they are attracting human audiences for their advertisers. AAM’s team helps publishers identify bots, create custom filters and ensure that websites are tagged correctly for traffic reporting. With more than 25 years of digital audit experience, our team stays on top of industry changes so publishers have a knowledgeable, independent partner.

Do you have questions about your website’s data? AAM can help. Contact us to learn more.