We automatically filter known bots from your analytics data to provide more accurate insights about your real human visitors. While this method isn't foolproof, it helps maintain cleaner statistics by removing the most common bot traffic.
We use a comprehensive list of known bot User-Agent strings to identify and filter out automated traffic, including traffic from search engine crawlers.
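For illustration, here is a minimal sketch of how this kind of User-Agent based filtering can work. The pattern list and the function names (knownBotPatterns, isKnownBot, shouldCountPageView) are hypothetical examples, not our actual list or API.

```typescript
// Minimal sketch of User-Agent based bot filtering.
// These patterns are illustrative examples only, not the actual list we maintain.
const knownBotPatterns: RegExp[] = [
  /Googlebot/i,
  /bingbot/i,
  /AhrefsBot/i,
  /HeadlessChrome/i,
];

// Returns true when the User-Agent matches any known bot pattern.
function isKnownBot(userAgent: string): boolean {
  return knownBotPatterns.some((pattern) => pattern.test(userAgent));
}

// A page view is only counted when the visitor is not a known bot.
function shouldCountPageView(userAgent: string): boolean {
  return !isKnownBot(userAgent);
}

// Example: a Googlebot crawl is identified and excluded from the stats.
console.log(
  shouldCountPageView("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
); // false
```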
While User-Agent based filtering catches many bots, it's important to understand its limitations: bots that spoof a regular browser's User-Agent, or new bots whose User-Agent strings aren't on the list yet, will not be detected.
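As a rough illustration of this limitation, the snippet below (again using a hypothetical, illustrative pattern list) shows how a bot that sends a mainstream browser User-Agent is indistinguishable from a real visitor by this method alone.

```typescript
// Illustration of the main limitation of User-Agent filtering:
// a bot sending a mainstream browser User-Agent looks like a real visitor.
const patterns: RegExp[] = [/Googlebot/i, /bingbot/i]; // illustrative subset

const spoofedUserAgent =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36";

// No pattern matches, so this request would still be counted in analytics.
console.log(patterns.some((p) => p.test(spoofedUserAgent))); // false
```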
The bot identification list is regularly updated to include newly identified bot patterns. Updates are applied to your analytics automatically - no action is required on your part.
Q: Will this affect my SEO?
A: No. Legitimate search engine crawlers are still able to access your site - they're just not counted in your analytics.