One of the most common causes of a failed or incomplete cookie scan is an incorrect URL. Check that your site's URL was typed correctly in your dashboard's Website Information, found under Settings in the left-side menu.
For example, some sites require that "www" is included in the URL for all cookies to be detected.
If you are still receiving a failed scan message when using the Cookie Consent Manager, your website's robots.txt file may be disallowing the scanner.
If your robots.txt disallows certain bots, you can try removing the entire Disallow: line. To make the change temporary, wait for the cookie scan to finish and then add the Disallow: line back. Keep in mind, however, that restoring the line may block future scans.
If you want to continue to disallow scanning from certain bots, add the following entry to your robots.txt to allow the scan to run:

# Termly scanner
User-agent: Scrapy/2.5.1 (+https://scrapy.org)
Allow: /
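Before re-running the scan, you can sanity-check your robots.txt rules locally with Python's standard-library urllib.robotparser. This is a minimal sketch, not an official Termly tool: the robots.txt content and the example.com URL below are placeholders for your own site, and note that urllib.robotparser matches a group against the product token before the first slash in the user-agent string, so the sketch uses the short token "Scrapy" in the User-agent line.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block all bots, but allow the scanner's
# user agent (matched here by its short token, "Scrapy").
robots_txt = """\
User-agent: *
Disallow: /

# Termly scanner
User-agent: Scrapy
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The scanner's full user-agent string should be allowed to fetch pages,
# while other crawlers remain blocked by the wildcard group.
print(rp.can_fetch("Scrapy/2.5.1 (+https://scrapy.org)", "https://www.example.com/"))
print(rp.can_fetch("SomeOtherBot", "https://www.example.com/"))
```

If the first call prints False, the scanner's user agent is not matching an Allow group, and the scan is likely to fail for the same reason.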
If the scanner continues to fail, it may be blocked by your hosting provider. Whitelist the following IP addresses to allow the scanner to run: 220.127.116.11, 18.104.22.168, 22.214.171.124, 126.96.36.199, 188.8.131.52, 184.108.40.206