
Captchas keep Google from crawling content

We’re all familiar with captchas checking that we’re not robots. Captchas are a great way to keep spam content, fake registrations, and other web security issues at bay. However, they can hurt your content’s ranking. Hiding content behind a captcha is bad SEO practice because Google’s web crawler is itself a robot, meaning it can’t see what’s behind the roadblock.

When Googlebot lands on a page with a captcha gating the main content, it doesn’t interact with anything and assumes the captcha is the only thing on the page. It will index the page, but the content behind the captcha won’t count towards ranking.

However, there is a solution that allows you to use content-blocking captchas without interfering with crawling or indexing.

According to Google’s John Mueller, provided the main content is easily accessible, you can safely use captchas. To ensure a captcha doesn’t block Googlebot’s view, he recommends checking those pages with the URL Inspection tool in Search Console.

But if you want to block content completely with a captcha while keeping it Google-friendly, simply serve Googlebot a captcha-free version of the page rather than the captcha-enabled page users see. This way, the content still counts towards ranking. Contrary to popular belief, this technique doesn’t violate Google’s guidelines and policies.
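As a rough illustration of the idea, the sketch below decides whether to show the captcha page based on the visitor’s User-Agent header. The function name and crawler list are our own hypothetical choices, not an official API; a real deployment should also verify the crawler’s IP address via reverse DNS lookup, since a User-Agent string is trivially spoofed.

```python
# Hypothetical sketch: serve the captcha-free page to known crawlers and
# the captcha-gated page to everyone else, keyed off the User-Agent header.
# NOTE: in production, also verify the crawler's IP via reverse DNS --
# anyone can send a User-Agent string that claims to be Googlebot.

KNOWN_CRAWLER_TOKENS = ("Googlebot",)  # extend with other bots as needed


def serve_captcha(user_agent: str) -> bool:
    """Return True if this visitor should see the captcha first."""
    return not any(token in user_agent for token in KNOWN_CRAWLER_TOKENS)


# Example: a typical Googlebot User-Agent skips the captcha,
# while an ordinary browser still gets it.
googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
browser_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"
print(serve_captcha(googlebot_ua))  # False
print(serve_captcha(browser_ua))    # True
```

The same check would sit in your server’s request handler, choosing which template to render before any captcha script is sent to the client.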

Once the captcha-free version is live, request indexing through Google Search Console: paste the URL you want indexed into the URL Inspection tool, then click the “Request Indexing” button.

Contact Right Click Media for more information

You can still use captchas to secure your website forms and content without missing out on Googlebot’s crawling power to rank the information in search results. Right Click Media is a digital agency with extensive experience in all aspects of SEO, including compliant Googlebot crawling strategies, and works to improve your ranking. For more information, get in touch with our team today.