
Google Tips: Crawling and Indexing Pages









According to Google’s John Mueller, URLs often get crawled at different rates. What piqued people’s interest, though, was his mention that some URLs might be crawled only once every six months. Quick recall: as discussed in The BingBot Series, web crawling is essential for maintaining a website’s rank and for surfacing page issues. But crawling should not happen too frequently, because it consumes a page’s server resources and can create further problems.

How often does Google crawl and index pages? During a webmaster hangout, a publisher asked how quickly Google removes pages from the index after a “noindex nofollow” directive is added to a URL. They gave an example from their own site: they placed noindex on a page, but it stayed in Google’s index rather than being removed. John Mueller responded that Google does not crawl all URLs with the same frequency. Some URLs are crawled daily, others weekly, and still others only once every few months or every half year. This pacing is a precaution against overloading your servers, the same concern raised in the BingBot discussion.
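A quick way to check that a page actually carries the directive the publisher describes is to parse its robots meta tag. Here is a minimal sketch using only Python’s standard library (the class and function names are illustrative, not part of any Google tooling):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Directives are comma-separated, e.g. "noindex, nofollow"
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

def has_noindex(html: str) -> bool:
    """True if the page asks search engines not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True
```

Even with the tag in place, as Mueller notes, the page only drops out of the index after Googlebot actually re-crawls it.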

It also depends on whether you have made significant changes across your website. The crawling system detects most of those changes quickly, but some leftover URLs can go unnoticed. Running site queries is a common way to spot them: stale URLs that are only crawled once every half year will still show up in the results months later. That lingering presence is usually Google’s first hint of a problem with a particular URL or site.

However, there are situations where nothing is actually wrong and a page simply doesn’t need re-indexing, especially if it already ranks well. For pages that have changed, John Mueller suggested that site owners update the last-modification date in the sitemap file. That gives Googlebot a hint that those older pages have changed and should be crawled again. Updating the sitemap works best when you have a large number of modified pages that need re-crawling; if only a few specific pages need it, you’ll find Google’s URL Inspection Tool more useful.
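Mueller’s sitemap suggestion can be automated. The sketch below, using only Python’s standard library, sets the `<lastmod>` element to today’s date for every changed URL in a sitemap (the `bump_lastmod` helper and the example.com URLs are hypothetical, but the XML follows the sitemaps.org schema):

```python
from datetime import date
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

def bump_lastmod(sitemap_xml: str, changed_urls: set) -> str:
    """Set <lastmod> to today for every <url> whose <loc> is in changed_urls."""
    root = ET.fromstring(sitemap_xml)
    today = date.today().isoformat()
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc")
        if loc is not None and loc.text in changed_urls:
            lastmod = url.find(f"{{{NS}}}lastmod")
            if lastmod is None:
                lastmod = ET.SubElement(url, f"{{{NS}}}lastmod")
            lastmod.text = today
    return ET.tostring(root, encoding="unicode")

sitemap = (
    f'<urlset xmlns="{NS}">'
    "<url><loc>https://example.com/a</loc><lastmod>2018-01-01</lastmod></url>"
    "<url><loc>https://example.com/b</loc></url>"
    "</urlset>"
)
print(bump_lastmod(sitemap, {"https://example.com/a"}))
```

Unchanged URLs keep their existing dates, so Googlebot’s hint stays honest: only genuinely modified pages advertise a fresh modification time.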

Source: https://anythingseo.wordpress.com/2018/10/19/google-tips-crawling-and-indexing-pages/
