How to accelerate re-indexing of your pages
Google watches new domains for a certain period. The period ends after the site's pages have been re-indexed a certain number of times.
Accelerating re-indexing is therefore very important for new sites: it shortens the period in which they get almost no traffic from search engines.
- Google does not trust new websites right away
- Checking the turnover of the index
- What happens when pages are re-indexed
- Accelerating the crawler
Google officially says it does not trust new websites, along the following lines:
- Google does not trust a new website right away
- There is a period during which it checks the status of the website
- This is similar to how people come to trust each other only after a period of talking and drinking together
- If the website is introduced (linked) by a trusted site, Google trusts it sooner
For most websites, it is very hard to get links from sites with strong trust. If that were the only way, the only websites appearing in the search results would be those linked from a handful of authoritative sites.
So I think there must be another way to earn trust through content instead.
My guess is that Google counts the number of times each page is re-indexed, and once the total reaches a certain level, it starts to trust the website.
I think so for the following reasons:
- There are websites that rank without many links or any special links
- Such websites start to rank and get traffic after 3 to 6 months
And the speed at which a site gains trust seems to be related to its crawling speed.
The easiest way to check the turnover of the index is to watch the details of the structured data.
You can check it in Search Console.
Once a piece of structured data is detected, it is checked again and again afterwards. If the crawler stops crawling a page that has structured data and the data goes stale, it is deleted from the list.
On this website, the oldest data is from 25 days ago, so the data seems to expire after about a month.
Structured data is detected when the default (desktop) crawler crawls a page. If the default crawler does not crawl your pages and the smartphone crawler comes far more often instead, you should reorganize your link structure.
The data appears in the list 2 to 3 days after the crawl, so if you see many crawler visits, the list will be updated within a few days.
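One way to see which crawler variant is actually visiting your pages is to count Googlebot hits in your server's access log. Below is a minimal sketch that assumes an Apache/Nginx "combined" log format and the common Googlebot user-agent strings (the smartphone crawler identifies itself with "Mobile"); adjust the regex and markers for your own server's logs.

```python
import re
from collections import Counter

# Sketch: count desktop vs. smartphone Googlebot hits in raw log lines.
# Assumes the Apache/Nginx "combined" log format, where the user agent
# is the last quoted field on each line.
LOG_RE = re.compile(r'"[A-Z]+ \S+ HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

def classify_crawler(ua):
    """Return 'smartphone', 'desktop', or None for non-Googlebot agents."""
    if "Googlebot" not in ua:
        return None
    # The smartphone Googlebot includes "Mobile" in its user-agent string.
    return "smartphone" if "Mobile" in ua else "desktop"

def crawler_counts(lines):
    """Tally Googlebot hits by crawler variant across raw log lines."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m:
            kind = classify_crawler(m.group("ua"))
            if kind:
                counts[kind] += 1
    return counts
```

If the smartphone count dwarfs the desktop count, that is a hint the link structure deserves a look.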
However, the crawler does not detect the data on the first crawl; Google does not analyse the details of pages it considers unimportant.
The first crawl only checks that the page is not spam or thin content. After that, Google crawls other pages and detects the links pointing to the new page.
It therefore takes 4 to 10 days for structured data to be detected. Once a page's data is detected, the page is re-indexed after the previous index is discarded.
When a page is re-indexed, the old index is discarded and a new one is created. This is because each index corresponds to the algorithm the search engine used to analyse the content and links.
The following phenomena then occur:
- The number of pages shown by the site: command decreases
- The number of indexed pages in the Sitemaps section of Search Console decreases
- The number of indexed pages in the Index Status report of Search Console decreases
When you start a new website and add a lot of articles at once, the index count shown by the site: command will drop dramatically after several days. You do not have to worry about this; it simply means many pages are being re-indexed at once.
Re-indexing happens after a page is crawled, so speeding up the crawler shortens the period in which Google does not trust your website.
Generally, that period is about 3 to 6 months. Its length depends on the re-indexing speed, which in turn depends on the crawling speed.
If you have started a new site, I recommend updating its content every day. By "update" I mean the following operations:
- Writing a new blog post or page
- Letting the CMS automatically update the links to those pages
- Adding content to old pages
By doing these operations, the crawler's speed increases after about a month.
Raising the crawl speed during this early growth phase makes the re-indexing count grow faster.
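One practical way to make those daily updates visible to crawlers is to keep the sitemap's last-modified dates fresh. The sketch below rebuilds a sitemap.xml from a list of (URL, last-modified) pairs; the page list and dates are placeholders for whatever your CMS actually tracks.

```python
import datetime
import xml.etree.ElementTree as ET

# Sitemap protocol namespace, per sitemaps.org.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a sitemap.xml string from (url, last_modified_date) pairs,
    so crawlers can see which pages changed and when."""
    ET.register_namespace("", NS)  # emit the sitemap namespace unprefixed
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = url
        ET.SubElement(entry, f"{{{NS}}}lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example pages; in practice, pull these from your CMS.
xml = build_sitemap([
    ("http://example.com/", datetime.date(2015, 4, 1)),
    ("http://example.com/new-post", datetime.date(2015, 4, 2)),
])
```

Regenerating this file whenever a post is written or an old page is edited gives the crawler a cheap, standard signal that there is something new to fetch.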
There is another way to increase the speed. The following graph is from a website that had not been updated for a long time.
Its last update was on April 1, 2015, when I added an HTML sitemap page. The site has many deep pages, and most of them were not reachable within 2 clicks from the top page.
The sitemap page fixed that problem. After the fix, Google detected the change and crawled almost all pages at once. Then nothing happened for about 1 month and 3 weeks.
But after that, the crawling speed increased. Google seems to watch the site's status for that period, and if no problems appear, it starts to crawl the site in earnest.
So the most important factor in crawling speed seems to be the number of important pages, and publishing new pages is how you increase them.
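The HTML-sitemap fix described above can be sketched simply: generate one flat page that links to every URL, so that every deep page becomes reachable within two clicks of the top page. The URL list here is a placeholder; in practice you would pull it from your CMS or from sitemap.xml.

```python
from html import escape

def html_sitemap(urls, title="Site Map"):
    """Render a flat HTML sitemap page linking to every given URL,
    making each one reachable in two clicks from the top page."""
    items = "\n".join(
        f'  <li><a href="{escape(u, quote=True)}">{escape(u)}</a></li>'
        for u in sorted(urls)  # sorted for a stable, scannable page
    )
    return (
        "<!DOCTYPE html>\n"
        f"<html><head><title>{escape(title)}</title></head><body>\n"
        f"<h1>{escape(title)}</h1>\n<ul>\n{items}\n</ul>\n</body></html>"
    )
```

Link this page from the top page's navigation and every deep page inherits a short crawl path from it.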