Created: 2015-05-02

Surmising Google's index types and why the ageing filter occurs

Google has officially stated that the ageing filter and the sandbox effect do not exist, but mechanisms that lead to them do. I think they are caused by index turnover.

So I am running an experiment using this website. The site is young (currently 56 days old), which makes it a good subject for the experiment.

Tools used in the experiment

I used the following tools in the experiment.

  • Google's site command
  • Webmaster Tools
  • Rank check by search result
  • Crawler log check tool on CMS

Google's site command

I used the site command on "google.com" in two ways:

  • Query without a date range
  • Query with a date range

When a date range is specified, the results show pages whose index has history data.

When using it, I paged through all the result pages to count the real number of results, because the estimated count shown on the first page differs from the actual number.
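The paging step can be sketched in code. This is only an illustration of the counting logic; `fetch_page` is a stand-in for however you actually read each result page, not a real API:

```python
# A sketch of why I paged through every result page: the estimated count
# on the first page is unreliable, so count hits page by page instead.
def count_real_results(fetch_page, per_page=10):
    total, page = 0, 0
    while True:
        hits = fetch_page(page)
        total += len(hits)
        if len(hits) < per_page:  # last (partial or empty) page
            return total
        page += 1

# Simulated results: 23 indexed pages spread over pages of 10.
fake_index = [f"/page-{i}" for i in range(23)]
fetch = lambda p: fake_index[p * 10:(p + 1) * 10]
print(count_real_results(fetch))  # 23
```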

Google Webmaster Tools

In Google Webmaster Tools, the following data is available:

  • Structured Data
  • Internal Links
  • Content Keywords

Rank check by search result

I checked the search results manually by typing in keywords. This shows whether variations of the keywords have been analysed or not.

To get traffic from long-tail keywords, it is very important that a page can be hit by many variations of the query keywords.
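As a rough illustration of what "variations" means here: a fully analysed page should rank for different word orders of the same query, not only the exact phrase (real variations also include inflections, which this sketch does not cover). The function name is my own:

```python
from itertools import permutations

def query_variations(keywords):
    """Generate word-order variations of a query to check by hand.

    Illustrative only: a page analysed by the search engine should rank
    for many of these variations, not just the original phrase.
    """
    return sorted({" ".join(p) for p in permutations(keywords)})

variations = query_variations(["google", "index", "types"])
print(len(variations))  # 3! = 6 word orders to check manually
```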

Crawler log check tool on CMS

The crawler's behaviour reflects the search engine's internal analysis. I checked it with the crawler log checker in my CMS.
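For readers without such a CMS tool, the core of a crawler log checker can be sketched as follows. The sample log lines are made up for illustration; a real tool would read the server's actual access log:

```python
import re

# A minimal sketch of what a crawler log checker does: count Googlebot
# hits per URL from ordinary access-log lines.
GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)

def googlebot_hits(log_lines):
    hits = {}
    for line in log_lines:
        if GOOGLEBOT.search(line):
            path = line.split('"')[1].split()[1]  # "GET /path HTTP/1.1"
            hits[path] = hits.get(path, 0) + 1
    return hits

sample = [
    '66.249.66.1 - - [02/May/2015] "GET /article-1 HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [02/May/2015] "GET /img/a.png HTTP/1.1" 200 "Googlebot-Image/1.0"',
    '192.0.2.10 - - [02/May/2015] "GET /article-1 HTTP/1.1" 200 "Mozilla/5.0"',
]
print(googlebot_hits(sample))  # {'/article-1': 1, '/img/a.png': 1}
```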

Index types I surmise and the reason

After a page is crawled, it is indexed. Later it is re-crawled, and the index turns over if the search engine thinks the page is worth it.

The index status starts at level 1 and progresses to higher levels.
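As a summary of the levels described below, the surmised lifecycle can be modelled as a simple state machine. This is only my illustrative model, not anything Google actually exposes:

```python
from enum import Enum

# An illustrative model of the four surmised index levels.
class IndexLevel(Enum):
    FIRST = 1         # first index, used for QDF
    WEAK_HELPER = 2   # weak index after the first one expires
    PRE_ANALYSED = 3  # important page waiting to be analysed
    ANALYSED = 4      # links evaluated; the SEO goal

def next_level(level):
    """Advance one step; level 4 stays until the index is discarded."""
    return IndexLevel(min(level.value + 1, IndexLevel.ANALYSED.value))

print(next_level(IndexLevel.FIRST).name)  # WEAK_HELPER
```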

Level 1 First index

The first index is used when Google finds the page for the first time. The following situations lead to it:

  • Indexed via Fetch as Google
  • Google finds a new page through links and crawls it

This index is for QDF (Query Deserves Freshness). It is stronger than the next index, and it has history data.

But it seems that links to the page are not evaluated yet: the first index is weaker than the finally analysed index, and it cannot cover variations of similar keywords and word forms.

This index disappears after a few days, perhaps 2-3 days. The page then disappears from the site command's results both with and without a date range.

Status of the page

  • It appears in the site command's results both with and without a date range
  • It ranks higher than a page that has not been analysed

Crawler's action

  • Images on the page are crawled just after the first crawl of the page

Level 2 Weak helper index

After the first index's period ends, a new index is created. It is a weak index.

It appears in the site command's results without a date range, but not with one. And the ranking goes down.

If Google judges the page to be important, it crawls it again and again.

Status of the page

  • It appears in the site command's results only without a date range
  • The ranking is lower than before

Crawler's action

  • If the page is important, the crawler comes again and again
  • Images on the page are also crawled repeatedly after each page crawl

Level 3 Index before being analysed

If the page's link status is good and Google thinks the page is important, Google will start to appreciate it. Whether a page is regarded as important is decided by links from already-analysed, important pages.

Therefore, for a website on a new domain, it is important to set an internal link from the top page to each new page.

After reaching this status, the last condition for becoming a more appreciated page is trust. Google trusts aged pages, so all you have to do at this point is wait for a few weeks.

Status of the page

  • It appears in the site command's results only without a date range
  • It appears in the Structured Data section of Webmaster Tools
  • It appears in the Content Keywords section of Webmaster Tools
  • It appears in the Internal Links section of Webmaster Tools

Crawler's action

  • The crawler crawls again and again
  • Images on the page are crawled some days after the page crawl
  • The smartphone crawler (iPhone user agent) comes

Level 4 Analysed index

This status is the page's goal in SEO. Before this status, links to the page are not evaluated.

Finally, the page's value can exceed that of the first QDF index, because the link score is now calculated.

To build this index, the previous index is deleted temporarily. The calculation takes a while, so the website's ranking goes down during that period.

After this index is made, the scores from internal links and external backlinks are calculated.

When the analysed index is discarded

When internal links in the website change, or the site gets a lot of links from external domains, Google re-evaluates the website. The indexes are then discarded all at once.

The ranking goes down in the following situations:

  • Renewing the website
  • Deleting a lot of useless pages
  • Getting a lot of backlinks from external domains
  • Changing links from the top page or from important pages that are linked from every page in the site

Status of the page

  • It appears in the site command's results both with and without a date range
  • It appears in the Structured Data section of Webmaster Tools
  • It appears in the Content Keywords section of Webmaster Tools
  • It appears in the Internal Links section of Webmaster Tools

Crawler's action

  • The crawler crawls periodically
  • The Docomo and Samsung crawlers come

Why a new domain site's ranking moves as if under an ageing filter

A website on a new domain does not appear in the rankings. Most websites cannot get traffic from search engines for 2-3 months.

I think that is because of this index behaviour: it takes a while to be analysed by Google.

But the analysis period depends on how many times the page has been crawled; it is not a fixed period in Google's algorithm. The period differs for each page, and it correlates with how frequently the page is crawled.

How to make crawling speed faster

Crawling speed depends on the following factors:

  • Backlinks from external domains (both follow and nofollow links)
  • The number of pages and amount of content in the site

If you want to speed up crawling, you have to get links from external sites.

Let's use social media's nofollow links

I think you can use social media for this, because even nofollow links can speed up crawling.

Nofollow link from social media

Links from Twitter are nofollow links, but Google watches them. The picture above is a log showing Googlebot coming from a Twitter link. The "/+95" is a redirect URL for an article, used to keep the URL short.

Of course, this URL is used only on Twitter.
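The redirect scheme can be sketched like this. The mapping and the article path are invented for illustration; only the "/+&lt;id&gt;" pattern comes from my log:

```python
# A minimal sketch of the short-redirect scheme described above: a path
# like "/+95" maps to an article URL. The mapping here is made up.
SHORT_MAP = {95: "/articles/google-index-types"}

def resolve_short_url(path):
    """Return (status, location) for a "/+<id>" short path, or a 404."""
    if path.startswith("/+") and path[2:].isdigit():
        article = SHORT_MAP.get(int(path[2:]))
        if article is not None:
            return 301, article
    return 404, None

print(resolve_short_url("/+95"))  # (301, '/articles/google-index-types')
```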

In addition, Google sometimes appreciates nofollow links. The following picture shows nofollow links detected in Webmaster Tools.

Nofollow links detected in Webmaster Tools

Generally speaking, nofollow links are not supposed to be counted. But in practice, they are detected and shown in Webmaster Tools.

Therefore I think a nofollow link from an important page on the same theme is evaluated.

