Created: 2015-03-29

Fewer pages with more information is the best policy in content SEO

The number of indexed pages is one metric used to measure the strength of a website in SEO. But I think search engines prefer a smaller number of pages if the total quantity of information is the same.


Fewer pages raise the rate of deeply analysed pages in the index

The most important thing in SEO is to make search engines analyse web pages deeply. If a website has a lot of pages but only a few of them are analysed after being indexed, the traffic from search engines stays very low.

The robot comes again and again

The analysing process is very heavy for search engines, so they do not analyse pages which seem to be useless.

In order to find pages eligible for analysis, the crawler visits the website again and again. So if the number of pages is small, each page has a higher chance of being evaluated.

Therefore, you have to reduce the number of pages as much as you can.

A page containing a certain quantity of content is appreciated

If the theme of a page's content is too narrow, search engines judge that the page does not have enough information to answer the user's demand.

Therefore you have to cover a theme of a certain width, suitable for the user's demand, within one page.

Then the content volume of a page becomes appropriately large. If you have a theme you want to add to the website, consider whether you should create a new page or add it to a page that already exists.

The analysing speed depends on the number of internal links

In my experience, the speed at which a website is analysed roughly depends on the number of pages in the website.

Number of internal links

Recently I have been checking the number of internal links that are newly detected and removed. That number is related to the crawl budget.

Even when the number of pages whose inbound links are detected does not change, the numbers of detected and removed links change constantly.

A website with a lot of pages has many links, and the search engine has to check them all. When every page repeats common parts that contain many links, it is a heavy burden on search engines, and those links are not highly appreciated.

If the number of pages is small, there are fewer links in those common parts, so the cost of analysing becomes lower.
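
As a rough back-of-the-envelope illustration of that cost (the 50 links per common part and the page counts below are invented numbers, not measurements), this small Python sketch shows how the number of link checks scales with the page count:

    # Hypothetical numbers: every page carries the same shared block
    # (e.g. navigation) with 50 links the crawler must re-check each time.
    COMMON_LINKS_PER_PAGE = 50

    def common_link_checks(page_count):
        """Links the crawler has to check just for the shared page parts."""
        return page_count * COMMON_LINKS_PER_PAGE

    for pages in (100, 1_000, 10_000):
        print(f"{pages:>6} pages -> {common_link_checks(pages):>7} link checks")
    # 100 pages need 5,000 checks; 10,000 pages need 500,000 checks for the
    # very same shared block, so the smaller site is far cheaper to analyse.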

Indexing process and depth of analysis

The search engine does not analyse all of the pages after indexing them. In order to be shown in the SERPs, a page's index has to be analysed deeply after that. Indexing is only the first phase of the analysis.

After the robot comes, the following things are done after the crawl.

Parse HTML and detect keywords

First, after the crawler fetches the HTML file from the server, it parses the HTML and detects all of the keywords. In addition to the keywords, it detects meta tags and outbound links.

In this phase, the elements in the HTML are just tokenized, not analysed.
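
As an illustration only, this is my own minimal Python sketch of what such a tokenizing step could look like; it is not how any real search engine is implemented, and the sample HTML is made up:

    from html.parser import HTMLParser

    class PageTokenizer(HTMLParser):
        """Collect raw tokens from a page: text words, meta tags and links."""

        def __init__(self):
            super().__init__()
            self.keywords = []
            self.meta_tags = []
            self.links = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta":
                self.meta_tags.append(attrs)        # e.g. description, robots
            elif tag == "a" and "href" in attrs:
                self.links.append(attrs["href"])    # outbound link candidates

        def handle_data(self, data):
            # Just split the visible text into tokens; no analysis happens here.
            self.keywords.extend(data.split())

    tokenizer = PageTokenizer()
    tokenizer.feed('<meta name="description" content="example page">'
                   '<a href="/about">About this site</a>')
    print(tokenizer.keywords, tokenizer.meta_tags, tokenizer.links)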

Detect outbound links

After the HTML file is tokenized, the links are checked roughly and the information is stored in the database.

In this phase, the search engine just checks whether the same links appear in a page or not; it does not analyse the links.
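
Again only as an illustration, such a rough duplicate check of the links found in one page could be as simple as the sketch below; an in-memory set stands in for the real database, and the URLs are invented:

    def unique_links(detected_links):
        """Rough check: which distinct links does this page contain?"""
        seen = set()
        for href in detected_links:
            seen.add(href.rstrip("/"))  # very crude URL normalization
        return seen

    print(unique_links(["/about", "/about/", "/contact"]))
    # -> {'/about', '/contact'}: the duplicated link is counted only once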

Detect Structured Data

If the page was judged eligible for analysis during the last analysis of this page, the search engine detects and analyses the structured data.

[Screenshot: detected structured data]

The structured data is not analysed on the first crawl.

Checking the structured data is an effective way to see how the search engine (mainly Google) evaluates the website.

In most cases, web pages whose structured data is detected tend to have both their internal links and their content keywords analysed.
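
For illustration, the sketch below pulls JSON-LD blocks (one common way structured data is embedded in a page) out of the HTML; this is my own toy example, not a claim about Google's actual process:

    import json
    import re

    JSON_LD = re.compile(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        re.DOTALL | re.IGNORECASE)

    def extract_json_ld(html):
        """Return the JSON-LD structured-data blocks embedded in a page."""
        blocks = []
        for raw in JSON_LD.findall(html):
            try:
                blocks.append(json.loads(raw))
            except ValueError:
                pass  # ignore malformed blocks
        return blocks

    sample = ('<script type="application/ld+json">'
              '{"@type": "Article", "headline": "Least pages"}</script>')
    print(extract_json_ld(sample))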

Analyse detected links

After crawling and parsing the HTML file, the search engine analyses the links on the page. This process is also not done for all pages. Only when the search engine appreciates the page does it analyse the links on it.

Even then, it does not analyse all of the outbound links on the page; it analyses only the links whose destination page is appreciated.

Therefore, this is also not done on the first crawl.

In my case with a new domain, the links to the top page were detected first.

Analyse page theme from keywords and links

Once the links to a page are detected, search engines analyse the theme of the page from its keywords and inbound links.

When this phase is done, the detected links are shown in Webmaster Tools.

[Screenshot: internal links detected]

When a page is listed here, it means the page is appreciated. As this list grows, the rank of the keywords goes up.

Analyse canonical and meta tags

The canonical tag merges a page into another page. It is handled only after the page has been analysed to a certain level, because the content of the source page has to be almost the same as that of the destination page.

Therefore, pages that have a canonical tag are sometimes shown with a duplicated title and description warning, but the warning disappears after this phase is finished.
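
For reference, the canonical relationship discussed here is declared with a link element in the page head. The small sketch below just extracts that target; the URL is made up:

    from html.parser import HTMLParser

    class CanonicalFinder(HTMLParser):
        """Find the rel=canonical target of a page, if it declares one."""

        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

    finder = CanonicalFinder()
    finder.feed('<link rel="canonical" href="https://example.com/least-pages/">')
    print(finder.canonical)  # the page this one asks to be merged into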

Scoring the entire website

After analysing some pages of the website, the search engine evaluates the score of the entire site. It scores again and again, continuously, because it cannot evaluate all pages at once.

The value of a page depends on the content itself and on the relationships between pages.

In this phase, duplicated content is detected.
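
One classic way to detect duplicated content is to compare word shingles between pages. The sketch below uses Jaccard similarity purely as an illustration; I do not claim this is how any particular search engine actually does it:

    def shingles(text, size=3):
        """Break text into overlapping word n-grams ('shingles')."""
        words = text.lower().split()
        return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

    def jaccard(a, b):
        """Overlap of two shingle sets: 1.0 identical, 0.0 nothing shared."""
        return len(a & b) / len(a | b) if (a | b) else 0.0

    page_a = "fewer pages with more information is the best policy"
    page_b = "fewer pages with more information is a good policy"
    print(round(jaccard(shingles(page_a), shingles(page_b)), 2))
    # the closer the score is to 1.0, the more likely the pages are duplicates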

Keeping the website's quality high leads to a low cost of evaluation

A large page volume used to be a very effective way to rank up and get traffic from long-tail keywords, but it does not work any more.

The kind of website that is strong in SEO now is the opposite of the old kind. That is because search engines can understand semantic information better than before, and they can estimate the quantity of information, which is not the same as content volume.

Therefore, SEO means building a website with compact but content-rich pages. That makes evaluation rapid and lets search engines analyse almost all of the web pages.

