Created: 2015-05-05

Internal Link Strategy for new websites

For a young website with a new domain, it is very important that the pages in the site get analysed deeply. The new site has to get a certain level of initial traffic via search engines in order to attract natural backlinks.

In order to get that traffic, the new website's owner has to make most of the pages be regarded as important. Search engines decide this from how the pages are linked.

The 2 clicks rule is essential for new sites

The 2 clicks rule is essential for new sites, especially ones using new domains, because search engines do not evaluate pages that sit far from the top page.

What is the 2 clicks rule

The 2 clicks rule means that every page can be reached within 2 clicks from the top page, or indeed from any other page.

Generally speaking, some people call it the 3 clicks rule, but I recommend the 2 clicks rule for new website owners, because the most important thing for a new website is that most of its pages get analysed.

If a site has a lot of pages and a lot of backlinks, the 3 clicks rule is fine, but most young websites have neither.
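If you want to check the rule mechanically, the following sketch may help. It assumes you can list your pages and their internal links (the URLs and the link graph here are hypothetical), and it does a breadth-first search from the top page to report any page that needs more than 2 clicks.

    from collections import deque

    # Hypothetical internal link graph: each key is a page, each value
    # is the list of pages it links to.
    links = {
        "/": ["/category-a/", "/category-b/", "/sitemap/"],
        "/sitemap/": ["/category-a/post-1/", "/category-a/post-2/", "/category-b/post-1/"],
        "/category-a/": ["/category-a/post-1/", "/category-a/post-2/"],
        "/category-b/": ["/category-b/post-1/"],
        "/category-a/post-1/": [],
        "/category-a/post-2/": [],
        "/category-b/post-1/": [],
    }

    def click_depths(graph, start="/"):
        """Breadth-first search: how many clicks each page needs from the top page."""
        depth = {start: 0}
        queue = deque([start])
        while queue:
            page = queue.popleft()
            for target in graph.get(page, []):
                if target not in depth:
                    depth[target] = depth[page] + 1
                    queue.append(target)
        return depth

    depths = click_depths(links)
    too_deep = [p for p in links if depths.get(p, float("inf")) > 2]
    print("Pages deeper than 2 clicks:", too_deep or "none")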

Thinking about normal PageRank

Google in particular uses PageRank. Currently, Topical PageRank is the well-known variant, but it still contains the normal PageRank inside it.

Before a page's topic is analysed, Google uses the normal PageRank to decide whether it is worth spending the cost of analysing the page.

If the PageRank is too low, it does not analyse the page. Therefore you have to build a link structure that does not leave any page with too low a rank.

How to avoid pages with low PageRank

PageRank is calculated from the link topology. Many inbound links to a page, especially links coming from pages with few outbound links, lead to a high score.

The total amount of PageRank across the site is constant, so pages with high PageRank absorb much of the value, and little remains for the other pages.

Therefore it is very important not to concentrate too much PageRank on a few pages.
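To make the mechanism concrete, here is a minimal sketch of the classic PageRank iteration over a small hypothetical link graph. It is a simplified textbook model, not Google's actual implementation, but it shows the two points above: the total rank stays constant, and pages that receive many links from pages with few outbound links end up with higher scores.

    # Simplified PageRank power iteration (illustration only).
    # Hypothetical link graph: page -> pages it links to.
    links = {
        "top": ["cat-a", "cat-b"],
        "cat-a": ["top", "post-1", "post-2"],
        "cat-b": ["top", "post-3"],
        "post-1": ["cat-a"],
        "post-2": ["cat-a"],
        "post-3": ["cat-b"],
    }

    def pagerank(graph, damping=0.85, iterations=50):
        pages = list(graph)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outlinks in graph.items():
                if not outlinks:  # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / len(pages)
                else:
                    share = rank[page] / len(outlinks)  # fewer outbound links -> bigger share per link
                    for target in outlinks:
                        new_rank[target] += damping * share
            rank = new_rank
        return rank

    ranks = pagerank(links)
    print("total:", round(sum(ranks.values()), 3))  # stays at 1.0
    for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
        print(f"{page:8s} {score:.3f}")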

Put links on the pages that are linked to many times

To avoid that, the pages linked from the top menu should carry a lot of links to pages lower in the page hierarchy.

For example, the top category pages, which are the 2nd-level pages just below the top page, should link to all of their subcategories and their child pages.

Create an HTML-based sitemap page


In addition to setting links on the top category pages, making an HTML sitemap page is an effective technique. A sitemap page is recommended in Google's webmaster guidelines.

The sitemap page has links to all pages, and it is linked from all pages. This website also has a sitemap page.

With a sitemap page, any page can be reached from any other page within 2 clicks.
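As a minimal sketch (the page list and the output file name are made up for illustration), an HTML sitemap page can be generated from a flat list of URLs and titles like this:

    import html

    # Hypothetical list of (URL, title) pairs covering every page on the site.
    pages = [
        ("/", "Home"),
        ("/category-a/", "Category A"),
        ("/category-a/post-1/", "Post 1"),
        ("/category-b/", "Category B"),
        ("/category-b/post-1/", "Post 1"),
    ]

    # One <ul> with a link to every page; link this file from every page in turn.
    items = "\n".join(
        f'  <li><a href="{html.escape(url)}">{html.escape(title)}</a></li>'
        for url, title in pages
    )
    sitemap_html = f"<h1>Sitemap</h1>\n<ul>\n{items}\n</ul>\n"

    with open("sitemap.html", "w", encoding="utf-8") as f:
        f.write(sitemap_html)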

Linking from the top page is the most effective way

For new websites, a link from the top page is a very effective way to make Google analyse a page's content after it is crawled.

On this website, the latest 10 blog articles and all of the main content pages are linked from the top page.

A young website has only a few pages; it usually takes some months before the page count exceeds 100. The top page can hold 200 links, which means it is possible to link to every page from the top page.

Once the number of pages increases and the top page starts to look cluttered, you can reduce the links on it.

On this website, the following kinds of links are placed on the top page.

Blog posts


Links to the blog posts are on the top page. In this section, the latest 10 entries are listed.

Entries beyond the latest 10 disappear from the top page, but they remain linked from some of the main pages.

Once a page has been regarded as important, that appreciation seems to remain even after the link from the top page disappears.

Main contents


This website's top page has links to all of the main contents. Therefore every main content page can be reached within just 1 click.

And the pages they link to can be reached within 2 clicks.

Link hierarchy and logical directory structure

In addition to the link hierarchy, Google seems to be able to detect the logical directory structure. I say this because it frequently crawls the directories that have child pages.

(Screenshot: Googlebot's access log)

The logical structure seems to be detected from the following elements of the website.

  • Breadcrumb list
  • Links inside structured list tags ("ul", "ol", and "li")
  • URLs

If you also link to the grandchild pages, the directory hierarchy is calculated correctly.

Most websites have a breadcrumb list. This is an easy and effective method.

In addition to the breadcrumb list, I recommend using structured list tags, which teach the structure to the search engine. On this site, list tags with links are used in the following web parts (a small sketch follows the list).

  • Sitemap page
  • Links from the parent page
  • The navigation area on the right of the page
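Here is a minimal sketch of the idea, with a hypothetical helper function and hypothetical URLs: both the breadcrumb trail and the child-page links are rendered with list tags, so the page hierarchy is explicit in the markup.

    import html

    def link_list(pages, tag="ul"):
        """Render links inside structured list tags (ul/ol + li)."""
        items = "\n".join(
            f'  <li><a href="{html.escape(url)}">{html.escape(title)}</a></li>'
            for url, title in pages
        )
        return f"<{tag}>\n{items}\n</{tag}>"

    # Breadcrumb trail from the top page down to the current page (ordered, so use <ol>).
    print(link_list([("/", "Home"),
                     ("/category-a/", "Category A"),
                     ("/category-a/post-1/", "Post 1")], tag="ol"))

    # Links from a parent page to its child pages.
    print(link_list([("/category-a/post-1/", "Post 1"),
                     ("/category-a/post-2/", "Post 2")]))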

How to check if the link structure is correct

When you start a new website with a new domain, you will want to know whether the website is being analysed correctly. The structured data report in Google Webmaster Tools is useful for checking which pages Google appreciates.

Check by structured data

This website has the following structured data.

  • Breadcrumb (the breadcrumb list)
  • Article
  • BlogPosting

The Article and BlogPosting types are especially useful.

The main content pages have Article data, and the blog article pages have BlogPosting data. The sum of the two is the total number of appreciated pages.
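If you want a rough count of your own, a script like the sketch below can scan a list of pages for Article and BlogPosting markup. The URLs are placeholders, and the regular expressions are only a crude approximation; a real check would use a proper microdata / JSON-LD parser or the Webmaster Tools report itself.

    import json
    import re
    import urllib.request

    # Hypothetical list of page URLs to check (in practice, take them from the sitemap page).
    urls = [
        "https://example.com/",
        "https://example.com/blog/post-1/",
    ]

    MICRODATA = re.compile(r'itemtype="https?://schema\.org/(Article|BlogPosting)"')
    JSON_LD = re.compile(r'<script[^>]*application/ld\+json[^>]*>(.*?)</script>', re.S | re.I)

    counts = {"Article": 0, "BlogPosting": 0}
    for url in urls:
        body = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
        found = set(MICRODATA.findall(body))
        for block in JSON_LD.findall(body):
            try:
                data = json.loads(block)
            except ValueError:
                continue
            for item in data if isinstance(data, list) else [data]:
                if isinstance(item, dict) and item.get("@type") in counts:
                    found.add(item["@type"])
        for schema_type in found:
            counts[schema_type] += 1

    # The sum of the two counts is the number of pages carrying the markup.
    print(counts, "total:", sum(counts.values()))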

In the graph, the number of structured data items rose after 4/2/2015. That is just after I put the sitemap page on the website and set links to all the main contents on 4/1/2015.

Before that, the structured data on the main content pages had not been detected on most of them. After that, Google detected the data rapidly.

