Pages with the Algorithm tag
Adding content is effective for SEO, but simply adding web pages is not. That is because recent search engines can assess the quantity of information on a page and ignore pages that contain little information.
Search engines judge the quality of a website by its quantity of information
Recent search engines have become better at understanding natural sentences than before. They can distinguish duplicated semantics from original semantics.
Automatically generated content
Most automatically generated content is detected by search engines. In "Be not afraid of big competitors on keywords research", I wrote that "Google does not show all pages in the site command".
Many websites have thousands of pages, but in most cases fewer than 400 pages are actually recognised as targets of search results.
That means Google eliminates the pages which have less original information than the other pages.
Google says the length of the URL has nothing to do with importance, but that the depth of the URL structure does matter, even though it is not a ranking factor.
On this page, I'll describe how a content management system can take measures for this.
The conditions a URL must fulfil
We can name a URL freely, but we have to consider the following things.
Length of the URL
The URL length should be less than 2,000 characters. That is long enough, and there will rarely be any opportunity to set such a long URL.
If the system automatically converts the title into the URL, the result will usually stay under 200 characters.
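For example, a CMS might derive the URL slug from the page title like this. This is a minimal sketch; the helper name and the 200-character cap are my assumptions based on the figures above.

```python
import re
import unicodedata

MAX_SLUG_LENGTH = 200  # rule-of-thumb cap from the paragraph above (assumption)

def title_to_slug(title: str) -> str:
    """Convert a page title into a URL slug capped at MAX_SLUG_LENGTH."""
    # Drop accents, lowercase, and collapse non-alphanumeric runs into hyphens.
    ascii_title = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    slug = re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")
    return slug[:MAX_SLUG_LENGTH].rstrip("-")

print(title_to_slug("The conditions a URL must fulfil"))
# -> the-conditions-a-url-must-fulfil
```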
Structured data is one of the metrics for checking the health of a website. It seems to be extracted before the index is created.
Structured data is a good tool to check a website's health on Google
Structured data is detected and shown in Google Search Console.
By watching this data, we can confirm that the URLs detected on the page are also shown in the Content Keywords and Internal Links reports.
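For illustration, here is a minimal sketch of the kind of structured data Search Console can detect, generated as JSON-LD with Python. All of the values are hypothetical.

```python
import json

# Minimal Article markup using the schema.org vocabulary; the values are made up.
structured_data = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Structured data is a good tool to check a website's health",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2016-01-01",
}

# The output goes into the page head inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(structured_data, indent=2))
```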
Google does not analyse all of a website's pages; it analyses only the pages which are approved as important.
Checking which pages are approved is an effective way to check the health of a website's SEO status. If pages that are important to you are not analysed, you had better change the internal link structure or add more original content to the lower-hierarchy pages.
Originality of content is an important factor in SEO, and we have to know how originality is evaluated.
A lot of original content increases a website's value
In addition to writing a lot of content on the website, the content should have originality. Search engines appreciate original content.
The originality search engines appreciate
Originality does not simply mean content that is not copied. Search engines now detect the semantics of sentences. Therefore, even if the word order and wording differ, content whose meaning is almost the same is not recognised as original.
Originality of the content
Originality simply means that there is little or no other similar information. Making original content is very easy, because it has to fulfil only one condition: no other information is the same as it.
Originality itself does not have value; it is an attribute of the content.
Google watches sites on new domains for a certain period. The period ends after the web pages have been re-indexed a certain number of times.
Therefore, accelerating re-indexing is very important for new sites, in order to get through the period of almost no traffic from search engines.
Google does not trust a new website right away
Google officially says they do not trust new websites, stating the following:
- Google does not trust a new website right away
- There is a period for checking the status of the website
- This is like how a person trusts another only after a period of talking and drinking together
- If the website is introduced by a trusted site, Google trusts it
For most websites, it is very hard to get linked from highly trusted websites. If that were the only way, the websites appearing in search results would all be ones linked from authority sites.
Therefore, I think there is another way: getting trusted through content instead.
When you start a new website, what you have to do is get the website approved by search engines. The most important factor for new websites is trust.
Increase the trust of the website first
A new website is generally not ranked by search engines. That is because it does not have trust.
Trust is for the domain, not for pages
The trust score is given to the domain, not to pages. While the domain's trust score is low, the pages in the domain cannot rank, however excellent their content is.
Therefore, the first thing you have to do is increase the trust score.
Build historical data to be trusted
Google has the following patent.
Sometimes the ranking on Google drops even if you do white-hat SEO. The sandbox effect occurs when you add content or acquire new links.
It looks like a penalty, but it is not. You do not have to respond to it; just wait for re-indexing.
When the ranking drops
When Google re-evaluates the website, the ranking can drop dramatically. In most cases, the theme of the entire content is relevant to this.
In the following situation the entire theme is changed or extended, and the website may then be re-analysed.
A new category and pages with a new theme are added
When you write blog articles, you will sometimes extend the categories of the website in order to add new themes.
For a young website with a new domain, it is very important that the pages in the site are analysed deeply. The new site has to get a certain level of initial traffic via search engines in order to attract natural backlinks.
In order to get traffic, the new website's owner has to make most of the pages be regarded as important ones. Search engines decide this from the link structure.
The 2 clicks rule is essential for new sites
The 2 clicks rule is essential for new sites, especially ones using new domains, because search engines do not evaluate pages which are far from the top page.
What is the 2 clicks rule
The 2 clicks rule is that every page can be reached within 2 clicks from the top page (or, in a stricter reading, from any page).
Generally speaking, some people call it the 3 clicks rule, but I recommend the 2 clicks rule for new website owners, because the most important thing for a new website is that most of its pages can be analysed.
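As a minimal sketch, the rule can be checked with a breadth-first search over the internal links. The page map below is a made-up example, not a real site.

```python
from collections import deque

# page -> internal links found on that page (hypothetical structure)
site = {
    "/": ["/category/", "/about/"],
    "/category/": ["/article-1/", "/article-2/"],
    "/about/": [],
    "/article-1/": ["/article-2/"],
    "/article-2/": [],
}

def click_depths(site, start="/"):
    """Return the minimum number of clicks from `start` to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in depth:
                depth[link] = depth[page] + 1
                queue.append(link)
    return depth

too_deep = [p for p, d in click_depths(site).items() if d > 2]
print(too_deep or "every page is within 2 clicks of the top page")
```

Pages that never appear in the result are unreachable by internal links at all, which is an even stronger signal to fix the link structure.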
Google has officially said that the ageing filter and the sandbox effect do not exist, but mechanisms that lead to them do. I think they are caused by index turnover.
So I am running an experiment using this website. The site is young, currently 56 days old, so it is in good condition for the experiment.
Tools used in the experiment
I used the following tools in the experiment.
- Google's site command
- Webmaster Tools
- Rank checking via search results
- The crawler log check tool in the CMS
Google's site command
I use the site command on "google.com" in 2 ways.
- Query without a date range
- Query with a date range
When a date range is input, it shows the pages that were indexed within that period.
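For illustration, the two queries can be written as URLs like this. example.com is a placeholder, and as far as I know the tbs=cdr parameter is the URL form of the "Custom range" option in the search tools menu.

```
# Pages indexed, no date restriction:
https://www.google.com/search?q=site:example.com

# Pages indexed within a custom date range
# (equivalent to Tools > Any time > Custom range on the results page):
https://www.google.com/search?q=site:example.com&tbs=cdr:1,cd_min:1/1/2016,cd_max:1/31/2016
```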
When the keyword "SEO" was born, most topics about it concerned backlinks. But nowadays, search engines have evolved so that they can evaluate web content by other factors.
What is Topical PageRank
Topical PageRank is an extension of PageRank. Plain PageRank has nothing to do with the theme or topics of the content; it is calculated only from the topology of the link network.
Topical PageRank, in contrast, is deeply related to the page's theme.
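As an illustration, here is a minimal sketch of a topic-biased PageRank in the spirit of Topic-Sensitive PageRank: the random surfer teleports only to pages labelled with the topic, so each topic produces its own score vector. The toy graph and topic set are made up, and this is not the exact algorithm from the paper below.

```python
DAMPING = 0.85
ITERATIONS = 50

links = {                        # page -> pages it links to (toy graph)
    "home": ["seo", "cooking"],
    "seo": ["links", "home"],
    "links": ["seo"],
    "cooking": ["home"],
}

def topical_pagerank(links, topic_pages):
    """PageRank whose teleport jumps land only on pages of one topic."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    teleport = {p: (1.0 / len(topic_pages) if p in topic_pages else 0.0)
                for p in pages}
    for _ in range(ITERATIONS):
        new = {p: (1 - DAMPING) * teleport[p] for p in pages}
        for page, outs in links.items():
            share = DAMPING * rank[page] / len(outs)
            for target in outs:
                new[target] += share
        rank = new
    return rank

# Pages on the "seo" topic pull rank towards themselves:
print(topical_pagerank(links, topic_pages={"seo", "links"}))
```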
A paper about Topical PageRank
There is a paper about testing Topical PageRank. By reading it, we can learn about its characteristics.
Overview of the system