Why penalties and filters exist in Google SEO
Search engines have penalty and filter mechanisms that lower the ranking value of websites. These mechanisms exist because search engines need them.
Once you know why they exist, you can live with SEO penalties and filters.
Penalties and filters are alike in that a web page gets evaluated as less valuable than it actually is, but the purpose behind the low evaluation is different.
An SEO penalty occurs when you do things like the following:
- Setting a lot of useless links
- Systematically generating too much content
- Updating web pages too frequently for SEO
When a penalty occurs, the search engine has judged that you are deliberately doing malicious SEO. There are automatic penalties and manual penalties, and both give the website negative points.
Automatic algorithm penalty
Algorithm penalties are applied to websites automatically. The major ones are called Penguin and Panda: Penguin updates target links, and Panda updates target content.
To recover from an algorithmic penalty, you have to fix the website and then wait for the next algorithm update.
Manual penalty
A manual penalty is applied after a human review, and it is shown in Search Console. To recover from it, you have to send a reconsideration request with an explanatory message through Search Console.
A filter, by contrast, ignores the value of the analysed content, or the content itself. It does not give the website negative points; it simply ignores things. There are two kinds of filters.
Do not analyse the page
After crawling and indexing a page, Google does not evaluate the details of the page's content and links.
Filter the pages
After analysing and evaluating pages, Google ranks them, and then lowers the scores of the filtered pages.
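The two filter types can be sketched as stages in a toy ranking pipeline. This is only a conceptual illustration, not Google's actual system; every field name, threshold, and formula below is invented.

```python
# Toy sketch of the two filter types described above.
# NOT Google's real pipeline; all names and thresholds are invented.

def rank_pages(pages, trust_threshold=0.5, filter_damping=0.2):
    """Score pages, skipping deep analysis for untrusted ones and
    damping the scores of filtered ones."""
    results = []
    for page in pages:
        if page["site_trust"] < trust_threshold:
            # Filter type 1: crawl and index, but skip deep analysis.
            score = 1.0  # baseline score from shallow signals only
        else:
            # Deep analysis: evaluate content and links in detail.
            score = 1.0 + page["content_quality"] + page["link_value"]
            if page["flagged"]:
                # Filter type 2: rank normally, then lessen the score.
                score *= filter_damping
        results.append((page["url"], score))
    # Higher score ranks first.
    return sorted(results, key=lambda r: r[1], reverse=True)

pages = [
    {"url": "a.example", "site_trust": 0.9, "content_quality": 2.0,
     "link_value": 1.0, "flagged": False},
    {"url": "b.example", "site_trust": 0.9, "content_quality": 2.0,
     "link_value": 1.0, "flagged": True},   # filtered after analysis
    {"url": "c.example", "site_trust": 0.1, "content_quality": 3.0,
     "link_value": 2.0, "flagged": False},  # never deeply analysed
]
print(rank_pages(pages))
```

Note that `c.example` has the best content of the three, but its score never reflects that, because the first filter kept its content from being analysed at all.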
Penalties are applied to the abuse of effective SEO techniques. Put another way, only effective methods are targets of penalties.
In most cases, if you build websites seriously for humans, penalties have nothing to do with you. But occasionally you may take a penalised action by accident, so you should know about them just in case.
The following signals carry a risk of penalty:
- Unnatural links
- Unnatural co-occurring keywords
- Too many optimised keywords
Duplicated content is not a target, because there is a lot of duplicated content on the web and some of it is necessary.
The purpose of penalties is to stop the abuse of effective SEO signals.
Recently, however, Google has come to detect such abuse with artificial intelligence, so some signals that used to be penalised are now simply ignored.
Link penalties in particular seem to be decreasing.
If Google can classify effective signals into natural and unnatural ones, it only has to ignore the unnatural signals.
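The idea of ignoring unnatural signals instead of penalising them can be sketched like this. The classification rule here is a made-up stand-in; real detection is far more sophisticated.

```python
# Toy illustration of "ignore unnatural signals instead of penalising".
# The "natural" rule below is invented purely for illustration.

def score_links(links):
    total = 0.0
    for link in links:
        # Hypothetical rule: links from pages about the same topic
        # count as natural; off-topic, paid-looking links do not.
        natural = link["same_topic"] and not link["paid"]
        if natural:
            total += link["value"]
        # Unnatural links contribute nothing: no bonus, no minus.
    return total

links = [
    {"same_topic": True,  "paid": False, "value": 2.0},  # counted
    {"same_topic": False, "paid": True,  "value": 5.0},  # ignored
]
print(score_links(links))  # the unnatural link neither helps nor hurts
```

The key point is in the loop: there is no `total -= ...` branch. An ignored signal is simply absent from the calculation, which is cheaper and safer than assigning negative points.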
Penalties are reserved for signals that would cause Google problems in calculating rankings if website owners kept producing them.
Google dislikes being reverse engineered, because if website owners understood the ranking algorithm, rankings could be manipulated.
In fact, Google does not evaluate a web page and apply its score immediately. If the score changed right away, the causal relationship between a change and its ranking effect would quickly become apparent.
Therefore you must not change code and content too frequently for SEO alone. The crawler fetches web pages periodically and checks that updates are not attempts at reverse engineering.
Scores are applied only after Google understands that the website is for humans, not for reverse engineering or purely for SEO.
A filter is not a penalty, but it also lowers the score of a website and its pages.
When a website is first created, Google does not trust it right away. There are many bad SEO companies, and many websites built not for humans but only for SEO, and Google does not appreciate SEO-only websites.
Once Google understands that a website is for humans, it trusts the site to a certain level and removes the filter.
There are an enormous number of websites on the internet. Google is a huge company with huge, high-performance servers, but even that is not enough to analyse every page of every website in the world, so Google has to choose which pages to analyse.
Pages and websites that seem unimportant are then not deeply analysed.
Several years ago, Google had a supplemental index, a lesser index that supported the regular index. That mechanism is no longer used, but the concept is still there.
This analysis filter is the one that occurs most often. Most new websites are caught by it, and that cannot be helped.
So when you start a new website, the first thing to do is get this filter removed. To do that, you have to do the following:
- Write an ample quantity of high-quality content on the website
- Publish articles continuously
- Consider the internal link structure
- Wait 3 to 6 months without doing anything spammy
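As a rough sketch of the internal-link point above, a small script can check for orphan pages that no other page links to; such pages are hard for crawlers to discover. The page paths and link map below are placeholders, not any real site.

```python
# Minimal internal-link check: find orphan pages that no other
# page links to. The site map below is a made-up example.

internal_links = {
    "/":           ["/about", "/articles/1"],
    "/about":      ["/"],
    "/articles/1": ["/", "/articles/2"],
    "/articles/2": ["/articles/1"],
    "/articles/3": [],  # published but never linked to
}

# Every page that appears as a link target somewhere on the site.
linked_to = {target for targets in internal_links.values() for target in targets}

# Pages (other than the homepage) that nothing links to.
orphans = [page for page in internal_links
           if page not in linked_to and page != "/"]
print(orphans)  # pages a crawler is unlikely to reach via internal links
```

Fixing the structure then just means adding a link from a relevant page (here, linking `/articles/3` from `/articles/2` would empty the orphan list).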