Websites with fewer than 100 visits per day need improvement
If you cannot reach 100 sessions per day even though you publish pages with useful content and the site has more than 100 pages, you may suspect an SEO penalty.
But if you cannot find an obvious penalty, consider the following facts.
If your website is very young and less than three months have passed, this is a normal situation.
New websites are not ranked for frequently queried keywords until three to four months have passed, because the domain and the website have not yet earned the search engines' trust.
During this period, there is nothing you can do to gain traffic through SEO; it is a period of preparation for what comes after those three to four months.
What you can do is write content for the website. Link building is not necessary in this period; even if you build links, they will not affect your ranking.
If your website is more than four months old and has enough content, check the following points.
Content has to be analysed after it is indexed; being indexed is not enough for a page to rank.
When a page is evaluated favorably, the search engines switch to a different algorithm to evaluate it: they remove the page from the index at once, and re-index it after evaluating it with the new algorithm.
Before the content itself is analysed, the evaluation depends on links, both internal and external, because search engines cannot judge a page's content before analysing it.
To address this, improve your internal links so that every page can be reached within two clicks from the top page (the "two-click rule").
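One rough way to check the two-click rule is to model your internal links as a graph and compute each page's click depth from the top page with a breadth-first search. The sketch below uses a made-up link map; in practice you would build it from a crawl of your own site.

```python
from collections import deque

# Hypothetical internal link map: page -> pages it links to.
links = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/post-1", "/blog/post-2"],
    "/products/": ["/products/widget"],
    "/blog/post-1": [],
    "/blog/post-2": ["/blog/post-2/detail"],
    "/products/widget": [],
    "/blog/post-2/detail": [],
}

def click_depths(start="/"):
    """Breadth-first search from the top page; depth = clicks needed."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths()
# Pages deeper than two clicks violate the two-click rule.
too_deep = [page for page, d in depths.items() if d > 2]
print(too_deep)  # ['/blog/post-2/detail']
```

Any page that shows up in `too_deep` is a candidate for an extra internal link from a page closer to the top.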
I sometimes find websites that use no headings in their content: there is a lot of text, but no headings to split it into sections.
Search engines manage content by the sections of a page, and the hx tags (h1, h2, h3, and so on) are used to detect those sections.
If you are not using them, start using them.
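To see how your page looks as sections, you can list its heading tags. Here is a minimal sketch using Python's standard-library HTML parser; the sample HTML is invented for illustration.

```python
from html.parser import HTMLParser

class HeadingLister(HTMLParser):
    """Collects (tag, text) pairs for h1..h6 elements."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self._current = tag

    def handle_data(self, data):
        if self._current:
            self.headings.append((self._current, data.strip()))
            self._current = None

html = """
<h1>How to use the widget</h1>
<p>Long explanation...</p>
<h2>Setup</h2>
<p>More text...</p>
<h2>Daily use</h2>
"""

parser = HeadingLister()
parser.feed(html)
print(parser.headings)
# [('h1', 'How to use the widget'), ('h2', 'Setup'), ('h2', 'Daily use')]
```

If the output for a long page is empty or has only one entry, the page probably needs sectioning.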
This is a rare case, but if your web page contains none of the keywords users actually type into the search box, it will never be found.
It sometimes happens when you write about your own product: when the product is highly original and the page mainly describes its specifications, the keywords people actually query are not included.
To improve this, after writing the content, add topics such as how to use the product and the situations where it is needed. Frequently queried keywords will then be included naturally.
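A quick way to verify that the queries you care about actually appear in a page is a simple text search. The page text and query list below are made-up examples; in practice you would take the queries from your search analytics data.

```python
# Invented product page text: heavy on specifications,
# light on the words people actually search for.
page_text = """
The XW-9 has a 2,000 mAh battery, weighs 80 g, and pairs over Bluetooth.
""".lower()

# Hypothetical queries users type, e.g. taken from search analytics.
queries = ["wireless earphones", "battery life", "bluetooth"]

missing = [q for q in queries if q not in page_text]
print(missing)  # ['wireless earphones', 'battery life']
```

Each missing query is a hint for an extra paragraph to write, such as a usage scenario that mentions it naturally.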
This is a big pitfall for people who like technical SEO.
Analysing a page's content, and the website as a whole, takes longer than we think, so we have to keep the state of the web content stable.
If the state changes, search engines may decide to reset the analysed data and re-analyse everything.
The title is the most important factor in deciding the theme of a page's content, so when it changes, the theme has to be analysed again.
Changing the titles of a few pages is not a problem, because the search engine can catch up when the cost of re-analysing is low. But if you change the titles of many pages, the cost becomes huge and the statistical profile of the website changes dramatically; the website's evaluation will be reset.
This case happens when you use a subtitle appended to every page's title.
If the subtitle changes, the titles of many pages change at once; and if you do that frequently, the search engines can never settle on an evaluation.
The link structure of the website is a decisive factor in analysing it, and links are also used to analyse the theme of each page.
Therefore you should fix the structure when you launch the site, and keep the following as stable as you can:
- The breadcrumb list's anchors for root categories
- The main menu
- The structure of blog categories
Sometimes you have to update them, but reduce those occasions as much as you can.
However, the following operations seem to be safe:
- Adding a new blog category
- Changing links on frequently updated pages, such as the top page and blog category pages
To follow this rule, you have to organize your topics and plan how your website will grow. The moment you plan to start a website is a very important time.
That is why I recommend using a mind map for this planning.
The Fetch as Google function is useful when you want to invite the crawler to a specific page. But when we use it, it seems to create a special index entry for the directly crawled page, and the analysed state is reset.
Therefore, if you use it for every page, the state of the whole site seems to be reset. In my experience, when I did that, the rankings for all keywords dropped after a few days.
The situations where you should use it are the following:
- A new page has been added to the site
- The content of a page is almost completely different from the previous version
These are the situations where the analysis can start from zero. In all other cases, avoid using it as much as you can.