Pages with the "Crawler" tag
The Blog Side Part Component displays the latest blog posts in the sidebar, or in a part after the body content.
Appearance of the Blog Side Part Component
The Blog Side Part is used in the templates of the Blog Top Page, the Blog Category Page, and the Top Page.
This component renders a list of new articles like the one below.
This part is an effective way to announce that a new article has been published. When a crawler fetches any page that contains this component, it reliably discovers the new blog entry through the link.
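A minimal sketch of the markup such a side part might render (the class names and URLs are illustrative assumptions, not the component's actual output):

```html
<!-- Hypothetical output of the Blog Side Part Component -->
<div class="blog-side-part">
  <h3>New Articles</h3>
  <ul>
    <li><a href="/blog/2015/03/new-article-2">New Article 2</a></li>
    <li><a href="/blog/2015/03/new-article-1">New Article 1</a></li>
  </ul>
</div>
```

Because every page that includes this component carries links to the newest posts, a crawler fetching any of those pages can follow a link straight to the new entry.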
Google watches sites on new domains for a certain period. The period ends after the web pages have been re-indexed a certain number of times.
Therefore, accelerating re-indexing is very important for new sites, in order to get through this period of almost no traffic from search engines.
Google does not trust new websites right away
Google officially says they do not trust new websites, and states the following:
- Google does not trust a new website right away
- There is a period during which the status of the website is checked
- This is like how people come to trust another person only after a period of talking and drinking together
- If the website is introduced by a trusted site, Google trusts that site
For most websites, it is very hard to get linked from highly trusted websites. If that were the only way, the websites appearing in the search results would all be ones linked from a few authority sites.
Therefore, I think there are other ways to earn trust, through content, instead of that.
Sometimes your ranking on Google drops even if you do white-hat SEO. The sandbox effect occurs when you add content or acquire new links.
It looks like a penalty, but it is not. You do not have to respond to it; just wait for re-indexing.
When the ranking drops
When Google re-evaluates a website, the ranking can drop dramatically. In most cases, the theme of the entire site is relevant to this.
In the following situation, the overall theme changes or is extended, and re-analysis of the website may occur.
A new category and pages with a new theme are added
When writing blog articles, you will sometimes extend the categories of the website in order to add new themes.
The number of indexed pages is one metric for measuring the strength of a website in SEO. But I think search engines prefer a smaller number of pages if the amount of information is the same.
Fewer pages mean a higher rate of deeply analysed pages in the index
The most important thing in SEO is to make search engines analyse your web pages deeply. If there are many pages but only a few of them are analysed after being indexed, traffic from search engines becomes very low.
The robot comes again and again
The analysis process is very heavy for search engines. Therefore, they do not analyse pages that seem useless.
In order to find pages worth analysing, the crawler visits the site again and again. If the number of pages is small, the chance of each page being evaluated becomes higher.
Therefore, you should reduce the number of pages as much as you can.
The sitemap page in Google Webmaster Tools has changed: most pages except the new blog posts are now indexed.
Today is the 4th day since I started this website on a new domain.
The sitemap page is the fastest indicator of indexing status
To find out whether newly created pages have been indexed, the sitemap page gives the most rapid information, next to the site: command.
I'm interested in how long it takes Google to analyse a website's contents, because I need to know this when checking the response to my SEO activities.
I opened a new site two days ago on a new domain. Today, 4 items of structured data are shown in Webmaster Tools.
My website currently has 43 pages. It had 39 pages when I opened it.
The detected pages are the ones for which I used "Fetch as Google". Therefore, it is a very effective way to make Google analyse pages rapidly.
Article is detected first
The detected structured data type is Article (schema.org), which is generated by the template engine of the Content Management System.
I started this website on March 7th, 2015, and the next day all of the pages were indexed by Google. Recently, indexing speed is faster than it used to be.
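For reference, schema.org Article data is often emitted as JSON-LD like the snippet below (the values are illustrative assumptions; the CMS in question may emit microdata attributes in the HTML instead):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Example blog post",
  "datePublished": "2015-03-08",
  "author": { "@type": "Person", "name": "Example Author" }
}
</script>
```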
About this website
This website, "www.creator-learning", uses a new domain. There was no previous owner.
The number of pages is 39. I tried not to create useless pages, and I set the blog archive pages to "noindex, nofollow".
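Setting "noindex, nofollow" is done with a standard robots meta tag in each archive page's head, shown here as a reminder:

```html
<meta name="robots" content="noindex, nofollow">
```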
The number of active pages is 39, and they are included in sitemap.xml.
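A minimal sitemap.xml listing the active pages follows the sitemaps.org protocol; the URLs below are illustrative placeholders, not this site's actual pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-03-07</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/blog/first-post</loc>
    <lastmod>2015-03-08</lastmod>
  </url>
</urlset>
```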
What I did after opening the site
After I opened the website, I registered the sitemap XML files with the following search engines' webmaster tools.
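Besides submitting through each webmaster tools interface, a sitemap can also be advertised in robots.txt, which most crawlers read (the URL is an illustrative placeholder):

```
Sitemap: http://www.example.com/sitemap.xml
```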
The Content Management System has a function to analyse Googlebot's access and the Webmaster Tools status. By checking them, you can monitor the actual health of your SEO.
Check crawler log
By checking the crawler log, you can check the SEO health of the website. The crawler visits the important pages; if it does not visit a page, that page has some problem.
Raw log data
Webmaster Tools shows statistical data about crawling, but only the number of accesses.
In fact, there are several kinds of crawlers. By watching the raw log data in the Content Management System, you can observe the actual crawling.
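As a minimal sketch of distinguishing crawler variants in a raw log, the script below counts requests per Googlebot variant from lines in the common "combined" log format. The sample log lines and file layout are invented for illustration; a real check would read your server's actual access log:

```python
import re
from collections import Counter

# Invented sample lines in Apache/Nginx "combined" log format.
LOG_LINES = [
    '66.249.66.1 - - [09/Mar/2015:10:00:00 +0000] "GET /blog/post-1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.2 - - [09/Mar/2015:10:05:00 +0000] "GET /img/logo.png HTTP/1.1" 200 2048 "-" '
    '"Googlebot-Image/1.0"',
    '203.0.113.5 - - [09/Mar/2015:10:06:00 +0000] "GET /blog/post-1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (Windows NT 6.1)"',
]

# The user agent is the last double-quoted field on each line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def crawler_hits(lines):
    """Count requests per Googlebot variant (Googlebot, Googlebot-Image, ...)."""
    counts = Counter()
    for line in lines:
        m = UA_RE.search(line)
        if not m:
            continue
        ua = m.group(1)
        if "Googlebot" in ua:
            variant = "Googlebot-Image" if "Googlebot-Image" in ua else "Googlebot"
            counts[variant] += 1
    return counts

print(crawler_hits(LOG_LINES))
```

Separating the variants this way shows, for example, whether the image crawler is fetching assets while the page crawler ignores an important article.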