In addition, if configured correctly, it makes it easier for search engines to crawl all of the blog's content. As a result, each article we publish gets indexed by search engines more quickly.
Custom robots.txt settings on Blogspot must be configured correctly so that search engines can tell which pages may be crawled and which may not. Mistakes in the robots.txt settings can cause blog articles to be ignored or even disappear from Google search results.
For those who don't yet know how to set up robots.txt, please follow the guide below. It describes in full how to configure robots.txt as well as the custom robots header tags on Blogspot. Read it carefully.
The correct way to set robots.txt on Blogspot
How to set robots.txt
When we first create a blog, robots.txt is handled automatically with the default settings. Keep in mind that the default is the safest setting for the blog, because it is managed directly by Blogger, which is run by Google. However, if you want to enable your own settings, follow the steps below.
1. Go to the Blogger dashboard
a. Click Settings
b. Scroll down to Crawlers and indexing
c. Find Custom robots.txt
d. Click to edit it, then select Yes to enable it
2. After that, copy the robots.txt rules below into the editor.
User-agent: *
Allow: /
Disallow: /search
Sitemap: https://www.yourblog.blogspot.com/sitemap.xml
3. Click Save changes. Done. (You can verify the result with the quick check sketched below.)
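As a quick sanity check, you can fetch the robots.txt file that Blogger serves and make sure your custom rules appear in it. The following is a minimal Python sketch; the address www.yourblog.blogspot.com is only a placeholder for your own blog URL.

import urllib.request

# Placeholder address; replace it with your own blog's URL.
BLOG_URL = "https://www.yourblog.blogspot.com"

# Fetch and print the robots.txt that Blogger is currently serving,
# so you can confirm the custom rules were saved.
with urllib.request.urlopen(BLOG_URL + "/robots.txt") as response:
    print(response.read().decode("utf-8"))

If the output shows the rules you entered, the custom robots.txt is active.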
Apart from that, we can also use the Disallow setting for pages that we do not want search engine robots to crawl, for example the About and Contact pages. In that case, set the custom robots.txt like the following:
User-agent: *
Allow: /
Disallow: /search
Disallow: /p/about.html
Disallow: /p/contact.html
Sitemap: https://www.yourblog.blogspot.com/sitemap.xml
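If you want to double-check how those rules behave before saving them, here is a rough sketch using Python's built-in robots.txt parser. Note that Python's parser applies the first matching rule, so the Disallow lines are listed before Allow purely for this check (crawlers such as Googlebot prefer the most specific rule, so the order in your Blogger settings does not matter); the post path below is made up for illustration.

from urllib.robotparser import RobotFileParser

# The Allow/Disallow rules from above, with Disallow lines first because
# Python's parser uses the first matching rule.
rules = """User-agent: *
Disallow: /search
Disallow: /p/about.html
Disallow: /p/contact.html
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A made-up post URL stays crawlable...
print(parser.can_fetch("*", "/2024/01/example-post.html"))  # True
# ...while search results and the static pages are blocked.
print(parser.can_fetch("*", "/search?q=test"))   # False
print(parser.can_fetch("*", "/p/about.html"))    # False
print(parser.can_fetch("*", "/p/contact.html"))  # False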