What is robots.txt?
The robots.txt file implements the Robots Exclusion Protocol, a standard that tells web crawlers which parts of a website they may or may not access. It is a plain text file, important for SEO, that contains directives telling search-engine indexing robots which pages can and cannot be crawled. To create a custom, SEO-friendly robots.txt file for a Blogger blog that works with search engines such as Google, Bing, Yandex, and others, we specify User-agent, Allow, Disallow, and Sitemap directives in the file. Let's have a look at what each of these terms means. For page-level control over how search-engine crawling bots index individual blog posts and pages, robots meta tags are used instead.
How to create a perfect custom robots.txt file for a Blogger blog?
Follow these steps to add a custom robots.txt file to a Blogger/Blogspot blog:
1). Log in to Blogger
2). Click on the Settings option
3). Scroll down to the "Crawlers and indexing" section
4). Enable custom robots.txt with the toggle switch
5). Click the "Custom robots.txt" option
6). A robots.txt text box will open
7). Paste in your custom robots.txt content
8). Click the Update button
The robots.txt file for a Blogger blog is given below; replace "www.example.com" with your own domain name.
User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml
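As a quick sanity check, you can test how a crawler interprets rules like these with Python's standard urllib.robotparser module. Note that this standard-library parser matches paths by simple prefix and does not implement the Google-style "*" wildcards used in the file above, so the sketch below rewrites the wildcard rules as plain prefixes; the URLs are illustrative, not from the original article.

```python
from urllib.robotparser import RobotFileParser

# Prefix form of the Blogger rules: /search blocks label/search pages,
# /20 blocks the date-archive paths (e.g. /2023/...).
rules = """\
User-agent: *
Disallow: /search
Disallow: /20
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label/search pages are blocked for all crawlers:
print(parser.can_fetch("*", "https://www.example.com/search/label/seo"))
# Static pages outside the disallowed prefixes remain crawlable:
print(parser.can_fetch("*", "https://www.example.com/p/about.html"))
```

Running this prints False for the label page and True for the static page, which matches the intent of the Blogger rules: archive and search listings are kept out of the index while real content stays crawlable.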