As you know, configuring robots.txt is important for any online store working on its site's SEO.
You can configure this option from the Magento 2.x admin by following the path below.
Go to Stores -> Configuration -> General -> Design -> Search Engine Robots -> Default Robots, then select one of the following:
- INDEX, FOLLOW: Instructs web crawlers to index the site and to check back later for changes.
- NOINDEX, FOLLOW: Instructs web crawlers to avoid indexing the site, but to check back later for changes.
- INDEX, NOFOLLOW: Instructs web crawlers to index the site once, but not to check back later for changes.
- NOINDEX, NOFOLLOW: Instructs web crawlers to avoid indexing the site and not to check back later for changes.
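To see what this setting does in practice: the Default Robots value is rendered as a robots meta tag in the `<head>` of every page. As a hedged sketch (the `robots_meta_tag` helper below is hypothetical, not part of Magento), the rendered tag looks like this:

```python
# Hypothetical helper illustrating the meta tag Magento renders for the
# "Default Robots" setting; the set of valid values comes from the options above.
VALID = {"INDEX,FOLLOW", "NOINDEX,FOLLOW", "INDEX,NOFOLLOW", "NOINDEX,NOFOLLOW"}

def robots_meta_tag(setting: str) -> str:
    """Return the robots meta tag for a Default Robots setting."""
    setting = setting.upper().replace(" ", "")
    if setting not in VALID:
        raise ValueError(f"unknown robots setting: {setting}")
    return f'<meta name="robots" content="{setting}"/>'

print(robots_meta_tag("NOINDEX, NOFOLLOW"))
# <meta name="robots" content="NOINDEX,NOFOLLOW"/>
```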
You can also provide custom instructions instead of the default ones. For example:
Allow full access:

```
User-agent: *
Disallow:
```
Disallow access to all folders:

```
User-agent: *
Disallow: /
```
Default instructions:

```
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
```
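A quick way to sanity-check rules like these is Python's built-in `urllib.robotparser`. Two caveats: the parser requires a `User-agent` line (added below as an assumption), and it does not implement wildcard rules such as `/*.php$` or `/*SID=`, so only the plain path prefixes are checked here:

```python
from urllib.robotparser import RobotFileParser

# A subset of the default instructions above; a "User-agent: *" line is
# prepended because the parser ignores rules without one, and the wildcard
# rules (/*.php$, /*SID=) are omitted since the stdlib parser can't match them.
rules = """\
User-agent: *
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/var/cache/page"))   # False: /var/ is disallowed
print(rp.can_fetch("*", "https://example.com/women/tops.html"))  # True: no rule covers it
```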
When complete, click Save Config.
If you have any questions about configuring the robots.txt file, please write in the comments.
Enjoy Magento 2!