According to CEO Tobi Lutke, Shopify store owners can now edit their store's robots.txt file, giving them more control over how search engines crawl their sites. Every Shopify store starts with the same default robots.txt, which the company says works for most sites, but the file can now be edited through the robots.txt.liquid theme template. Store owners can make the following edits to the robots.txt file:
- Allow or disallow certain URLs from being crawled
- Add crawl-delay rules for certain crawlers
- Add extra sitemap URLs
- Block certain crawlers
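In Liquid, these edits are typically made by looping over Shopify's `robots` object and appending directives alongside the defaults. The snippet below is a sketch based on Shopify's documented `robots.default_groups` structure; the `/search` path, crawl-delay value, sitemap URL, and "BadBot" crawler name are all illustrative placeholders:

```liquid
{%- comment -%} Sketch of a customized robots.txt.liquid.
    The "/search" path, delay value, sitemap URL, and "BadBot"
    name are placeholders, not Shopify recommendations. {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- comment -%} Output Shopify's automatically maintained rules {%- endcomment -%}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Disallow an extra URL and add a crawl delay
      for the wildcard user agent only {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /search' }}
    {{ 'Crawl-delay: 10' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

{%- comment -%} Extra sitemap and a blocked crawler {%- endcomment -%}
Sitemap: https://example.com/extra-sitemap.xml

User-agent: BadBot
Disallow: /
```

Because the loop over `robots.default_groups` is left intact, Shopify's own rules continue to appear and update automatically; the custom directives are simply layered on top.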
Shopify recommends using Liquid to add or remove directives from the robots.txt.liquid template, because Shopify can then continue to update the file automatically. Here are the steps to customize a Shopify store's robots.txt file:
1. From your Shopify admin, go to Online Store > Themes.
2. Click Actions, and then click Edit Code.
3. Click Add a new template, and then select robots.
4. Click Create template.
5. Make the changes that you want to make to the default template.
6. Save changes to the robots.txt.liquid file in your published theme.
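For reference, the default template created in the steps above is, per Shopify's documentation, essentially a loop that renders the platform-maintained rule groups; the sketch below may differ from the current default in exact whitespace control:

```liquid
{%- comment -%} Approximate default robots.txt.liquid: render every
    platform-maintained group, its rules, and its sitemap line. {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Customizations are made by editing inside this loop (or adding lines after it) rather than deleting it, which is what keeps the automatic updates intact.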
Site owners can also delete the template's contents entirely and replace them with plain-text rules, though Shopify doesn't recommend this because the file will no longer be updated automatically.
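A plain-text replacement would use the standard robots.txt format, as in the hypothetical example below; once the Liquid is removed, Shopify's automatically maintained rules and sitemap reference must be managed by hand (the path and URL here are placeholders):

```
User-agent: *
Disallow: /checkout
Sitemap: https://example.com/sitemap.xml
```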
Shopify cautions that editing the robots.txt.liquid file is an unsupported customization, which means its support team can't help with edits. "Incorrect use of the feature can result in loss of all traffic," says the platform. That said, the same is true of the robots.txt file on any site.