Not being indexed can sometimes be useful, particularly if you don't need search visibility, want to optimize your website's performance, or simply want to limit access to a select group of people.
Until now, that was something you could do on our platform, but only as an all-or-nothing setting: you could either allow all crawlers or block them all.
Now, you have much more control over how your robots.txt file works.
Say you want to be indexed by Google's crawlers, but not by others. Or you're fine with most crawlers, but don't want AI models trained on your content.
Well, now you can! Short.io supports granular permissions that let you control exactly which crawlers can and can't access your website.
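For example, a granular robots.txt that lets Googlebot index everything while blocking OpenAI's GPTBot (commonly used to gather AI training data) could look like the sketch below. The exact file Short.io generates for you may differ; this is just an illustration of the standard directives involved:

```
# Allow Google's crawler to index the whole site
User-agent: Googlebot
Allow: /

# Block OpenAI's GPTBot from crawling any page
User-agent: GPTBot
Disallow: /

# All other crawlers: allowed (empty Disallow means no restriction)
User-agent: *
Disallow:
```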
For additional flexibility, you can share a single robots.txt file across multiple domains, keeping the same configuration for all your websites, or create an individual configuration for each one.
Try it out!