How do I edit the robots.txt file when I'm using Confluence Cloud? I need Google to crawl my one and only public Confluence space (it has anonymous access enabled), but I cannot get the site crawled. I want to explicitly set the space to "allow" in robots.txt...
Thx,
J
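For context, what I'd want to put in robots.txt is something like the sketch below. The space key `DOCS` is just a placeholder for my actual space key, and the paths assume the standard Confluence Cloud URL layout (`/wiki/spaces/<SPACEKEY>/`):

```
# Hypothetical robots.txt - Confluence Cloud does not expose this file for editing.
# Allow crawlers into the one public space, keep them out of the rest of the wiki.
User-agent: *
Allow: /wiki/spaces/DOCS/
Disallow: /wiki/
```

(Per the robots.txt rules Google follows, the more specific `Allow` path takes precedence over the broader `Disallow`.)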
Isn't this about external links rather than the Confluence pages themselves?
https://support.atlassian.com/confluence-cloud/docs/hide-external-links-from-search-engines/
@Jason Green This is not currently possible. You can vote for this feature change here: https://jira.atlassian.com/browse/CLOUD-5915
I would also suggest reaching out to Atlassian support to see if they can assist with search-engine indexing of your site.