The robots exclusion standard (robots.txt) is used by websites to tell web crawlers and other web robots which areas of the website should not be processed or scanned. Web robots are often used by search engines to categorize websites.
To exclude directories and a single file on your site, add the following lines to robots.txt (the User-agent line is required and applies the rules to all crawlers):

User-agent: *
Disallow: /search/
Disallow: /login/
Disallow: /file.aspx
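To confirm the rules behave as intended, you can parse them with Python's standard-library urllib.robotparser, which implements the robots exclusion standard. This is a minimal sketch; example.com and the tested paths are placeholders, not part of the original page:

```python
import urllib.robotparser

# Hypothetical robots.txt content matching the rules above.
rules = """\
User-agent: *
Disallow: /search/
Disallow: /login/
Disallow: /file.aspx
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths under the disallowed directories are blocked for all crawlers.
print(rp.can_fetch("*", "https://example.com/search/"))  # False
print(rp.can_fetch("*", "https://example.com/file.aspx"))  # False
# Paths not matched by any Disallow rule remain crawlable.
print(rp.can_fetch("*", "https://example.com/about/"))  # True
```

Well-behaved crawlers fetch /robots.txt before crawling and apply these rules the same way, so this check mirrors how a search engine would treat the excluded paths.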