Something I had to do a while back, and have had to repeat recently when setting up new servers, is configuring IIS to run robots.txt files as if they were dynamic pages. I need this because I run a single code base that serves different languages depending on the domain being called. The process is, in theory, quite straightforward, but it got a little fiddly because I worked on it late in the evening, so I thought I'd log the standard process as well as the problem I encountered.
NB: I've included ColdFusion in the title because that's the language I was using on the site in question. It's not actually a ColdFusion-specific process, but it should work fine on any IIS-based CF system.
First, set up the IIS server so that it will run your robots.txt dynamically:
By default IIS treats .txt files as static and simply renders their text when they are requested. We want to tell IIS to parse them the same way it parses .asp files. Note there's a handy example of how to parse robots.txt files with ASP.net here.
Get the path to the ASP engine
This bit is adapted from the ASP.net article mentioned above.
- Open IIS, right-click your website, and bring up the Properties screen
- Go to Home Directory > Configuration. You should land on the Mappings tab.
- Locate the .asp item and click Edit. Copy the path in the Executable field, then cancel out of that window. Make no changes here; you just need the path to the executable.
- Now, check that a .txt extension has not already been defined (if it has, you should find out who made that change and why before altering it).
- If it has not been defined, click 'Add', paste the path into the 'Executable' field, and enter '.txt' into the 'Extension' field. In the 'Verbs' field you should limit the verbs to 'POST,GET'. I haven't investigated the security implications of this, but limiting by default is a good idea IMO. Both 'Script engine' and 'Verify that file exists' should be checked.
- If it has been defined and you don't mind changing it, skip the 'Add' step: click on the extension, click 'Edit', and make the changes as above.
- Once you've saved, 'OK out' of the configuration section.
Double check that Active Server Pages are enabled
This is the bit that got me the other night: working on a new install, there was no reason for Active Server Pages to have been enabled, so when I was testing, nothing happened.
In IIS again, this time click on Web Service Extensions (below the Web Sites folder); a list of extensions should appear on the right. Active Server Pages should be one of them, and it should be set to 'Allowed'. If not, make it so!
Now, restart IIS and test it all works
Assuming you have done nothing yet to your old robots.txt (and it had something in it), when you now browse to it you should get an unformatted dump of the text it contains. This means IIS is now using Active Server Pages to parse the request, but there is no .asp code to execute, so it just renders the text.
From here you need to put some .asp logic in place so that it loads the appropriate information.
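As a rough illustration, the logic could look something like the following classic ASP sketch. The domain names and disallow rules here are purely hypothetical examples; swap in whatever per-domain rules your site actually needs. The key points are setting the content type back to plain text and branching on the requested host:

```asp
<%@ Language="VBScript" %>
<%
' robots.txt served dynamically: vary the rules by the domain requested.
' Serve as plain text so crawlers treat it as a normal robots file.
Response.ContentType = "text/plain"

Dim host
host = LCase(Request.ServerVariables("HTTP_HOST"))

' Hypothetical domains and paths for illustration only.
Select Case host
    Case "www.example.fr"
        Response.Write "User-agent: *" & vbCrLf
        Response.Write "Disallow: /en/" & vbCrLf
    Case Else
        Response.Write "User-agent: *" & vbCrLf
        Response.Write "Disallow: /fr/" & vbCrLf
End Select
%>
```

Because the file is still named robots.txt, crawlers request it at the usual location; the mapping set up above is what makes IIS hand it to asp.dll instead of serving it verbatim.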
If you get a 404 error
It probably means you have not allowed ASP pages to execute; refer to the instructions above. Remember to restart IIS again after doing this.