Ed Andersen

Software Developer and Architect in Japan

Adding a Dynamic Robots.txt file to an ASP.NET MVC site




A robots.txt file tells search engines which parts of your site to index and, more importantly, which parts not to. If you have a public-facing staging or preliminary site that you don’t want showing up in Google results, you need to make sure it returns a robots.txt containing the

Disallow: /

line to prevent indexing. Manually adding different robots.txt files to staging and production environments is error-prone, and can be improved with the process below – the same code can serve up a locked-down robots.txt on staging or internal URLs, and allow indexing in production.

First, add a route that responds to /robots.txt in either Global.asax.cs or RouteConfig.cs, before your Default routes:

    routes.MapRoute(
        name: "Robots.txt",
        url: "robots.txt",
        defaults: new
        {
            controller = "Robots",
            action = "RobotsText"
        });

You’ll also need to make sure that runAllManagedModulesForAllRequests is true in web.config, as text files normally bypass the ASP.NET pipeline:

    <modules runAllManagedModulesForAllRequests="true"></modules>
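For context, the modules element lives under system.webServer; a minimal sketch of the placement (surrounding sections elided):

    <configuration>
      <system.webServer>
        <!-- Routes *.txt (and all other requests) through the managed pipeline,
             so the MVC route for /robots.txt gets a chance to run -->
        <modules runAllManagedModulesForAllRequests="true"></modules>
      </system.webServer>
    </configuration>

Note that this setting routes every request through the managed pipeline, which carries a small performance cost for static files.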

Then create a new controller called “RobotsController” with a single action, “RobotsText”. All requests to /robots.txt will go here:

    using System.Text;
    using System.Web.Mvc;

    public class RobotsController : Controller
    {
        public FileContentResult RobotsText()
        {
            var contentBuilder = new StringBuilder();
            contentBuilder.AppendLine("User-agent: *");

            // change this to however you want to detect a production URL
            var isProductionUrl = Request.Url != null && !Request.Url.ToString().ToLowerInvariant().Contains("elasticbeanstalk");

            if (isProductionUrl)
            {
                contentBuilder.AppendLine("Disallow: /elmah.axd");
                contentBuilder.AppendLine("Disallow: /admin");
                contentBuilder.AppendLine("Disallow: /Admin");
                contentBuilder.AppendLine("Sitemap: http://www.mysite.com/sitemap.xml");
            }
            else
            {
                // lock down the whole site on staging/internal URLs
                contentBuilder.AppendLine("Disallow: /");
            }

            return File(Encoding.UTF8.GetBytes(contentBuilder.ToString()), "text/plain");
        }
    }
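To make the two outcomes concrete: a staging or internal host should receive a fully locked-down file,

    User-agent: *
    Disallow: /

while a production host receives the indexable version, with only the selective Disallow lines and the Sitemap reference.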


You can try a number of ways of detecting a production environment, from the naïve URL checking above to environment variables in your application container.
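As one alternative, here is a sketch of an environment-variable check. The variable name "APP_ENVIRONMENT" is a hypothetical example – use whatever your deployment configuration (e.g. Elastic Beanstalk environment properties) actually sets:

    // Prefer an explicit environment variable over URL sniffing.
    // "APP_ENVIRONMENT" is a hypothetical name - set it in your
    // container or hosting environment's configuration.
    var environmentName = Environment.GetEnvironmentVariable("APP_ENVIRONMENT");
    var isProductionUrl = string.Equals(
        environmentName, "Production", StringComparison.OrdinalIgnoreCase);

This avoids surprises if a production URL ever happens to contain the substring you are checking for, and keeps the environment decision in deployment configuration rather than code.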


About me

Hi! 👋 I’m a Software Developer, Architect and Consultant living in Japan, building web and cloud apps. Here I write about software development and other things I find interesting. Read more about my background.



6 responses to “Adding a Dynamic Robots.txt file to an ASP.NET MVC site”

  1. Thanks, that is the best solution I have found.

  2. Great article man! Thanks.
    Just a question: when you set the “isProductionUrl” variable, did you mean && Request.Url.ToString().ToLowerInvariant().Contains(“elasticbeanstalk”);? Because with the ! test, you’re not going to be in production mode. Isn’t that right?

  3. Thank you for this post!!! I was struggling to get this working in an MVC project, and had everything set up in the route config and web config, but it still wasn’t working. This was the only article I found that had the following hint: First, add a route that responds to /robots.txt in either Global.asax.cs or RouteConfig.cs before your Default routes:

    If it wasn’t for that, I’d still be scratching my head. Thanks again!

  4. Just what I needed! Way easier than trying to make a deployment-time decision on which robots.txt file to upload, like I was trying to do before. Thanks so much.

  5. This method can generate duplicate content in the robots.txt file.

  6. I’ve got one question only: “var isProductionUrl = Request.Url != null && !Request.Url.ToString().ToLowerInvariant().Contains(“elasticbeanstalk”);” – what does it mean? I’m kinda new, so I wonder: should I edit this code for my website or leave it like it is?
