I recently ran into an issue while trying to serve robots.txt dynamically from Rails, i.e., with a route like `get "/robots.txt" => "welcome#robots"`. The reason for this is that I want to serve one version of the file in staging and a different version in production. So, I implemented this approach and removed the default Rails robots.txt from my application's public folder, only to find that a robots.txt was still being picked up from somewhere. I determined that the culprit was the rails_admin gem.
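For context, the per-environment logic behind such a controller action can be sketched in plain Ruby. The helper name and the exact directives below are hypothetical; in a Rails app the action would call something like this and `render plain: robots_txt_for(Rails.env), content_type: "text/plain"`:

```ruby
# Hypothetical helper: choose robots.txt content by environment name.
def robots_txt_for(env)
  if env == "production"
    # Production: allow all crawlers.
    "User-agent: *\nDisallow:\n"
  else
    # Staging (and anything else): block all crawlers.
    "User-agent: *\nDisallow: /\n"
  end
end
```

This keeps the environment decision in one place, so adding another environment later only means another branch here rather than another static file.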
I've searched for a way to prevent the gem's robots.txt from being picked up without having to modify rails_admin, but have come up short. Since the robots.txt file in the gem serves no purpose, and since it conflicts with the host Rails application, it doesn't seem that it would hurt to remove it. However, let me know if you disagree or if you know of a different way to resolve this issue.
Removing robots.txt from public: it gets picked up when you remove robots.txt from your own Rails app, which prevents you from serving robots.txt via a Rails controller action.