Creating a dynamic robots.txt for Ruby on Rails

With a robots.txt file you can control which pages bots are allowed to crawl. You can allow or disallow single pages or whole folders.
A simple, static solution would be to put a robots.txt file into the /public folder of your Rails app... but then you can't set the content of this file dynamically.

If you want a different file for your staging and production servers, or want some dynamic routes in your robots.txt, then you need to generate this file with Rails.

First, we need a route that matches /robots.txt:

config/routes.rb

get '/robots.:format' => 'pages#robots'
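
If you want to be strict about the extension, an optional variant (an untested sketch, using the same controller and action as above) is to hard-code .txt in the path and set the format explicitly, so only /robots.txt reaches the action:

config/routes.rb

# Alternative: literal path plus an explicit format default
get '/robots.txt' => 'pages#robots', defaults: { format: 'text' }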

Then we want the controller to respond only to the .txt format:

app/controllers/pages_controller.rb

def robots
  # Accept only the text format and reject everything else
  respond_to :text
  # Let clients and proxies cache the file for six hours
  expires_in 6.hours, public: true
end
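
If you prefer the more common block form of respond_to, the same action can be written like this (it should behave the same, just a bit more explicit):

def robots
  expires_in 6.hours, public: true
  respond_to do |format|
    # Renders app/views/pages/robots.text.erb for .txt requests
    format.text
  end
end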

Now we can create the view and customize the content of the robots.txt file:

app/views/pages/robots.text.erb

<% if Rails.env.production? %>
User-Agent: *
Allow: /
Disallow: /admin
Sitemap: http://www.yourdomain.com/sitemap.xml
<% else %>
User-Agent: *
Disallow: /
<% end %>
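
Because the file is rendered through ERB, you can also build the dynamic routes mentioned at the beginning with Rails URL helpers instead of hard-coded paths. A small sketch, assuming your application defines an admin_path helper and a sitemap_url route (adjust the helper names to your app):

Disallow: <%= admin_path %>
Sitemap: <%= sitemap_url %>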

Be sure to delete the old robots.txt file from the /public directory, otherwise it will be served as a static file before the request ever reaches your Rails route.


