
How to route to robots.txt?

I'm told by various sources that /robots.txt does not exist.

I do have a file in public/robots.txt

User-agent: *
Allow: /

How can I make the route work so that the error goes away and Google can properly crawl the site?


Ensure that the following line is present in your config/environments/production.rb:

config.public_file_server.enabled = ENV['RAILS_SERVE_STATIC_FILES'].present?
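With that setting, Rails only serves files from public/ when the RAILS_SERVE_STATIC_FILES environment variable is set, so make sure it is set in your production environment. If you are in an environment where the web server (or a proxy) handles public/ and you still want Rails itself to answer /robots.txt, one alternative is an explicit Rack endpoint in routes.rb. This is a sketch under the assumption that robots.txt lives at the default public/robots.txt path:

```ruby
# config/routes.rb — explicit route as a fallback to static file serving.
# Assumes the file exists at Rails.root/public/robots.txt.
Rails.application.routes.draw do
  get "/robots.txt", to: ->(env) {
    body = Rails.root.join("public", "robots.txt").read
    # Return a plain Rack response triple: status, headers, body.
    [200, { "Content-Type" => "text/plain" }, [body]]
  }
end
```

Either approach should work; the static-file setting is the simpler one if you control the environment variables, since it avoids routing the request through the Rails stack at all.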