Ruby Question

How to route to robots.txt?

I'm told by various sources that /robots.txt does not exist.

I do have a file in public/robots.txt

User-agent: *
Allow: /

How can I make the route work so that the error goes away and google can properly crawl the site?

Answer

Ensure that the following line is present in your config/environments/production.rb:

config.public_file_server.enabled = ENV['RAILS_SERVE_STATIC_FILES'].present?
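With this setting, Rails serves files from public/ only when the RAILS_SERVE_STATIC_FILES environment variable is set (some hosts, such as Heroku, set it for you). If you would rather not depend on the static file server, robots.txt can also be served through an ordinary Rails route. A minimal sketch, assuming a hypothetical `pages` controller (the controller and action names are illustrative, not from the original answer):

```ruby
# config/routes.rb -- send requests for /robots.txt to a controller action
Rails.application.routes.draw do
  get "/robots.txt", to: "pages#robots"
end

# app/controllers/pages_controller.rb -- hypothetical controller name
class PagesController < ApplicationController
  # Read the file from public/ and serve it with a plain-text MIME type,
  # so crawlers get a 200 response even when static file serving is off.
  def robots
    render plain: Rails.root.join("public", "robots.txt").read,
           content_type: "text/plain"
  end
end
```

Either approach makes GET /robots.txt return 200 instead of a routing error; the static file server is simpler, while the route gives you a hook for serving different rules per environment.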