Any site where we use Bedrock/Sage appears to return a 404 Not Found when testing for a robots.txt.
All sites in question are set to PRODUCTION, and we can load the URL directly in a browser.
But when we use any external tool or service (Google's robots.txt Tester, curl) to fetch the robots.txt URL, it returns a 404 Not Found.
It looks like robots.txt is generated dynamically via WordPress/Bedrock routing rather than served as a static file.
The only way we seem to be able to resolve it is by hard-coding a static robots.txt file into the public root (web/).
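For reference, the workaround we're using is just a static file dropped into Bedrock's public docroot. A minimal sketch (the path assumes a standard Bedrock layout where `web/` is the docroot, and the Disallow rule shown is only a placeholder, not our actual policy):

```shell
# Place a static robots.txt in Bedrock's public docroot (web/),
# so the web server serves it directly instead of handing the
# request to WordPress's dynamic robots.txt route.
mkdir -p web
cat > web/robots.txt <<'EOF'
User-agent: *
Disallow:
EOF
```

Once the static file exists, the server returns it with a 200 and the dynamic route is never hit — which is why this "fixes" the 404, but we'd rather understand why the dynamic route fails for external fetchers in the first place.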
Hope you can help
Thanks
Ross