Hi - just trying to track down and verify something: I have a production site that was having issues with fallback images for Twitter Cards because the image paths were relative - no problem there, I fixed the paths to include the full domain URL.
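For reference, the fix was along these lines (domain and image path here are placeholders, not my real values):

```html
<!-- Before: relative path, which the Card validator can't resolve -->
<meta name="twitter:image" content="/app/uploads/fallback.jpg">

<!-- After: absolute URL including scheme and domain -->
<meta name="twitter:image" content="https://example.com/app/uploads/fallback.jpg">
```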
Having updated Trellis and WordPress (Bedrock) and pushed to staging, I thought I'd better validate my cards one last time - the validator returns "ERROR: Fetching the page failed because it's denied by robots.txt." Oh no - and only a few minutes before I leave for the day!
I tested live and staging, and the robots.txt files are different.

Live:

```
User-agent: *
Disallow: /wp/wp-admin/
Allow: /wp/wp-admin/admin-ajax.php
```

Staging:

```
User-agent: *
Disallow: /
```
Where in the chain did this change happen - WP 4.8.2, or the most recent update to Trellis (the H5BP configs?)? I don't have a robots.txt in the repo.
How can I fix this before/after deploying to the production server, and retain settings that will keep working in the future?
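For what it's worth, my working theory: since there's no robots.txt file in the repo, both responses are presumably WordPress's virtual robots.txt, and staging serves `Disallow: /` because "Discourage search engines from indexing this site" (`blog_public = 0`) is set in that environment. Assuming Trellis's nginx config serves existing files before handing the request to WordPress (the usual `try_files` behaviour), committing a static file to Bedrock's web root should override the virtual one - something like:

```
# web/robots.txt, committed to the repo.
# Served as a static file, so it takes precedence over
# WordPress's generated (virtual) robots.txt.
User-agent: *
Disallow: /wp/wp-admin/
Allow: /wp/wp-admin/admin-ajax.php
```

Checking the reading setting on staging with WP-CLI (`wp option get blog_public` - `0` means indexing is discouraged) should confirm whether that setting is the source of the `Disallow: /`.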