Robots.txt file changes

Hi - just trying to track down/verify something: I have a production site that was having some issues with fallback images for Twitter cards because the image paths were relative - no problem there, I fixed the paths to include the full domain URL.

Having updated Trellis and WP (Bedrock) and pushed to staging, I thought I’d better validate my cards one last time - the validator returns “ERROR: Fetching the page failed because it’s denied by robots.txt.” Oh no - and only a few minutes before I leave for the day!

I tested live and staging and the robots.txt files are different.

Production:

    User-agent: *
    Disallow: /wp/wp-admin/
    Allow: /wp/wp-admin/admin-ajax.php

Staging:

    User-agent: *
    Disallow: /

Where in the chain has this change happened - WP 4.8.2, or the most recent update to Trellis (H5BP?)? I don’t have a robots.txt in the repo.

How can I fix this before/after deploying to the production server, and keep settings that will work in future?

Thanks.

Any environment that is not production disables indexing:
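For reference, as far as I can tell the piece responsible isn’t Trellis itself but the disallow-indexing mu-plugin that ships with Bedrock in web/app/mu-plugins/. Paraphrased from memory, so treat this as a sketch rather than the exact file:

    <?php
    /*
    Plugin Name:  Disallow Indexing
    Description:  Disallow indexing of your site on non-production environments.
    */

    // Do nothing on production, so the normal robots.txt rules apply there.
    if (!defined('WP_ENV') || WP_ENV === 'production') {
        return;
    }

    // Force the "discourage search engines" option on; WordPress's virtual
    // /robots.txt then responds with "Disallow: /".
    add_action('pre_option_blog_public', '__return_zero');

With blog_public forced to 0, WordPress’s dynamically generated /robots.txt returns Disallow: /, which matches what you see on staging. On production the option is left alone, so you get the usual Disallow: /wp/wp-admin/ and admin-ajax.php rules.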

Thanks - I did think so - just odd that it had been working on staging prior to my latest update.

Thanks for the clarification

This plugin is not working for me; my staging site still shows the status:
“Robots.txt file. Page crawling is allowed in robots.txt file.”

I’m using Trellis with my custom theme.