Correct method for hiding a production site from search engines

Hey guys,

I've got a site where the client wants it hidden from search engines but kept live prior to launching their product.

I tried adding the following to config/application.php alongside the other custom settings:
Config::define('DISALLOW_INDEXING', true);
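
In case it matters, I'm aware Bedrock also supports per-environment config files, so I assume the production-only equivalent would look something like this (the file path is just my assumption based on Bedrock's standard layout, so correct me if this is the wrong place):

```php
<?php
// config/environments/production.php (assumed location)
// Disallow indexing for the production environment only, rather than
// defining it globally in config/application.php.

use Roots\WPConfig\Config;

Config::define('DISALLOW_INDEXING', true);
```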

However, the robots.txt file on production is still showing the following:

User-agent: *
Disallow: /wp-admin/
Disallow: /*s=*
Disallow: /xmlrpc.php
Allow: /wp-admin/admin-ajax.php

I've tested and can confirm that the DISALLOW_INDEXING constant shows as true on the site; however, the "discourage search engines" option under Settings → Reading was unticked and the robots.txt file remained as above. Ticking that option manually doesn't appear to change the robots.txt file either.
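
From digging around, my understanding is that the constant only does something if an mu-plugin hooks it into WordPress, roughly along these lines (the file name and exact code are my guess at how the Roots disallow-indexing helper works, not copied from it):

```php
<?php
// mu-plugins/disallow-indexing.php (file name is my guess)
// Rough sketch: when DISALLOW_INDEXING is true, force the blog_public
// option to 0, which is equivalent to ticking "Discourage search engines
// from indexing this site" under Settings → Reading.
if (defined('DISALLOW_INDEXING') && DISALLOW_INDEXING) {
    add_filter('pre_option_blog_public', '__return_zero');
}
```

If that's how it works, the robots.txt output above might actually be expected — I believe recent WordPress versions signal "noindex" via a robots meta tag/header rather than adding Disallow: / to robots.txt — but I'd appreciate confirmation.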

For now I've manually created a robots.txt file and thrown in a robots meta tag for good measure, but what is the correct process for this?
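
In case it's useful, I assume the filter-based equivalent of that stopgap (instead of a physical robots.txt file) would be something like the below — robots_txt and wp_robots are standard WordPress hooks, the snippet itself is just my sketch:

```php
<?php
// Stopgap: block everything in the generated robots.txt and emit a
// noindex robots meta tag (wp_robots API, WordPress 5.7+) whenever
// DISALLOW_INDEXING is set.
add_filter('robots_txt', function ($output, $public) {
    if (defined('DISALLOW_INDEXING') && DISALLOW_INDEXING) {
        return "User-agent: *\nDisallow: /\n";
    }
    return $output;
}, 10, 2);

add_filter('wp_robots', 'wp_robots_no_robots');
```

Is something along those lines reasonable, or is the DISALLOW_INDEXING constant supposed to cover all of this on its own?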

Thanks