I know, but I have read that any "dings" on your site hurt your page's SEO ranking. And since the first thing the Google robot looks for on your site is robots.txt, not having one is a possible "ding" against your site.
I have also read that it does not matter whether you have a robots.txt or not. But I would rather be on the safe side and not incur any "dings", in order to keep my site ranking high.
Especially since a robots.txt can't hurt the site, and can only help it by having the sitemap listed there:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://primary-domain.com/sitemapindex.xml
For the reasons above, I would love to have a robots.txt for all my domains.
So does anybody know how to enable robots.txt for mapped domains?
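For what it's worth, here is one possible approach I came across, sketched under the assumption that the mapped domains run on a single WordPress install: when no physical robots.txt file exists, WordPress generates a virtual one, and its `robots_txt` filter lets you append lines to it. Since `home_url()` resolves to the mapped domain of the current site, each domain would get its own Sitemap line (the `/sitemapindex.xml` path is just the example from above; adjust it to whatever your sitemap plugin actually serves):

```php
// Sketch only: append a per-domain Sitemap line to WordPress's
// virtual robots.txt via the built-in 'robots_txt' filter.
// Assumes no physical robots.txt file exists in the web root,
// since a real file bypasses WordPress entirely.
add_filter( 'robots_txt', function ( $output, $public ) {
    // home_url() resolves to the mapped domain for the current site,
    // so each mapped domain advertises its own sitemap URL.
    $output .= 'Sitemap: ' . home_url( '/sitemapindex.xml' ) . "\n";
    return $output;
}, 10, 2 );
```

No idea if this is the recommended way for mapped domains specifically, but it avoids maintaining a separate static robots.txt per domain.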