The entries in the robots.txt can be changed by going to the plugin’s admin page.
I appreciate you responding to help. When I go to what I believe is the plugin's settings page, I see this in the box:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/images/
Disallow: /trackback/
Disallow: /wp-login.php
Disallow: /wp-register.php
However, when I go to the /robots.txt file I see this:
User-agent: *
Crawl-Delay: 20
#Begin Attracta SEO Tools Sitemap. Do not remove
sitemap: http://cdn.attracta.com/sitemap/6184370.xml.gz
#End Attracta SEO Tools Sitemap. Do not remove
So it appears your plugin’s robots.txt file is the one that is actually loading, but the box in WordPress shown above displays something very different. And in the file the plugin appears to be serving, I see nothing blocked and no sitemap – only that cdn.attracta.com link, which appears to have nothing to do with my website.
My plugin doesn’t add “Begin Attracta SEO Tools Sitemap”. It seems to me that you have another plugin generating a robots.txt, or a cached robots.txt.
My plugin puts this at the top of the robots.txt file when you view it in a browser:
# This virtual robots.txt file was created by the Virtual Robots.txt WordPress plugin: https://www.wordpress.org/plugins/pc-robotstxt/
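For anyone else debugging a similar mismatch: one quick way to see which tool is actually serving your /robots.txt is to fetch the live file and look for these marker comments. A minimal Python sketch, assuming the marker strings quoted above; `example.com` is a placeholder, not any site from this thread:

```python
# Sketch: fetch a site's live /robots.txt and guess which source generated it,
# based on the marker comments mentioned in this thread.
from urllib.request import urlopen

# Marker strings quoted earlier in the thread.
PLUGIN_MARKER = "created by the Virtual Robots.txt WordPress plugin"
ATTRACTA_MARKER = "Attracta SEO Tools Sitemap"

def identify_source(robots_txt: str) -> str:
    """Return a rough guess at which tool generated this robots.txt text."""
    if PLUGIN_MARKER in robots_txt:
        return "virtual-robots-txt-plugin"
    if ATTRACTA_MARKER in robots_txt:
        return "attracta"
    return "unknown"

def fetch_robots(site: str) -> str:
    """Fetch the live robots.txt for a domain (hits the network)."""
    with urlopen(f"https://{site}/robots.txt", timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Replace example.com with your own domain before running.
    print(identify_source(fetch_robots("example.com")))
```

If this reports something other than the plugin, another plugin, a physical robots.txt file on disk, or a cached copy (at the host or a CDN) is answering the request instead.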
Well heck, haha – thank you for clearing that up. I’ll edit my review 🙂