Adding my own robots.txt
I asked about robots.txt, and you answered.
I only saw your answer in my email, because my post and your answer were unfortunately both deleted after I hijacked the thread. I am sorry about that.
This is your answer to my question about whether I could use my own robots.txt file in the server root (copy/pasted from my email):
Sybre Waaijer wrote:
Hello!
1. For most (read: 99%) of WordPress websites, what you want to do isn’t an issue. However, WordPress Multisite configurations can generate a different robots.txt for each subsite; once you upload a physical file, that feature can no longer work. For example, these two outputs are different but come from the exact same WordPress installation:
– https://premium.theseoframework.com/robots.txt
– https://theseoframework.com/robots.txt
2. Google ignores the crawl-delay directive in robots.txt; they figure out a suitable crawl rate automatically, but you can temporarily configure it via their website, see https://support.google.com/webmasters/answer/48620.
Bing does read the crawl-delay directive, but you can also configure Bing via their website.

If the 3,000–4,000 requests/day limit your server’s uptime, performance, or bandwidth, you may want to speak to your hosting partner about that; a well-configured server should be able to serve about 2–3 requests every second per thread without issue (source).
Still, 4000 requests daily from Google and Bing are a lot, so I assume the website is large.
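To illustrate point 1 from the quoted answer: WordPress only generates robots.txt dynamically when no physical file exists, because the web server serves a real file before the request ever reaches WordPress. A minimal sketch of how a Multisite installation could vary the dynamic output per subsite via WordPress core’s robots_txt filter (the subsite ID and the Disallow rule below are hypothetical examples, not anything from the quoted answer):

    <?php
    // Hypothetical sketch: append a per-subsite rule to the dynamic robots.txt.
    // This only runs when no physical robots.txt file exists in the server root.
    add_filter( 'robots_txt', function ( $output, $public ) {
        // get_current_blog_id() identifies which subsite is handling the request.
        if ( 2 === get_current_blog_id() ) {
            $output .= "\nUser-agent: *\nDisallow: /private-area/\n";
        }
        return $output;
    }, 10, 2 );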
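And for point 2, a crawl-delay rule in robots.txt looks like this (the 10-second value is only an example; bingbot honors it, while Googlebot ignores it and instead has its crawl rate configured through Google’s own tools, per the link in the quoted answer):

    User-agent: bingbot
    Crawl-delay: 10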
I notice you wrote (source) regarding what the server should be able to handle. I am using Cloudways / Vultr, and I am not really impressed; I think my smaller LiteSpeed server did a better job before I moved to Cloudways.
And I wondered if you could please share the link to that source once more, as this is something I’m interested in investigating.