My website is often down because a spider is accessing too many resources; this is what my hosting provider told me. They told me to ban these IP addresses: 22.214.171.124 126.96.36.199 188.8.131.52
But I have no idea how to do this.
I've googled a bit, and I've now added these lines to the .htaccess file in the root:
# allow all except those indicated here
allow from all
deny from 184.108.40.206
deny from 220.127.116.11
deny from 18.104.22.168
Is this 100% correct? What else could I do? Please help me; I really have no idea what I should do.
Best How To:
Based on these Project Honey Pot reports:
https://www.projecthoneypot.org/ip_22.214.171.124 https://www.projecthoneypot.org/ip_126.96.36.199 https://www.projecthoneypot.org/ip_188.8.131.52
it looks like the bot is http://www.semrush.com/bot.html
If that's actually the robot, their page says:
"To remove our bot from crawling your site simply insert the following lines to your …"
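The quoted instruction is cut off, but it presumably refers to robots.txt. Assuming the bot's User-agent token is SemrushBot (as its name suggests), the rule would look something like this:

```
User-agent: SemrushBot
Disallow: /
```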
Of course, that does not guarantee that the bot will obey the rules. You can block it in several ways; .htaccess is one, just as you did.
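One detail about your snippet: with Apache 2.2-style access directives, the Order directive decides whether Allow or Deny wins, so it should be stated explicitly. A safer version of your rules would be (a sketch, assuming Apache 2.2 mod_authz_host syntax):

```apache
# With "Order Allow,Deny", Deny directives take precedence over Allow
Order Allow,Deny
Allow from all
Deny from 184.108.40.206
Deny from 220.127.116.11
Deny from 18.104.22.168
```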
You can also use this little trick: deny ANY IP address whose User-Agent string contains "SemrushBot":
# no "^" anchor: the real User-Agent contains, but does not start with, "SemrushBot"
SetEnvIfNoCase User-Agent "SemrushBot" bad_user
SetEnvIfNoCase User-Agent "WhateverElseBadUserAgentHere" bad_user
Order Allow,Deny
Allow from all
Deny from env=bad_user
This way you will also block other IPs that the bot may use.
See more on blocking by User-Agent string: http://stackoverflow.com/a/7372572/953684
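For completeness, the same block can be done with mod_rewrite (a sketch, assuming mod_rewrite is enabled; the [F] flag returns 403 Forbidden):

```apache
RewriteEngine On
# case-insensitive match anywhere in the User-Agent string
RewriteCond %{HTTP_USER_AGENT} SemrushBot [NC]
RewriteRule .* - [F,L]
```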
I should add that if a spider can take your site down, it usually means you have a badly written script or a very weak server.