I have an ASP.NET MVC 5.1 website. Some people have written bots to download the whole website and, worse than that, they don't stop. I was wondering how I can deny access to any action on the website if it's being accessed too fast. By "too fast" I mean a speed that is humanly impossible, for instance once every second. I think this is a common problem for any website. What is the recommended way to achieve such a restriction?
PS: My website is on a VPS and I have full access to IIS if there's a way to do this there.
Best How To:
If you use an HttpModule, you can filter each incoming request before it reaches any of your controller actions.
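Here's a minimal sketch of such a module, assuming you throttle per client IP with an in-memory table. The class name SimpleThrottleModule, the one-second threshold, and the static-file whitelist are all illustrative, not a production design:

```csharp
using System;
using System.Collections.Concurrent;
using System.Web;

// Illustrative sketch only: rejects any client IP that makes requests
// faster than once per second.
public class SimpleThrottleModule : IHttpModule
{
    // Last request time per client IP. In-memory, so it resets whenever
    // the app pool recycles and is not shared across servers.
    private static readonly ConcurrentDictionary<string, DateTime> LastSeen =
        new ConcurrentDictionary<string, DateTime>();

    private static readonly TimeSpan MinInterval = TimeSpan.FromSeconds(1);

    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        var request = app.Context.Request;

        // Skip static resources so throttling doesn't break images/CSS
        // (see the first caveat below).
        string path = request.Path.ToLowerInvariant();
        if (path.EndsWith(".css") || path.EndsWith(".js") ||
            path.EndsWith(".png") || path.EndsWith(".jpg") || path.EndsWith(".gif"))
        {
            return;
        }

        string ip = request.UserHostAddress ?? "unknown";
        DateTime now = DateTime.UtcNow;

        bool tooFast = false;
        LastSeen.AddOrUpdate(ip, now, (_, previous) =>
        {
            tooFast = now - previous < MinInterval;
            return now;
        });

        if (tooFast)
        {
            app.Context.Response.StatusCode = 429; // Too Many Requests
            app.Context.Response.End();            // short-circuit the pipeline
        }
    }

    public void Dispose() { }
}
```

Registration would go in web.config, with something like an `<add name="SimpleThrottle" type="SimpleThrottleModule" />` entry under `<system.webServer><modules>` for the IIS integrated pipeline. Note also that the dictionary never evicts old entries, so a real version would need periodic cleanup or a cache with expiry.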
However, you're going to run into a number of issues that may make your attempt cause more harm than good.
- Whenever a browser loads a page, it also loads the resources associated with that page. Are you willing to risk breaking images and CSS on your website? (That's why the sketch above exempts static files.)
- Search engines exhibit the behavior you've described. Do you not want your website to be searchable?
- What criteria do you use to decide that it's the same entity requesting the various pages on your site? If you key on IP address, that will catch some bots, but it will also penalize people who share an internet connection with others browsing your site at the same time, and more sophisticated bots will rotate IPs or spoof headers to get around it. Even resolving the client IP correctly takes some care; see the sketch after this list.
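To make that last bullet concrete: if your site ever sits behind a reverse proxy or load balancer, Request.UserHostAddress reports the proxy's address and every visitor looks identical. Here's a hedged sketch of picking a throttling key; the ClientKey class is illustrative, and X-Forwarded-For is a common convention but is client-supplied, so a bot can spoof it:

```csharp
using System.Web;

// Illustrative helper: choose the key used to group requests for
// throttling. Only trust X-Forwarded-For when your own proxy sets it,
// because any client can send the header themselves.
internal static class ClientKey
{
    public static string Resolve(HttpRequest request)
    {
        string forwarded = request.Headers["X-Forwarded-For"];
        if (!string.IsNullOrEmpty(forwarded))
        {
            // The header may carry a chain ("client, proxy1, proxy2");
            // the first entry is the original client.
            return forwarded.Split(',')[0].Trim();
        }
        return request.UserHostAddress ?? "unknown";
    }
}
```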
So you've got to consider the key question: what is the harm caused by these bots scanning your website?
- If they're hammering your server, the behavior is more consistent with a denial-of-service attack, in which case you should read up on how to mitigate those. Since you have full access to IIS, its built-in Dynamic IP Restrictions feature is one option; see the sketch after this list.
- If they're taking your copyrighted data and displaying it on their own websites without your permission, then you may want to consider legal options, or reporting their abuse to search engines so they don't enjoy success from the fruits of your labors.
- If it's "none of the above" then you should probably just ignore them.
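On the IIS side (per your PS): if the problem really is the denial-of-service case above, IIS 8 ships a Dynamic IP Restrictions feature that denies clients by request rate without any application code (on IIS 7.x it's available as a separate downloadable extension). A sketch of the web.config entry, with placeholder thresholds you'd tune yourself:

```xml
<system.webServer>
  <security>
    <!-- Deny any IP exceeding the given request rate.
         Thresholds below are placeholders, not recommendations. -->
    <dynamicIpSecurity denyAction="Forbidden">
      <denyByRequestRate enabled="true"
                         maxRequests="20"
                         requestIntervalInMilliseconds="1000" />
    </dynamicIpSecurity>
  </security>
</system.webServer>
```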