xss – How to protect against mass probing of addresses through an AJAX request handler?

Question:

There is a form where one of the first fields, immediately after being filled in, fires an .on('blur') handler that transparently checks the site's database: is this address already registered or not? Depending on the result, some of the fields below are hidden or shown.

Attackers can abuse such a mechanism with mass requests and thereby obtain information they should not have, namely who is registered on the site and who is not.

What I do so far: I store the number of requests in the session. If it exceeds N, I stop processing and return only negative answers. Obviously, an attacker can reset the session or go through a proxy or Tor, but this probably offers some protection against full-scale enumeration.

What I would like to do: something like CSRF-style form protection, where a unique token is generated every time and can only be used once.
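The single-use token idea could look roughly like this (a hedged sketch, again assuming a dict-like session store; `issue_token` and `consume_token` are hypothetical names):

```python
import secrets

def issue_token(session: dict) -> str:
    """Generate a fresh token when the form is rendered and remember it."""
    token = secrets.token_urlsafe(32)
    session.setdefault("form_tokens", set()).add(token)
    return token

def consume_token(session: dict, token: str) -> bool:
    """Accept an issued token exactly once; reject replays and unknowns."""
    tokens = session.get("form_tokens", set())
    if token in tokens:
        tokens.discard(token)
        return True
    return False
```

Note that on its own this does not stop enumeration: an attacker can simply fetch a fresh form (and a fresh token) before every request. It mainly prevents token replay, so it is best combined with the request counting described above.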

Question: how do I "correctly" protect against brazen, large-scale data harvesting through the AJAX request handler?

Answer:

The approach of storing the number of requests per IP address usually works well, and it is hard to come up with anything fundamentally better here. What you can change is the level at which the filtering takes place, delegating this task directly to the web server.

If you are using IIS, there is a dedicated module for this: the Dynamic IP Restrictions module.

I believe there are analogues for Apache as well.
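One commonly cited Apache option is mod_evasive, which blocks IPs that exceed per-page or per-site request thresholds. A minimal configuration sketch (all values here are illustrative, not recommendations):

```apache
# Hypothetical example values; tune to your real traffic.
<IfModule mod_evasive20.c>
    DOSPageCount        5    # requests to the same URI per interval
    DOSPageInterval     1    # per-page interval, in seconds
    DOSSiteCount        50   # total requests per client per interval
    DOSSiteInterval     1    # per-site interval, in seconds
    DOSBlockingPeriod   60   # seconds an offending IP stays blocked
</IfModule>
```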
