How can I limit the number of requests to a PHP page, for example to 3 per second, and what is the best way to implement this?
P.S. The page itself returns dynamic JSON data.
Alternatively, don't restrict access at all: just cache the result in RAM, for example in memcached, with a retention period of 3 seconds.
<?php
$m = new Memcached();
$m->addServer('localhost', 11211);

$json = $m->get('json');
if ($json === false && $m->getResultCode() == Memcached::RES_NOTFOUND) {
    // Compute the JSON (slow and expensive)
    $json = '{...}';
    // Store the value for 3 seconds
    $m->add('json', $json, 3);
}
echo $json;
As soon as memcached expires the json key, the script will miss the cache, recompute the dynamic response, store it in memcached again, and return it to the client. Between those events, json is served straight from memcached, which is very fast. You can then handle an arbitrary number of requests without burdening the backend with generating the JSON on every hit. Moreover, as load grows you can increase the key's lifetime, and reduce it again when load drops.
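If you do want an actual hard limit of 3 requests per second in PHP rather than caching, a minimal sketch is a per-second counter in memcached. The key name, the limit of 3, and the 429 response are my assumptions, not part of the original answer:

```php
<?php
// Sketch: per-second request counter in memcached (assumed at localhost:11211).
$m = new Memcached();
$m->addServer('localhost', 11211);

$key = 'rate:' . time(); // one counter per wall-clock second
if ($m->add($key, 1, 2)) {
    // First request in this second; key expires after 2 s
    $count = 1;
} else {
    // Key already exists: atomically bump the counter
    $count = $m->increment($key);
}

if ($count > 3) {
    http_response_code(429); // Too Many Requests
    exit;
}
// ...otherwise generate and echo the JSON as usual
```

Note this limits globally; to limit per client, mix the visitor's IP into the key (e.g. `'rate:' . $_SERVER['REMOTE_ADDR'] . ':' . time()`).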
It is better to implement this not in PHP but at the web-server level.
For example, nginx has the limit_req module, intended for exactly your task: limiting the number of requests per unit of time. The documentation examples show a config that limits per IP, but the "key" can be built from several variables, for example IP plus browser version:
limit_req_zone $binary_remote_addr$http_user_agent zone=one:10m rate=3r/s;
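For context, a fuller (hypothetical) config using that zone could look like the following; the zone name one, the burst value, the location path, and the fastcgi socket are assumptions for illustration:

```nginx
http {
    # 10 MB shared zone "one", keyed by IP + User-Agent, at most 3 requests/sec per key
    limit_req_zone $binary_remote_addr$http_user_agent zone=one:10m rate=3r/s;

    server {
        location /data.json {
            # queue short bursts of up to 5 requests; reject the rest
            limit_req zone=one burst=5;
            # return 429 instead of the default 503 for rejected requests
            limit_req_status 429;

            fastcgi_pass unix:/run/php/php-fpm.sock;
            include fastcgi_params;
        }
    }
}
```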
Source: https://ru.stackoverflow.com/questions/545710/