We have a fairly large Rails application. When I make an ActiveRecord query against a large table with millions of records, the application becomes unavailable to other users until the request finishes. How can this be avoided? Our stack is Thin, PostgreSQL, and nginx.
- github.com/kaminari/kaminari - Colibri
- Thanks for the answer, but it seems I phrased the question poorly. The difficulty is that the application generates reports over long time periods, and the data for them is computed on demand from the database. While a report is being generated, which takes about 15 minutes, the application is unavailable to other users. - user310675
1 answer
Good day.
It is quite normal for some queries in a large database to take a long time (although 15 minutes is excessive). But if such a request is executed synchronously, i.e. the controller action queries the database directly and does not return for minutes, that is already a problem: a handful of concurrent requests like this and the application will start responding with 503 (denial of service) errors.
The solution is to process the request asynchronously. The controller receives the request, enqueues a background job (for example, via ActiveJob), and renders a response immediately. The user sees something like "Please wait, your request is being processed" and can continue using the application in the meantime.
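Here is a framework-free sketch of that enqueue-and-respond pattern. In a real Rails app the queue would be ActiveJob backed by Sidekiq or Resque; here a plain thread and an in-memory store (hypothetical names `REPORT_STORE`, `create_report`) stand in for them, just to show the shape of the flow:

```ruby
require "securerandom"

REPORT_STORE = {}                  # report id => :pending or the finished result
REPORT_QUEUE = Thread::Queue.new   # stand-in for a real job queue (Sidekiq, etc.)

# Worker: pulls jobs off the queue and runs the slow work outside the
# request cycle, so web workers stay free to serve other users.
worker = Thread.new do
  while (job = REPORT_QUEUE.pop)   # pop returns nil once the queue is closed
    id, range = job
    REPORT_STORE[id] = "report for #{range}"  # the 15-minute query would run here
  end
end

# "Controller action": record a pending token, enqueue the job, and return
# immediately instead of blocking until the report is computed.
def create_report(range)
  id = SecureRandom.uuid
  REPORT_STORE[id] = :pending
  REPORT_QUEUE << [id, range]
  id                               # the client polls this id for the result
end

id = create_report("2015-01")
REPORT_QUEUE.close                 # for this demo only: let the worker drain and exit
worker.join
puts REPORT_STORE[id]
```

The key point is that `create_report` returns in microseconds regardless of how long the report takes; the client checks back (by polling or via a notification) using the returned id.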
Here are a couple of links:
In general, you should design the application so that requests complete within seconds. If a request involves something heavy and long-running, implement it as a deferred (background) task.
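As a safety net alongside moving heavy work into jobs, you can also cap query time at the database level so one runaway report query cannot hold a connection for minutes. A sketch of that in `database.yml` for the PostgreSQL adapter (the 10s value is illustrative, pick a limit that suits your normal traffic; background job processes would need a higher or no limit):

```yaml
# config/database.yml (sketch): cap per-statement time for web workers
production:
  adapter: postgresql
  variables:
    statement_timeout: 10s
```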
- Thank you very much for the answer! I was considering Sidekiq but wasn't sure it would work in my case. Thanks again for the advice! - user310675