Hello.
We have a web project, and a flurry of requests is expected soon. We would very much like to keep a log of every call for further analysis, and we do not want to use third-party tools.
The data itself is not cumbersome: a couple of string fields, a date/time, an identifier, and an array field. So we decided to use MongoDB for storage together with Django. There is nothing complicated here, but the question remains: how well will this representative of NoSQL solutions cope with the task?
I would like to hear from anyone who has already run such an analysis under a barrage of requests. What are the strengths and weaknesses of MongoDB? With further data processing in mind, how sensible is it to use this particular solution instead of, say, PostgreSQL? Relational databases have one significant drawback here: they cannot store array fields directly, which forces one-to-many and many-to-many relations, and that can hurt performance, which in our case is critical.
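To illustrate the point about array fields, here is a minimal sketch (field names like `user_id` and `tags` are hypothetical, not from the project): in MongoDB the "plural" field lives inside the same document, while the relational equivalent needs a child table and a join.

```python
from datetime import datetime, timezone

# One MongoDB-style document: the array ("plural") field is stored inline,
# so no join table is needed.
hit = {
    "user_id": 42,                     # identifier
    "path": "/catalog/",               # string field
    "referrer": "/",                   # second string field
    "ts": datetime.now(timezone.utc),  # date/time
    "tags": ["click", "scroll"],       # array field, kept in place
}

# The relational equivalent: a child table with one row per array element,
# and a join to read them back.
hit_tags = [(42, "click"), (42, "scroll")]

def tags_for(user_id):
    """Emulate the one-to-many join that MongoDB avoids."""
    return [tag for uid, tag in hit_tags if uid == user_id]
```

Whether the join actually hurts at this scale depends on indexing, but the document model does remove it entirely.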
UPD
The essence of the task is to collect all of a user's movements across our pages, sending these statistics to our server as they happen. This is implemented in jQuery. Each page will send 3 to 7 requests, and we predict around 1,000,000 users per day, which most likely means the database will be the bottleneck.
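On the server side, the write path described above could look like the sketch below. This is only an assumption of the shape of the code: the payload keys (`session`, `path`, `event`) are hypothetical, and a tiny stand-in class replaces a real pymongo collection so the sketch runs without a MongoDB server (a real collection exposes the same `insert_one` method).

```python
from datetime import datetime, timezone

class FakeCollection:
    """Stand-in for a pymongo collection, implementing only insert_one,
    so the sketch is runnable without a MongoDB server."""
    def __init__(self):
        self.docs = []

    def insert_one(self, doc):
        self.docs.append(doc)

def log_hit(collection, payload):
    """Append one page-movement event to the log collection.

    Field names are hypothetical; the timestamp is added server-side
    so later analysis can rely on a single clock.
    """
    doc = {
        "session": payload.get("session"),
        "path": payload.get("path"),
        "event": payload.get("event"),     # e.g. "click", "scroll"
        "ts": datetime.now(timezone.utc),
    }
    collection.insert_one(doc)
    return doc

# Each page fires 3-7 such calls from the jQuery client.
hits = FakeCollection()
log_hit(hits, {"session": "abc123", "path": "/", "event": "click"})
```

With a real collection you would also create an index on the fields you query during analysis (e.g. `session` and `ts`), since at ~3-7 million inserts a day unindexed reads become the real cost.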
"alter table is a horror flying on the wings of the night :D Accordingly, with any change you don't have to agonize over the database structure, you just do what you need.
P.S. About the flurry of requests: everything, as before, is saved by caching and competent indexes." - Zowie