Hello. The problem, in short: a Java project uses a NoSQL database, MongoDB, and needs to work with roughly 1,000,000,000 records. Inserting 5,000,000 records into an empty collection takes 13-15 minutes, but after the first 5,000,000 the insertion process slows down almost exponentially and RAM usage balloons. The priority task for this database is search (search speed over 10,000,000 records is satisfactory).
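
For context, here is a minimal sketch of how I assume the bulk load could be done with the MongoDB Java sync driver, using unordered `insertMany` batches. The connection URI, database name, collection name, and document fields are hypothetical placeholders, not my actual code:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.InsertManyOptions;
import org.bson.Document;

import java.util.ArrayList;
import java.util.List;

public class BulkLoad {
    public static void main(String[] args) {
        // Hypothetical connection details; adjust to the real deployment.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                    client.getDatabase("mydb").getCollection("records");

            List<Document> batch = new ArrayList<>(10_000);
            for (long i = 0; i < 5_000_000L; i++) {
                batch.add(new Document("id", i).append("payload", "value-" + i));
                if (batch.size() == 10_000) {
                    // Unordered batches let the server continue past individual
                    // errors and are generally faster than ordered inserts.
                    coll.insertMany(batch, new InsertManyOptions().ordered(false));
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                coll.insertMany(batch, new InsertManyOptions().ordered(false));
            }
        }
    }
}
```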
Questions:
- Why does this happen?
- How can it be fixed?
Possible solutions:
Create a new collection every 5,000,000 records?
Index optimization? (I search by id; see the index sketch after this list)
Optimization of the MongoDB config?
System optimization?
Replacing the database?
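
Regarding the index option above: a minimal sketch of creating an ascending index on the id field with the Java driver, assuming the same hypothetical database/collection names as before and that "id" is the application-level field I search by (not MongoDB's built-in `_id`):

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Indexes;
import org.bson.Document;

public class CreateIdIndex {
    public static void main(String[] args) {
        // Hypothetical connection details and names.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                    client.getDatabase("mydb").getCollection("records");
            // Single-field ascending index to support lookups by "id".
            String indexName = coll.createIndex(Indexes.ascending("id"));
            System.out.println("Created index: " + indexName);
        }
    }
}
```

Note that every secondary index adds work on each insert, so building it before versus after the bulk load is itself a trade-off I am unsure about.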
Thanks in advance for your reply!