There is a table of about 6,000 rows (and that is not the limit) with 35 fields, roughly half of which are updated about every 5 minutes (some records even more often). Because of the update frequency, rows are processed in batches of 100 when updating. Indexes are placed on the fields used for sorting and selection.

  1. Does it make sense to enable caching for such a database?
  2. Does it make sense to move the frequently updated fields into a separate table? Almost all of the fields are used on the product pages, so is it worth having two tables instead of one? (A sketch of such a split follows this list.)
  3. Handler_read_rnd_next for some reason jumps by about 1.5k only when I refresh the page while watching it in phpMyAdmin; if I come back to look after some time, the increase is no longer significant (probably because those ~1.5k from the refresh are already counted in). Does this mean the queries are optimized? (A measurement sketch also follows this list.)
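
To make question 2 concrete, here is a minimal sketch of such a vertical split, with entirely hypothetical table and column names (a `products` table with an `id` primary key, and a narrow `products_hot` companion table holding only the frequently updated fields):

```sql
-- Hypothetical split: rarely-changing attributes stay in `products`,
-- the frequently updated fields move to a narrow companion table.
CREATE TABLE products_hot (
    product_id INT UNSIGNED  NOT NULL PRIMARY KEY,
    price      DECIMAL(10,2) NOT NULL,
    stock      INT           NOT NULL,
    payload    MEDIUMTEXT,            -- e.g. the base64-encoded text data
    updated_at TIMESTAMP     NOT NULL DEFAULT CURRENT_TIMESTAMP
                             ON UPDATE CURRENT_TIMESTAMP,
    CONSTRAINT fk_products_hot_product
        FOREIGN KEY (product_id) REFERENCES products (id)
) ENGINE=InnoDB;

-- The product page still gets everything with one join:
SELECT p.*, h.price, h.stock, h.payload
FROM products AS p
JOIN products_hot AS h ON h.product_id = p.id
WHERE p.id = 123;
```

Since the product page uses almost all of the fields anyway, such a split does not save reads; the benefit, if any, is that each 5-minute refresh rewrites short rows instead of wide 35-field ones.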
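On question 3: the counter shown in phpMyAdmin is a global one, so it also includes phpMyAdmin's own queries, and a jump of ~1.5k on a page refresh does not by itself say whether your queries are optimized. A sketch of a more targeted check (the SELECT is a made-up product-page query; FLUSH STATUS needs the RELOAD privilege, otherwise compare SHOW SESSION STATUS before and after):

```sql
-- Reset this session's counters, run the query under test,
-- then see how many handler calls it actually caused.
FLUSH STATUS;

SELECT id, name, price            -- hypothetical product-page query
FROM products
WHERE category_id = 7
ORDER BY price
LIMIT 20;

-- Handler_read_rnd_next counts rows read in a full scan; if it is large
-- compared to the rows the query returns, the query is not using an index.
SHOW SESSION STATUS LIKE 'Handler_read%';
```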
  • A table of that size is only a few megabytes, so it sits entirely in memory, and an update every 5 minutes is very rare. How you manage to get slowdowns with material like this is a mystery ... - Akina
  • Well, it sounds primitive the way I put it, but the table weighs 200 MB. I forgot to mention that 3 of the constantly updated fields are TEXT and also hold base64-encoded data. All 6k records are updated within five minutes; if it were possible to get through that volume in less time, it probably would have been done ... The table is InnoDB. - Peter Likrov
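
As an aside on that 5-minute refresh of all 6k rows: one way to apply the feed in batches of ~100 is a multi-row upsert, so each batch is one statement and one commit instead of 100 autocommitted single-row UPDATEs. A sketch, reusing the hypothetical `products_hot` table from the earlier example:

```sql
-- Hypothetical batch refresh: one INSERT ... ON DUPLICATE KEY UPDATE
-- per ~100 rows instead of one autocommitted UPDATE per row.
INSERT INTO products_hot (product_id, price, stock, payload)
VALUES
    (1, 19.99, 42, 'aGVsbG8='),
    (2, 24.50,  7, 'd29ybGQ='),
    (3,  5.00,  0, 'Zm9v')
    -- ... the rest of the batch, up to ~100 rows per statement
ON DUPLICATE KEY UPDATE
    price   = VALUES(price),
    stock   = VALUES(stock),
    payload = VALUES(payload);
```

Whether this actually helps depends on how the data arrives; it mainly cuts per-statement and per-commit overhead, not the volume of base64 text that still has to be written.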
