There is a table of about 6,000 rows (and that is not the upper limit) with 35 columns, half of which are updated roughly every 5 minutes (and some rows even more often). Because of the update frequency, rows are updated in batches of 100. Indexes are in place on the columns used for sorting and filtering.
- Does it make sense to enable caching for a database like this?
- Does it make sense to move the frequently updated columns into a separate table? The product pages use almost all of the columns, so would two tables instead of one actually help?
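For context, the split I have in mind looks roughly like this (all table and column names below are placeholders, not taken from my actual schema): rarely changing attributes stay in one table, and the hot columns that get rewritten every few minutes go into a companion table keyed by the same id. The concern is that the product page would then need a JOIN to assemble the full record.

```sql
-- Static attributes that rarely change
CREATE TABLE products (
    id          INT UNSIGNED PRIMARY KEY,
    name        VARCHAR(255),
    description TEXT
    -- ... other rarely changing columns
);

-- Hot columns rewritten every few minutes
CREATE TABLE product_stats (
    product_id INT UNSIGNED PRIMARY KEY,
    price      DECIMAL(10,2),
    stock      INT,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
               ON UPDATE CURRENT_TIMESTAMP,
    FOREIGN KEY (product_id) REFERENCES products(id)
);

-- The product page now needs one JOIN to get everything
SELECT p.*, s.price, s.stock
FROM products p
JOIN product_stats s ON s.product_id = p.id
WHERE p.id = 42;
```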
- Handler_read_rnd_next jumps by about 1.5k, but only at the moment I refresh the page while watching it through phpMyAdmin; if I come back and check again after some time, the counter has barely moved beyond that 1.5k. Does this mean the queries are already optimized?
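To make my measurement clearer: I understand one can isolate a single query's contribution by looking at session-scope counters instead of the global ones (phpMyAdmin itself runs queries, which may account for part of the jump on refresh). A minimal sketch of what I mean, with a placeholder query:

```sql
FLUSH STATUS;  -- reset this session's status counters to zero

-- the query under test (placeholder, not my real query)
SELECT * FROM products ORDER BY price LIMIT 100;

SHOW SESSION STATUS LIKE 'Handler_read_rnd_next';
-- near zero: the query walked an index
-- large:     it scanned rows sequentially (table scan or filesort)
```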