Suppose an application has two input fields and a button; clicking the button runs a long calculation with several long loops that takes about 3-4 seconds. Does it make sense to cache the results? For example: on application shutdown, serialize a HashMap holding the cache to a file, and on startup read the cache back from that file, so that the program first checks the cache for the requested result and only falls back to the loops if it is missing. The cache would hold, say, 10,000 values. I would be interested in an expert opinion on whether such caching would give a noticeable performance boost.
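For reference, the kind of persistence being asked about can be sketched like this. The class name, file name, and string key format are made up for illustration; the cache is assumed to map input pairs to computed doubles:

```java
import java.io.*;
import java.util.HashMap;

public class CacheStore {
    // The path is an assumption for illustration; make it configurable in practice.
    private static final String CACHE_FILE = "calc-cache.ser";

    // Write the whole map to disk, e.g. on application shutdown.
    public static void save(HashMap<String, Double> cache) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new FileOutputStream(CACHE_FILE))) {
            out.writeObject(cache);
        }
    }

    // Read it back on startup; return an empty map if no file exists yet.
    @SuppressWarnings("unchecked")
    public static HashMap<String, Double> load() throws IOException, ClassNotFoundException {
        File f = new File(CACHE_FILE);
        if (!f.exists()) {
            return new HashMap<>();
        }
        try (ObjectInputStream in =
                 new ObjectInputStream(new FileInputStream(f))) {
            return (HashMap<String, Double>) in.readObject();
        }
    }
}
```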

    3 answers

    Managing a cache is extra work: extra code that has to be maintained, and serialization/deserialization even more so. It can also complicate starting, stopping, and generally maintaining the application, creating an extra burden on support.

    Personally, it seems to me that an ordinary in-memory cache is enough for any values. The first users will see small delays, but over time everything settles down. And if the probability of repeated inputs is small, there is no point in caching at all.

    The only exception is when the number of input combinations is limited: then you do not need a cache at all, because you can compute everything in advance, prepare a file with the results, and afterwards simply read them back. But as I understand it, that is not your case.

    UPD: there is one more factor influencing the decision: how often the server or the application will be restarted. If restarts are very frequent, there are many input combinations and many users, and the repeat rate is middling, then serialization is probably worth doing.

    • Without going too deep: in Java, serialization is done through the Serialization API, which requires practically no developer effort, so that part is not a problem. And yes, preparing a file with ready-made answers will not work because of the huge number of input combinations. Otherwise I agree with you. - Evgeniy
    • You will need to work on configuring the application (the path where all of this is stored), work on catching start and stop, and support will have to shut the server down carefully (no more kill -9). The mere fact of having a config complicates things. The general point is that you should not pay for something that may not be worth the cost, even if the costs seem small. - cy6erGn0m
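    The "catching the start-stop" concern mentioned above is usually handled with a JVM shutdown hook. A minimal sketch (class name and key format are illustrative; the actual serialization step is left as a comment):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ShutdownSave {
    static final Map<String, Double> cache = new ConcurrentHashMap<>();

    // What the hook does on normal termination (Ctrl+C, SIGTERM).
    // kill -9 skips shutdown hooks entirely, which is exactly the caveat above.
    static String persist() {
        // In a real application: serialize `cache` to the configured path here.
        return "saving " + cache.size() + " entries";
    }

    public static void installHook() {
        Runtime.getRuntime().addShutdownHook(
            new Thread(() -> System.out.println(persist())));
    }
}
```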

    Whether it is worth it depends on how often the application will be restarted and how expensive the operation is. In small applications, lazy initialization is usually enough:

    class MyCache {
        private final Map<Object, Object> cache = new HashMap<Object, Object>();

        public Object getValue(final Object key) {
            Object value = cache.get(key);
            if (null == value) {
                value = calculateValue(key);
                cache.put(key, value);
            }
            return value;
        }

        private Object calculateValue(final Object key) {
            return null; // stub: replace with the real long-running calculation
        }
    }

    When your application has sufficiently "grown up", you should look at ready-made solutions (e.g. Ehcache). But even then it is better to first measure the cost of caching versus recomputing the data with a profiler.
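    As an intermediate step before reaching for a library: since Java 8 the lazy-initialization pattern above collapses into Map.computeIfAbsent, which on a ConcurrentHashMap is also safe under concurrent access. A sketch (class name and the stand-in calculation are illustrative):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LazyCache {
    private final Map<String, Double> cache = new ConcurrentHashMap<>();

    public double getValue(String key) {
        // Runs the expensive calculation at most once per key;
        // later calls with the same key return the cached value.
        return cache.computeIfAbsent(key, this::calculateValue);
    }

    // Stand-in for the multi-second calculation from the question.
    private double calculateValue(String key) {
        return key.length() * 2.0;
    }
}
```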

      It depends on how large the objects you are going to serialize and deserialize are: if reading them back still takes less time than recomputing them, then serialization is worth it.

      Do not forget that memory is not unlimited: you cannot put everything into the cache, or your program will eat all the memory on startup while reading the serialized objects.
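      One standard way to keep the cache from growing without bound is a LinkedHashMap with removeEldestEntry overridden, which gives a simple LRU cache. A sketch (class name is illustrative; a cap like the 10,000 entries from the question would be passed to the constructor):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU eviction order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once the cap is exceeded,
        // so memory usage stays bounded.
        return size() > maxEntries;
    }
}
```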

      • For example, 10,000 pairs of double values; I do not think that is a terrible cost in memory. But for myself I have decided for now not to complicate my life and to change the code only when necessary. Right now there is no need, and the program works fine without caching. - Evgeniy