There is a large JSON file (approximately 7.5 MB) that is loaded from disk and decoded in about 3 seconds. I wanted to speed up script execution by putting the decoded multidimensional array into memcached:

    $cached_cities = $memcached->get('decoded-cities');
    if ($cached_cities) {
        $cities = $cached_cities;
    } else {
        $uncached_cities = json_decode(file_get_contents('../../assets/json/cities.json'), true);
        $memcached->set('decoded-cities', $uncached_cities, 604800);
        $cities = $uncached_cities;
    }

But it does not work. I created a test script and stored the string 'foo-bar' under the key, and that worked as expected. My array, however, simply will not go into the cache.

    3 answers

    It does not accept the value because the default limit is 1 MB per record. Change the startup parameters or the config, as described here: https://stackoverflow.com/a/29227471/5996783
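
    To confirm that the size limit is the culprit, here is a minimal diagnostic sketch (host, port, and paths are assumptions to adjust to your setup) that prints the client's result code after a failed set():

     // Hypothetical connection details; adjust to your environment
     $memcached = new Memcached();
     $memcached->addServer('127.0.0.1', 11211);

     $cities = json_decode(file_get_contents('../../assets/json/cities.json'), true);
     if (!$memcached->set('decoded-cities', $cities, 604800)) {
         // With default daemon settings a value over 1 MB is rejected and
         // libmemcached reports an "item too big" style error here
         echo $memcached->getResultCode(), ': ', $memcached->getResultMessage(), "\n";
     }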

    • Actually, the -m 128 option is specified in /etc/memcached.conf. Is that not it? - JamesJGoodwin
    • No. As far as I remember, that sets how much memory to allocate for storage. - hardworm
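
    For reference, a sketch of the relevant lines in /etc/memcached.conf (values are illustrative; the -I option exists in memcached 1.4.2 and later):

     # Total memory for the cache, in megabytes (not the per-item limit)
     -m 128
     # Maximum size of a single item; the default is 1m, which a 7.5 MB
     # decoded array easily exceeds
     -I 16m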

    You will not gain much this way, because when you save the array to memcached it gets serialized, and on every read it has to be unserialized again. That is, you run into the same problem, just with a different kind of serialization.

    The best solution would be to store the result of decoding the JSON directly in a PHP file:

     file_put_contents('cache.php', '<?php return '.var_export($array, true).';'); 

    This step should not be done on every request, but only when the JSON file is updated. For example, do it when building the project:

     if (!is_file('cache.php') || filemtime('data.json') > filemtime('cache.php')) {
         $data = json_decode(file_get_contents('data.json'), true);
         file_put_contents('cache.php', '<?php return '.var_export($data, true).';');
     }

    Using data saved in this way is as simple as:

     $data = include 'cache.php'; 

    If you use an opcode cache (OPcache), which you most likely do, the data will be loaded from such a file almost instantly.
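
    Putting the pieces together, a sketch of a small helper (the function name and paths are hypothetical) that rebuilds the cache file only when the JSON source is newer and otherwise just includes it:

     // Rebuild cache.php only when the JSON source is newer, then include it;
     // with OPcache enabled the compiled array stays in shared memory
     function load_cached_json(string $json, string $cache): array
     {
         if (!is_file($cache) || filemtime($json) > filemtime($cache)) {
             $data = json_decode(file_get_contents($json), true);
             file_put_contents($cache, '<?php return ' . var_export($data, true) . ';');
         }
         return include $cache;
     }

     $cities = load_cached_json('../../assets/json/cities.json', 'cache.php');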

    If you usually do not need all of the data from this file, or you are going to query subsets of it, then it would be even better to store the data in a database.

    If you think you can make the project faster by abandoning the database, think twice. Modern databases handle data very efficiently: you have to try really hard to beat them. It would be better to first measure the lookup speed in the database with the proper indexes before trying to reinvent the wheel.

    In your case, a regular database like MySQL will work faster, if only because it does not need to load the entire data set into memory on each request.

    • Hmm, I thought memcached existed precisely so you can store something in it and later fetch it from fast memory. In fact, I need to run lookups against this large JSON array. Do you think Postgres can give me fast lookups? I need to do 10 to 30 lookups within half a second. - JamesJGoodwin
    • I do not know about PostgreSQL, but for MySQL, thirty lookup queries against a warmed-up 10 MB database in 0.5 seconds is trivial, even without indexes. It is unlikely you could do the same thing faster in code, especially in PHP. - sanmai
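
    For comparison, a minimal sketch of the indexed-lookup approach discussed above, using PDO with PostgreSQL (the table, columns, and DSN are assumptions; CREATE INDEX IF NOT EXISTS requires PostgreSQL 9.5+):

     // Hypothetical schema: import the cities once into an indexed table
     $pdo = new PDO('pgsql:host=localhost;dbname=app', 'user', 'pass');
     $pdo->exec('CREATE TABLE IF NOT EXISTS cities (
         id   serial PRIMARY KEY,
         name text NOT NULL
     )');
     $pdo->exec('CREATE INDEX IF NOT EXISTS cities_name_idx ON cities (name)');

     // Each of the 10-30 lookups per request hits the index instead of
     // scanning a 7.5 MB array in PHP
     $stmt = $pdo->prepare('SELECT id, name FROM cities WHERE name = :name');
     $stmt->execute([':name' => 'Berlin']);
     $city = $stmt->fetch(PDO::FETCH_ASSOC);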

    Most likely something is not configured in php.ini. It is possible that the problem is that the memcached server cannot accept the JSON payload, and that setting also lives in php.ini. Try reading http://php.net/manual/ru/memcached.setup.php and http://php.net/manual/ru/memcached.configuration.php
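
    On the php.ini side, the setting this most likely refers to is the serializer the memcached extension uses for non-scalar values; a sketch (supported values include "php", "igbinary" and "json", depending on your build; see the configuration page linked above):

     ; Serializer the memcached extension uses for arrays and objects
     memcached.serializer = "php"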