You won’t gain much this way: when you store the array in memcached, it gets serialized again, and then unserialized on every read. In other words, you run into the same problem, just with a different kind of serialization.
A better solution is to store the result of decoding the JSON directly in a PHP file:
file_put_contents('cache.php', '<?php return '.var_export($array, true).';');
This step should not be done on every request, only when the JSON file changes — for example, while building the project:
if (!is_file('cache.php') || filemtime('data.json') > filemtime('cache.php')) {
    $data = json_decode(file_get_contents('data.json'), true);
    file_put_contents('cache.php', '<?php return ' . var_export($data, true) . ';');
}
Using data saved in this way is as simple as:
$data = include 'cache.php';
If you use an opcode cache (OPcache), which you almost certainly do, the data will be loaded from such a file almost instantly.
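If you want to confirm that OPcache is actually active, here is a minimal sketch; the opcache_get_status() call only exists when the Zend OPcache extension is loaded, so the check is guarded:

// Sketch: verify that OPcache is enabled before relying on it.
if (function_exists('opcache_get_status') && opcache_get_status(false) !== false) {
    echo "OPcache is enabled: cache.php will be served from shared memory.\n";
} else {
    echo "OPcache is not available: cache.php will be re-parsed on every request.\n";
}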
If you usually do not need the entire data set from this file, or you plan to query subsets of it, it is even better to store the data in a database.
If you think you can make the project faster by abandoning the database, think again. Modern databases handle data very efficiently: you have to try hard to beat them. Measure lookup speed in the database with the appropriate indexes first, before reinventing the wheel.
In your case, an ordinary database such as MySQL will most likely be faster, if only because it does not need to load the entire data set into memory on every request. A sketch of this approach follows below.
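For illustration, here is a minimal sketch using PDO with SQLite so it stays self-contained; the same idea applies to MySQL. The items table, its columns, and the assumed structure of data.json (an array of objects with id and name fields) are assumptions for the example, not part of the original answer:

// One-time import of data.json into an indexed table.
$pdo = new PDO('sqlite:cache.db');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT, payload TEXT)');
$pdo->exec('CREATE INDEX IF NOT EXISTS idx_items_name ON items (name)');

$items = json_decode(file_get_contents('data.json'), true);
$insert = $pdo->prepare('INSERT INTO items (id, name, payload) VALUES (?, ?, ?)');
$pdo->beginTransaction();
foreach ($items as $item) {
    $insert->execute([$item['id'], $item['name'], json_encode($item)]);
}
$pdo->commit();

// Later, fetch only the rows you need instead of loading the whole file.
$select = $pdo->prepare('SELECT payload FROM items WHERE name = ?');
$select->execute(['example']);
$row = $select->fetch(PDO::FETCH_ASSOC);
$data = $row ? json_decode($row['payload'], true) : null;

With an index on the column you filter by, the database reads only the matching rows from disk, which is exactly why it avoids the "decode the whole file on every request" cost described above.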