I have this line that finds an element in an array and returns its index:
$index = array_search($request->{'url'}, array_values($uac));
The $uac array consists of addresses (for example http://some.site/somepage.php&need=help ). The array holds more than 5000 elements (sometimes far more), and because of this the script sometimes dies with Fatal error: Out of memory. I set memory_limit in php.ini to -1 (no limit), but the script still throws this error. As I have read here, this is because my PHP is 32-bit, not 64-bit.

As far as I understand, to avoid this error I need to split the array into parts (array_chunk, preserving the keys), search each part separately, and then join them back into one array (array_merge) so I still get the correct index. Please help me write a function that performs this juggling "invisibly" for the rest of the script. That is, I need an advanced array_search that behaves exactly like the original, but internally cuts the array into pieces (say, one hundred elements each), searches them one by one, and maps any hit back to the original index. Since I have not found a ready-made solution on this site, I think it will be useful for other people in the future as well.
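Here is a minimal sketch of the chunked search I have in mind. The function name array_search_chunked and the default chunk size are my own choices, not from any library:

function array_search_chunked($needle, array $haystack, $chunkSize = 100)
{
    // Preserve keys when chunking (third argument = true), so a hit
    // inside a chunk already carries its index from the full array --
    // no array_merge step is needed afterwards.
    foreach (array_chunk($haystack, $chunkSize, true) as $chunk) {
        $index = array_search($needle, $chunk);
        if ($index !== false) {
            return $index;
        }
    }
    return false;
}

// Drop-in replacement for the original line:
$index = array_search_chunked($request->{'url'}, array_values($uac));

Note, though, that array_chunk itself copies the data into the chunks, so by itself this does not lower peak memory; the copy made by array_values on every call looks like the more likely culprit (see the optimization sketch at the end).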
UPD: I already have an answer that I will try. But since there was a comment about 100k elements, here is the full code to help the gurus.
<?php header('Content-Type: text/html; charset=utf-8'); error_reporting(E_ERROR | E_WARNING | E_PARSE & ~E_NOTICE); $uarray = json_decode($_POST['array']); $uac = $uarray; $res = $uarray; function request_callback($response, $info, $request) { global $uac; global $res; $index = array_search($request->{'url'}, array_values($uac)); $uac[$index] = " "; $rspnc = json_decode($response); $res[$index] = $rspnc; } require("RollingCurl.php"); $rc = new RollingCurl("request_callback"); $rc->options = array(CURLOPT_BINARYTRANSFER => true, CURLOPT_RETURNTRANSFER => true, CURLOPT_SSL_VERIFYPEER => false); $rc->window_size = 5; foreach ($uarray as $url) { $request = new RollingCurlRequest($url); $rc->add($request); } $rc->execute(); for($i = 0; $i <= count($res); $i++) { for ($j = 0; $j <= 1; $j++) { echo $res[$i]->{'name'}; echo "/"; echo $res[$i]->{'quality'}; echo "/"; echo $res[$i]->{'buy_offers'}[$j]->{'o_price'}; echo "/"; echo $res[$i]->{'buy_offers'}[$j]->{'c'}; echo "/"; echo $res[$i]->{'buy_offers'}[$j]->{'my_count'}; echo "/"; echo $res[$i]->{'classid'}. "_" .$res[$i]->{'instanceid'}; echo "<br>"; } echo "<p><p><p>"; } ?>
Well, let's say $uarray has 10 thousand elements. I know the code is bad. Please give optimization tips.
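One concrete optimization (my own suggestion, not a tested fix): the likely memory hog is array_values($uac) inside the callback, which copies the whole array for every completed request. Building a URL-to-index map once with array_flip turns each lookup into a constant-time array access with no copying. A sketch, assuming RollingCurl passes the same $request->url seen in the code above:

// Build the reverse map once, before the requests are created:
$urlToIndex = array_flip($uarray); // url => original index

function request_callback($response, $info, $request) {
    global $urlToIndex;
    global $res;
    // Constant-time lookup; no per-callback copy of a 10k-element array
    if (isset($urlToIndex[$request->url])) {
        $index = $urlToIndex[$request->url];
        $res[$index] = json_decode($response);
    }
}

Two caveats: array_flip keeps only the last index for duplicate URLs, which the $uac[$index] = " " blanking trick was presumably working around, and the output loop should guard against entries whose request failed and was never decoded.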