UPD 2
Check whether SELinux is enabled; most likely it is what's imposing the limit, and it can restrict even root. The admin at my work says he always disables SELinux completely.
There may also be some system limits in play. In general, keeping that many directories inside a single directory is not a good idea. Check the permissions again; most likely the problem is still there.
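As a quick sketch (assuming a typical Linux host where the getenforce binary exists and shell_exec() is not disabled), you can peek at both from PHP:

// Quick sanity checks (a sketch; getenforce only exists on SELinux-enabled systems)
echo 'SELinux: ', trim(shell_exec('getenforce') ?: 'unknown'), "\n";
echo 'Open-file limit: ', trim(shell_exec('ulimit -n') ?: 'unknown'), "\n";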
In PHP it is better to use iterators; they should work faster than procedural code with lots of loops. For example:
$path = realpath('/etc');
// Walk the tree recursively; SELF_FIRST yields each directory before its children
$objects = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($path),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($objects as $name => $object) {
    echo "$name\n";
}
UPD
You can also use generators; by the way, supposedly they are even faster than iterators:
function getAllFiles($dirName) {
    $dh = opendir($dirName);
    // readdir() returns false once the listing is exhausted
    while (false !== ($fileName = readdir($dh))) {
        yield $fileName;
    }
    closedir($dh);
}

foreach (getAllFiles('/etc') as $fileName) {
    echo "$fileName\n";
}
Speed test results
Tested with 70,000 files of 1 KB each and 40,000 subdirectories, all inside a single directory.
Compared: scandir(), yield (generator), and DirectoryIterator.
scandir() eats RAM, which is understandable: it has to allocate space for the whole result array. The generator added no speed, sadly. But the DirectoryIterator iterator won on both speed and RAM consumption.
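For reference, here is roughly what the two remaining variants look like (a sketch; the actual benchmark code is not shown in this answer):

// scandir() builds the entire listing in memory at once:
$entries = scandir('/etc'); // returns an array of every entry name

// DirectoryIterator streams entries one at a time instead:
foreach (new DirectoryIterator('/etc') as $fileInfo) {
    if ($fileInfo->isDot()) {
        continue; // skip "." and ".."
    }
    echo $fileInfo->getFilename(), "\n";
}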
"On matches" of course, but with very large volumes I think it will be noticeable.