There is a PHP price parser:

    foreach ($products as $product) {
        // ...
        $pageContent = file_get_contents($sourcePageURL);
        preg_match('/<' . $openedTagWithClass . '>(.*?)<\/' . $openedTag . '>/is', $pageContent, $priceString);
        $priceFromLink = $priceString[1];
        // ...
    }

Each element of the $products array contains a link to the page that serves as the price source, plus the opening tag with its class. I.e. file_get_contents() downloads the page contents from that link, and preg_match() extracts just the price between the specified tags.
So far the input array contains only a few pages, but over time several hundred are planned, so the question is: how can the load on the source sites (and on my own server) be minimized while this process runs?
Should the whole loop be broken into parts, or handled some other way? I don't have a deep understanding of what is actually happening under the hood; in general, this process needs to be optimized somehow for hundreds of input pages.
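For context, "breaking it into parts" could look roughly like the sketch below: cache each fetched page for a while so repeated runs don't re-hit the source site, skip pages that fail to load, and pause between requests. The field names (`sourcePageURL`, `openedTagWithClass`, `openedTag`), the cache TTL, and the delay are all illustrative assumptions, not the real code; the sample product uses a `data://` URL only so the sketch is self-contained.

```php
<?php
// Assumption: each product carries the source URL and the tag strings.
// The data:// URL stands in for a real page so this runs offline.
$products = [
    [
        'sourcePageURL'      => 'data://text/plain;base64,'
            . base64_encode('<span class="price">42 USD</span>'),
        'openedTagWithClass' => 'span class="price"',
        'openedTag'          => 'span',
    ],
];

// Cache a fetched page for $ttl seconds so repeated runs within
// that window are served from disk instead of hitting the site again.
function fetchWithCache(string $url, int $ttl = 3600)
{
    $cacheFile = sys_get_temp_dir() . '/price_' . md5($url) . '.html';
    if (is_file($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
        return file_get_contents($cacheFile);
    }
    $page = @file_get_contents($url);
    if ($page !== false) {
        file_put_contents($cacheFile, $page);
    }
    return $page;
}

foreach ($products as $product) {
    $page = fetchWithCache($product['sourcePageURL']);
    if ($page === false) {
        continue; // skip unreachable pages instead of aborting the whole run
    }
    // preg_quote() escapes regex metacharacters in the tag strings
    $pattern = '/<' . preg_quote($product['openedTagWithClass'], '/')
        . '>(.*?)<\/' . preg_quote($product['openedTag'], '/') . '>/is';
    if (preg_match($pattern, $page, $m)) {
        $priceFromLink = $m[1];
    }
    usleep(500000); // 0.5 s pause between requests to throttle the load
}
```

With hundreds of pages, the same idea extends naturally: group products by host so each source site is hit at most once per interval, or run the loop in batches from a cron job instead of all at once.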