I need to collect information from HTTPS pages. I'm using PHP:
<?php
$url = 'https://www.google.com';

function getSslPage($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // note: disables certificate verification
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_REFERER, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}

$page = getSslPage($url); // fetch once and reuse the result
echo $page;
file_put_contents('c:\file.txt', $page);
?>

(I use other sites instead of Google.) Everything works, but the saved markup is 15-30 thousand lines, and the text file itself weighs over 1 MB. The whole process on my underpowered (to put it mildly) PC takes about 3 seconds. From this file I need no more than 10-20 lines. Which would be faster:
- Fetch the markup into a variable without saving it to a file, search it for the needed information, and write only those 10-20 lines to a file?
- Or save the full markup to a file anyway and only then search it for the lines I need?
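For reference, the first option can be sketched roughly like this. The `extractLines()` helper and the `'price'` marker are assumptions for illustration; the real filter would depend on what the 10-20 lines look like on the target site:

```php
<?php
// Sketch of the in-memory approach: the full page stays in a string,
// only the matching lines ever touch the disk.
function extractLines(string $html, string $needle): array {
    // Split on any line-break sequence (\n, \r\n, ...).
    $lines = preg_split('/\R/', $html);
    // Keep only the lines containing the marker.
    return array_values(array_filter(
        $lines,
        fn(string $line): bool => str_contains($line, $needle)
    ));
}

// In the real script this string would come from getSslPage($url);
// a small literal stands in for the fetched page here.
$html = "<html>\n<span>price: 42</span>\n<div>noise</div>\n</html>";
$hits = extractLines($html, 'price');
file_put_contents('result.txt', implode(PHP_EOL, $hits));
```

`str_contains()` requires PHP 8; on older versions `strpos($line, $needle) !== false` does the same job.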