The FTP server stores files ranging from 0.1 to 300 MB.
Users need to be able to download these files from the site in their browser.
Currently, the function responsible for serving files looks like this:
```php
public function save($path, $filename = null)
{
    if (empty($filename)) {
        $filename = basename($path);
        $file = $path;
    } else {
        $file = $path . '/' . $filename;
    }

    ob_start();
    $result = @ftp_get($this->_conn_id, "php://output", $file, FTP_BINARY);
    $data = ob_get_contents();
    $datasize = ob_get_length();
    ob_end_clean();

    if ($result) {
        $f = ['data' => $data, 'size' => $datasize];
        ftp_close($this->_conn_id);
        if (!$f) {
            error("Error. File " . $file . " not found.", true);
        }
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename=' . $filename);
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        header('Content-Length: ' . $f['size']);
        echo $f['data'];
        exit();
    } else {
        error("Error!!!! File " . $file . " not found.", true);
    }

    return null;
}
```

But it has a problem: the file is first downloaded in full from FTP to the PHP server, and only then sent to the user. For small files this is tolerable, but for large files the user has to wait minutes before the "Save file as ..." dialog even appears in the browser.
What can be done to speed up or optimize this process? Can the file be streamed to the user in parts, or is there some other approach ...? Pulling a 300 MB file onto the PHP server in full every time seems far too heavy.
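One option that might work if the download still has to flow through PHP: read the remote file through PHP's ftp:// stream wrapper and echo it to the client in small chunks, so memory use stays roughly constant instead of growing with the file size. A minimal sketch, assuming hypothetical $ftp_host / $ftp_user / $ftp_pass values and the same error() helper as in the code above:

```php
<?php
// Sketch only: stream a file from FTP to the client in chunks instead of
// buffering it in memory. Connection details and the function name are
// assumptions for illustration.
function stream_from_ftp($ftp_host, $ftp_user, $ftp_pass, $file, $filename)
{
    $url = 'ftp://' . rawurlencode($ftp_user) . ':' . rawurlencode($ftp_pass)
         . '@' . $ftp_host . '/' . ltrim($file, '/');

    $in = @fopen($url, 'rb');
    if ($in === false) {
        error("Error. File " . $file . " not found.", true);
    }

    // The wrapper can usually report the size, so Content-Length is still sent.
    $size = @filesize($url);

    // Make sure nothing is buffered by PHP before streaming starts.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $filename . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    if ($size !== false) {
        header('Content-Length: ' . $size);
    }

    // Send the file in 64 KB chunks; memory use stays constant for any size.
    while (!feof($in)) {
        echo fread($in, 65536);
        flush();
    }
    fclose($in);
    exit();
}
```

With this, the browser starts receiving data almost immediately instead of waiting for the whole transfer from FTP to finish first.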
Preferably nginx through X-Sendfile. Although for Apache there is also mod_xsendfile. – And
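If the FTP storage is (or can be made) visible on the web server's own filesystem, e.g. mounted locally, the commenter's X-Sendfile suggestion means PHP only authorizes the download and sets a header, while nginx (via X-Accel-Redirect) or Apache with mod_xsendfile streams the file itself. A minimal sketch under that assumption; the /var/ftp mount point and /protected/ location are made up for illustration:

```php
<?php
// Sketch of the X-Sendfile approach from the comment. Assumption: the FTP
// storage is also mounted on the web server (here at /var/ftp), so the web
// server can read the files directly and PHP never touches their contents.
$filename = basename($file); // $file resolved the same way as in save() above

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');

// nginx: hand the request off to an internal location (see config below).
header('X-Accel-Redirect: /protected/' . rawurlencode($filename));

// Apache + mod_xsendfile would instead use:
// header('X-Sendfile: /var/ftp/' . $filename);
exit();

/*
nginx configuration for the internal location (clients cannot request it
directly; only the X-Accel-Redirect header set above reaches it):

    location /protected/ {
        internal;
        alias /var/ftp/;   # assumed local mount of the FTP storage
    }
*/
```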