Suppose I need to read a CSV file containing 20,000 product names; the file weighs about 50 MB. How can I open such a file in PHP and execute about 20,000 SQL queries?

Is it just a matter of increasing the script execution time? If so, could you list all the relevant PHP settings: execution time, memory limit, and so on.

Thanks.

  • Do you need to import the CSV file into the database? In most cases you can do that directly in the database, both quickly and safely. - KoVadim
  • For me fopen won't open anything larger than 2.5 MB; I'm hesitant to raise the memory limit further, and the hosting's memory probably isn't unlimited either. On top of that there are more queries: before adding a new product I have to check whether it is already in the catalog; if it isn't, insert it, and if it is, update it with the information from the CSV. So for 20,000 rows I end up with at least 40,000 database queries, which worries me a lot. (A streaming-read sketch follows these comments.) - Alexander Lukin
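
To address the memory concern without raising memory_limit at all, here is a minimal sketch (my own illustration, not code from the question) that streams the CSV one row at a time with fgetcsv(), so the whole 50 MB file never has to sit in memory. The file name and column layout (sku, name, price) are assumptions:

<?php
// Stream the CSV row by row; only one line is ever held in memory.
// 'products.csv' and the column order (sku, name, price) are assumed for illustration.
$handle = fopen('products.csv', 'r');
if ($handle === false) {
    die('Cannot open the CSV file');
}
while (($row = fgetcsv($handle)) !== false) {
    [$sku, $name, $price] = $row;
    // ... per-row check/insert/update logic goes here ...
}
fclose($handle);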

2 Answers

If you run this script from the command line, then there is no limit on execution time.

To remove memory restrictions, use

ini_set('memory_limit', -1); 
  • And how do you run it from the command line? If it's not too much trouble, could you list the settings relevant to heavy processes: memory, execution time, and anything else there is. Thanks for the answer - Alexander Lukin
  • Like this: php /path/to/your/script.php. And there are no other settings besides memory and execution time that limit the resources consumed. (A short sketch of these settings follows.) - mantigatos
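
For reference, a short sketch pulling together the settings discussed in this answer; ignore_user_abort() is my addition and is only relevant when the script is started through a web server rather than the CLI:

<?php
// Settings typically loosened for a long-running import script.
// Run from the command line as: php /path/to/your/script.php
// (the CLI already uses max_execution_time = 0, i.e. unlimited).
set_time_limit(0);               // no execution-time limit (matters under a web server)
ini_set('memory_limit', -1);     // no memory limit
ignore_user_abort(true);         // optional: keep running if the client disconnects
// ... import logic ...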

Is the data in the table unique? What I mean is: could you use INSERT with an ON DUPLICATE KEY UPDATE clause?

I upload supplier price lists to an online store. The program provided by the developers did not suit me: it was slow and clumsy. So I parse the files (xls, csv, xml) locally with PHP, build statements of about 500 rows each from the filtered data, connect to the database and execute them. The speed suits me: 25 thousand records, including parsing a single file, are processed within about 30 seconds, with the server located 6,000 km away.
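
As an illustration of the two ideas together (batches of roughly 500 rows plus ON DUPLICATE KEY UPDATE), here is a sketch; it is not the answerer's actual code, and the table and column names (products: sku, name, price) as well as the assumption that sku carries a UNIQUE index are mine:

<?php
// Sketch: batch ~500 CSV rows into one multi-row INSERT and let
// ON DUPLICATE KEY UPDATE do "insert if new, update if it already exists".
// Assumes a MySQL table `products` with a UNIQUE index on `sku`.
$pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

function flushBatch(PDO $pdo, array $rows): void
{
    if ($rows === []) {
        return;
    }
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?, ?)'));
    $sql = "INSERT INTO products (sku, name, price)
            VALUES $placeholders
            ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price)";
    $params = [];
    foreach ($rows as $row) {
        array_push($params, $row[0], $row[1], $row[2]);
    }
    $pdo->prepare($sql)->execute($params);
}

$batchSize = 500;
$buffer = [];
$handle = fopen('products.csv', 'r');
while (($row = fgetcsv($handle)) !== false) {
    $buffer[] = $row;                 // [sku, name, price]
    if (count($buffer) >= $batchSize) {
        flushBatch($pdo, $buffer);
        $buffer = [];
    }
}
flushBatch($pdo, $buffer);            // flush any remaining rows
fclose($handle);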

  • Well, that cuts the number of queries in half: it was 40,000 and will be 20,000. But still, are there ways to work with large files and large numbers of queries, so that CSV files of 400, 500 MB and more, with 20, 30, 50, 100, 200 thousand product names, can be processed without restrictions and without consequences for the hosting? At least point me to algorithms to dig into. Thanks for the answer, any answer only makes me smarter :) - Alexander Lukin