Hello! I have a script that processes a large CSV file and, for each row, checks whether a matching record exists in the database: if there is no record, it inserts one; if there is, it updates it. Simple enough.
Table structure:
id | number | brand | price
1  | 23     | 1     | 22.3
2  | 24     | 2     | 12.3

The essence of the question: right now the script queries the database inside the loop to check for existence, then decides whether to insert or update. That adds up to a huge number of queries, and the script is very slow. ON DUPLICATE KEY UPDATE only works off the primary key field id, and in the loop I have all the fields except id.
Is there any way to improve performance for a task like this, or is the only option two queries per row in the loop? Of course, I could collect everything into two arrays and then do a single INSERT and a single UPDATE, but the data set is about 10M rows...
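For reference, MySQL's INSERT ... ON DUPLICATE KEY UPDATE fires on any UNIQUE index, not only the primary key, so a unique index over (part_number, brand_id) would collapse the check-then-write pair into one statement. A minimal sketch assuming mysqli, the column names from the UPD example below, and a hypothetical index name:

```php
<?php
// A sketch, not the asker's code: a composite UNIQUE index lets one
// statement do the insert-or-update. The index name uniq_part_brand
// and the connection details are assumptions.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// One-time schema change:
$db->query('ALTER TABLE parts ADD UNIQUE KEY uniq_part_brand (part_number, brand_id)');

// Prepare the upsert once; reuse it for every CSV row.
$stmt = $db->prepare(
    'INSERT INTO parts (part_number, brand_id, price) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE price = VALUES(price)'
);
$stmt->bind_param('iid', $number, $brand, $price); // bound by reference

foreach ($items as $item) {          // $items: rows parsed from the CSV
    $number = $item['part_number'];
    $brand  = $item['brand_id'];
    $price  = $item['price'];
    $stmt->execute();                // inserts a new row or updates the price
}
```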
UPD (example of the query in the loop):
```php
// Look up an existing row by (part_number, brand_id).
$row = $this->db->query(
    'SELECT id FROM parts WHERE `part_number` = "'.$item['part_number'].'" AND `brand_id` = "'.$item['brand_id'].'" LIMIT 1'
)->row_array();

if (!empty($row['id'])) {
    // Found: update the existing row.
    $this->db->update('parts', $item, array('id' => $row['id']));
} else {
    // Not found: insert a new row.
    $this->db->insert('parts', $item);
}
```
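Building on the upsert idea, the number of round-trips can be cut further by batching rows into multi-row statements. A minimal sketch against the same CodeIgniter $this->db, again assuming a UNIQUE index on (part_number, brand_id) and an arbitrary batch size of 1000:

```php
// Sketch: send 1000 rows per statement instead of 2 queries per row.
foreach (array_chunk($items, 1000) as $chunk) {
    $values = array();
    foreach ($chunk as $item) {
        // escape() quotes and escapes each value for the SQL literal.
        $values[] = '(' . $this->db->escape($item['part_number']) . ','
                        . $this->db->escape($item['brand_id']) . ','
                        . $this->db->escape($item['price']) . ')';
    }
    $this->db->query(
        'INSERT INTO parts (part_number, brand_id, price) VALUES '
        . implode(',', $values)
        . ' ON DUPLICATE KEY UPDATE price = VALUES(price)'
    );
}
```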
Comments:

id is the primary field; number and brand are indexed. I.e. the same number can repeat across different brands. - ka5itoshka

LOAD DATA (MySQL). - Mike

Do `$stmt = $db->prepare('SELECT id FROM parts WHERE part_number = ? AND brand_id = ?');` once, and in the loop `$stmt->bind_param('ii', $item['part_number'], $item['brand_id']); $result = $stmt->get_result(); $result->fetch...`. Better still, if the needed parameters sit in ordinary variables rather than array elements, bind_param can also be done just once before the loop, leaving only get_result inside. - Mike
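For completeness, here is a runnable version of the prepared-statement pattern from Mike's comments (mysqli). bind_param binds variables by reference, so plain variables can indeed be bound once before the loop; each iteration still needs an execute() call, which the comment's fragment omits:

```php
<?php
// Sketch of Mike's suggestion: prepare and bind once, execute per row.
// $db is assumed to be an open mysqli connection; get_result() needs mysqlnd.
$stmt = $db->prepare('SELECT id FROM parts WHERE part_number = ? AND brand_id = ?');

$number = 0;
$brand  = 0;
$stmt->bind_param('ii', $number, $brand); // bound by reference, once

foreach ($items as $item) {
    $number = $item['part_number'];  // refresh the bound variables
    $brand  = $item['brand_id'];
    $stmt->execute();                // must run before get_result()
    $row = $stmt->get_result()->fetch_assoc();
    $part_id = $row ? $row['id'] : null;
    // ...then insert or update depending on $part_id, as in the UPD code.
}
```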