I have a query that lists about 500 IDs in an IN clause:

UPDATE `table` SET `age` = '0-18' WHERE `id` IN (1, 2, 3, 4, 5, ..., 500);

The query runs quickly, but it raised a question: is there a limit on the number of IDs that can be listed in IN?

  • "I started parsing .csv files, and they have 100k lines..." — if all of this is done on a single host, consider creating a temporary table ( ENGINE=Memory ), importing the file into it (via LOAD DATA LOCAL INFILE ), and using that table in the UPDATE query — perhaps even with a suitable index for the query. – Akina
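Akina's suggestion might look roughly like this — a sketch, not a drop-in solution, assuming a hypothetical one-column file ids.csv and the tabl / id names that appear in the answers below:

```sql
-- Sketch of the temporary-table approach (all names are illustrative).
CREATE TEMPORARY TABLE ttbl (id INT PRIMARY KEY) ENGINE=Memory;

-- Requires local_infile to be enabled on both the client and the server.
LOAD DATA LOCAL INFILE 'ids.csv'
INTO TABLE ttbl
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id);

-- One UPDATE against the loaded ID set instead of a huge IN list.
UPDATE tabl SET age = '0-18' WHERE id IN (SELECT id FROM ttbl);
```

Making id the primary key gives the subquery an index to probe, which matters once the file grows to 100k lines.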

2 Answers

It will work until the statement exceeds max_allowed_packet.
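If you hit that limit, you can inspect it and raise it server-wide; a sketch (the 64 MB value is just an example — SET GLOBAL needs an admin privilege and affects new connections only):

```sql
-- Inspect the current packet-size limit.
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for new connections (requires SUPER / SYSTEM_VARIABLES_ADMIN).
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;  -- 64 MB
```

Note that the limit applies to the whole packet, i.e. the full text of the UPDATE statement, not just the IN list.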

    Those 500 IDs obviously don't come out of thin air; most likely they can be derived from some condition, in which case the query becomes naturally concise and fast:

     update tabl SET age = '0-18' WHERE id IN (SELECT id FROM ... WHERE ...)

    But if you really do have to list them by hand (well, it happens sometimes), then create a temporary table ttbl with a single id column (make it the primary key for speed), insert or load the IDs into it, and then run

     update tabl SET age = '0-18' WHERE id IN (SELECT id FROM ttbl)

    (note IN rather than =, since the subquery returns more than one row)
    • 500 is not the limit — I gave 500 as an example; the actual count varies from 900 to 1,500. I started parsing .csv files, and they have 100k lines... – user190134
    • Well, if these queries come from the parsing program, then you should send them in batches of a few hundred: say, once you've accumulated 200 IDs, send one query. That's easier on the server too. – Eugene Bartosh
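The batching idea above can be sketched as follows — an illustration only, reusing the hypothetical tabl table and id column from the answer, with each statement kept well below max_allowed_packet:

```sql
-- Instead of one statement carrying 100k IDs, issue several UPDATEs,
-- each with a bounded IN list (here: 200 IDs per batch).
UPDATE tabl SET age = '0-18' WHERE id IN (1, 2, 3 /* ... through the 200th ID */);
UPDATE tabl SET age = '0-18' WHERE id IN (201, 202 /* ... through the 400th ID */);
-- ...and so on, one statement per accumulated chunk of parsed IDs.
```

The parser accumulates IDs as it reads the CSV and flushes a statement every time the buffer reaches the chosen batch size.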