I don't know how to set up an experiment that would reliably reproduce this situation, so I am asking:
CREATE TABLE `test` (
    `id` INT UNSIGNED NOT NULL AUTO_INCREMENT,
    `value` INT UNSIGNED NULL DEFAULT '0',
    PRIMARY KEY (`id`)
) COLLATE='utf8_unicode_ci' ENGINE=InnoDB;

I insert a large amount of data in a single query, for example 10,000 rows:
START TRANSACTION;
INSERT INTO test (`value`) VALUES (1),(2),(3), ... , (10000);
COMMIT;

If a similar transaction runs at the same time on another connection, will the auto-increment column receive its values in order, without gaps (1 ... 10000), or can they be interleaved with values from the other transaction?
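To help whoever answers: as far as I understand, this behavior is governed by a couple of server settings, and I can at least inspect them. A minimal check, assuming a MySQL server where these standard variables are visible to my user:

-- How InnoDB hands out auto-increment values:
--   0 = "traditional": table-level AUTO-INC lock held per statement
--   1 = "consecutive": a multi-row INSERT reserves one contiguous block
--   2 = "interleaved": concurrent statements may interleave values
SHOW VARIABLES LIKE 'innodb_autoinc_lock_mode';

-- Step between generated values; it must be 1 for the IDs of a single
-- multi-row INSERT to be strictly consecutive (first, first+1, ...).
SHOW VARIABLES LIKE 'auto_increment_increment';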
Is it possible to lock the table for the duration of the query?
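If explicit locking is an option, I imagine something like this (a sketch only; whether shared hosting grants me the LOCK TABLES privilege is an assumption on my part):

-- Take an exclusive lock so no other connection can write to (or read
-- from) the table while the bulk INSERT runs; note that LOCK TABLES
-- implicitly commits any open transaction.
LOCK TABLES test WRITE;
INSERT INTO test (`value`) VALUES (1),(2),(3);
UNLOCK TABLES;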
What I am doing: I occasionally need to insert large amounts of data with many-to-many links, but the script will run on shared hosting with limits on the number of database queries, so issuing 10,000 INSERTs + 10,000 LAST_INSERT_ID() calls + ~30,000 INSERTs into the link table is not an option. So I am trying to work out how to reduce the number of queries to the database; a sketch of the batching I have in mind follows.
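The sketch below assumes innodb_autoinc_lock_mode=1 and auto_increment_increment=1, so that one multi-row INSERT gets consecutive IDs; the link table test_tag and its columns are made up for this example:

START TRANSACTION;

-- One multi-row INSERT instead of thousands of single-row ones.
INSERT INTO test (`value`) VALUES (1),(2),(3);

-- LAST_INSERT_ID() returns the FIRST id generated by a multi-row
-- INSERT, so the rows above received @first, @first+1, @first+2.
SET @first = LAST_INSERT_ID();

-- Fill the hypothetical many-to-many link table by computing the ids
-- from @first, instead of one LAST_INSERT_ID() round trip per row.
INSERT INTO test_tag (`test_id`, `tag_id`) VALUES
    (@first,     10),
    (@first + 1, 10),
    (@first + 2, 20);

COMMIT;

Would this be safe against a concurrent transaction, or do the IDs need to be locked in some other way?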