Look in the web server's access logs; most likely you will see duplicate requests.
To work around the error, you can use a lock: when the script starts, create a lock file named after the client's IP address, run the import, and delete the file at the end of the script. If a lock file for that IP already exists, do nothing.
I.e., at any given moment only one import process per IP can run.
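A minimal sketch of that idea. Everything here is an assumption to adapt: the upload arriving in $_FILES['price'], and import_csv(), which stands in for your existing import logic. It uses flock() rather than a bare file_exists() check, since that avoids the race between checking for the file and creating it, and the lock is released automatically if the script dies mid-import:

    <?php
    // Lock file named after the client's IP (sanitized, just in case).
    $ip       = preg_replace('/[^0-9a-f.:]/i', '', $_SERVER['REMOTE_ADDR']);
    $lockFile = sys_get_temp_dir() . '/import_' . $ip . '.lock';

    // 'c' mode: create the file if missing, do not truncate it.
    $fp = fopen($lockFile, 'c');
    if (!$fp || !flock($fp, LOCK_EX | LOCK_NB)) {
        // An import from this IP is already running - do nothing.
        exit;
    }

    import_csv($_FILES['price']['tmp_name']); // hypothetical import function

    // Release the lock and clean up at the end of the script.
    flock($fp, LOCK_UN);
    fclose($fp);
    unlink($lockFile);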
Alternatively, you can modify the script: save md5(file_get_contents("old_csv_file")), compare it with md5(file_get_contents("new_csv_file")), and if they are equal, do nothing.
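A sketch of that comparison, assuming the previous upload's hash is kept in a file next to the script (the file name and import_csv() are again placeholders). md5_file() gives the same result as md5(file_get_contents(...)) without loading the whole file into memory:

    <?php
    $hashFile = __DIR__ . '/last_import.md5';
    $newHash  = md5_file($_FILES['price']['tmp_name']);

    // Same payload re-sent - skip the import entirely.
    if (file_exists($hashFile) && trim(file_get_contents($hashFile)) === $newHash) {
        exit;
    }

    import_csv($_FILES['price']['tmp_name']); // hypothetical import function
    file_put_contents($hashFile, $newHash);   // remember this upload for next time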
Then go to the 1C programmer and give him a talking-to if his side is sending duplicate requests for some reason.
And yes, check your own code just in case, to make sure you haven't accidentally left a duplicate database insert somewhere from your own experiments, of course.
P.S. In any case, adding a check against inserting duplicate data would be a good idea. It is better to do it not at the level of the PHP script but at the level of the database.
If your price list has, for example, an item number, put a unique key on it in the database, and in the case of duplicate data simply update the current values.
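A sketch of that database-level guard, assuming MySQL via PDO; the prices table and its columns are illustrative names, not anything from your schema. The UNIQUE key on the item number is what actually rejects duplicates, and ON DUPLICATE KEY UPDATE turns a repeated row into an update of the current values:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

    // One-time schema change:
    // ALTER TABLE prices ADD UNIQUE KEY uq_item (item_number);

    $stmt = $pdo->prepare(
        'INSERT INTO prices (item_number, name, price)
         VALUES (:num, :name, :price)
         ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price)'
    );

    foreach ($rows as $row) { // $rows = the parsed CSV lines
        $stmt->execute([
            ':num'   => $row['item_number'],
            ':name'  => $row['name'],
            ':price' => $row['price'],
        ]);
    }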
Best of all, implement both the md5 check and the update-on-duplicate in the database: 1C may suddenly start sending price lists that contain both duplicate and new rows, while the data as a whole differs from the previous price list.
Your task as a programmer is either to provide as many ways as possible to cope with bad data, or to refuse bad data outright so that the guys on the other side don't get sloppy. Which way to go ultimately depends on your business requirements.