There is a script that receives a POST request from time to time; the request is processed and some data is written to the database. Recently the following happened: the script was run 3 times (judging by the logs, once every 8 seconds) with the same POST data, and as a result there are 3 identical records in the database. What could this be? My guesses: 1. there really were 3 requests to this script; 2. some error caused the script to be re-run with the same POST parameters.
- First option. Identical database entries are a data-architecture problem; you can always add unique keys. - ArchDemon 2:47 pm
- Clear $_POST immediately after processing and use a Location header (or some other means) to redirect. Don't leave the form's action attribute empty; point it at a separate script that processes $_POST, run all the queries in that separate script, and then redirect back to the form or to a success page (see the sketch after these comments). - Chinese IzKitai
- @Chinese IzKitai This script is tied to another service (an API); our script only receives this $_POST. I added a duplicate check, but duplicates shouldn't be possible in the first place ( - Vasyl Danilyuk
- @Vasyl Danilyuk Then the problem has to be solved on the sending side, if the $_POST data really is 100% identical. - Chinese IzKitai
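A minimal sketch of the Post/Redirect/Get approach suggested in the comments. The form page name, table, columns, DSN, and credentials are all hypothetical placeholders, not from the question:

```php
<?php
// process_form.php - a separate handler script for the form's POST data
// (Post/Redirect/Get pattern). All names below are placeholders.

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // Do all the database work here, in the separate handler script.
    $stmt = $pdo->prepare('INSERT INTO orders (customer, amount) VALUES (?, ?)');
    $stmt->execute([$_POST['customer'] ?? '', $_POST['amount'] ?? 0]);

    // Redirect back to the form (or to a success page) so that refreshing
    // the browser does not resubmit the same POST data.
    header('Location: /form.php?status=ok');
    exit;
}

// If the script is opened with GET, just send the user back to the form.
header('Location: /form.php');
exit;
```

This only helps against browser resubmission, though; if the duplicates come from the API side, the receiving script still needs its own deduplication.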
1 answer
If the data is submitted from a form, it could simply be a triple press of the submit button: POST data can be resent any number of times. If the three identical database records are what worries you, use columns that must not contain duplicates: add a UNIQUE constraint in the database and catch the resulting exception, or check whether the data already exists before writing and insert only if it does not. For your second option, when the POST handler starts, write the start timestamp to a file and check it on each run; if the previous start was less than 10 seconds ago, do nothing (see the sketch below).
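A minimal sketch combining both ideas, the timestamp-file guard and an INSERT protected by a UNIQUE key. The payments table, external_id column, lock-file path, and connection details are assumptions for illustration:

```php
<?php
// 1) Timestamp guard: refuse to process if the handler already ran
//    less than 10 seconds ago (the duplicates in the logs were 8 s apart).
$lockFile = __DIR__ . '/last_run.txt';
$lastRun  = is_file($lockFile) ? (int) file_get_contents($lockFile) : 0;
if (time() - $lastRun < 10) {
    http_response_code(429);
    exit('Duplicate request ignored');
}
file_put_contents($lockFile, (string) time());

// 2) UNIQUE constraint: let the database reject true duplicates.
//    Assumes something like:
//    ALTER TABLE payments ADD UNIQUE KEY uniq_external_id (external_id);
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

try {
    $stmt = $pdo->prepare(
        'INSERT INTO payments (external_id, amount) VALUES (?, ?)'
    );
    $stmt->execute([$_POST['external_id'] ?? '', $_POST['amount'] ?? 0]);
} catch (PDOException $e) {
    // SQLSTATE 23000 = integrity constraint violation (duplicate key).
    if ((string) $e->getCode() === '23000') {
        exit('Record already exists, nothing written');
    }
    throw $e;
}
```

The UNIQUE key is the reliable part; the timestamp file is only a cheap extra safeguard and would also swallow legitimate requests that arrive within the 10-second window.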