If I understood the point correctly, the problem is that the location changes too frequently. When the page address changes several times within a short period, each change overrides the previous one, so only the last one takes effect.
In my own case, I implemented downloading three files in a loop. When the loop ran, only the last file was downloaded: the browser did not even have time to send a request to one address before it was replaced by the next.
The simplest workaround worked well: put the file requests (in your case) inside timeouts with increasing delays. In my case, 500 ms was enough:
```javascript
var counter = 1;
$.get("", { name: name, img: img, price: price, url: url })
  .done(function (data) {
    // Stagger each navigation so the previous request has time to start
    setTimeout(function () {
      window.location.assign(
        "/slave/instagram/parse/?name=" + name +
        "&img=" + img +
        "&price=" + price +
        "&url=" + url +
        "&category_id=" + category_id
      );
      console.log("Next");
    }, 500 * counter++);
  });
```
This assumes, of course, that the file is downloaded from the given address while the page itself is not reloaded and keeps running.
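For the multi-file case, the same idea can be sketched as a small helper. This is only an illustration of the growing-delay pattern, assuming an array of file urls; the names `downloadDelayMs`, `scheduleDownloads`, and `stepMs` are hypothetical, not from the original answer:

```javascript
// Delay for the i-th download (0-based), mirroring the 500 * counter++
// pattern above: 500 ms, 1000 ms, 1500 ms, ...
function downloadDelayMs(index, stepMs) {
  return stepMs * (index + 1);
}

// Schedule one location change per file url, each with a growing delay,
// so every request has time to start before the next one replaces it.
// `navigate` is injectable for testing; in the browser it would wrap
// window.location.assign.
function scheduleDownloads(fileUrls, navigate, stepMs) {
  fileUrls.forEach(function (url, i) {
    setTimeout(function () {
      navigate(url);
    }, downloadDelayMs(i, stepMs || 500));
  });
}
```

In the browser you would call something like `scheduleDownloads(urls, function (u) { window.location.assign(u); })`; the key point is only the increasing delay, not the exact shape of the helper.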
If that does not help, add more code to the question; the loop in which all this happens is of particular interest.
After calling window.location.assign, the page should navigate to the specified url, i.e. once it reloads, the script loses its effect, and when it runs again it will once more go to the first url. And so on in a circle. Of course, I could be wrong. - Amandi

If the window.location.assign calls happen every few seconds, then you really do need to look for the problem elsewhere. If they happen almost instantly one after another, only the last one works. As a test, run a loop up to 10 in the browser console and execute location='blabla#i' - each subsequent call interrupts the previous one, so in the end only the last one takes effect. - Ivan Pshenitsyn