At the moment we have a problem ...

A Node.js script starts a server and waits for clients; for each client request it parses several sites at once (4 or more) and returns the sorted results to the client ...

Under a load of 50 requests per second, 5-10 seconds after starting, all outgoing connections to the sites that need to be parsed start to hang. The connections are killed on timeout (we get Error: socket hang up), but that doesn't help: new connections hang as well ...

Tried running it on different servers in different data centers; the problem is in the script itself ...

FreeBSD 9.2, Node.js 0.10.22

Test script - https://www.alphabid.com/parser2.js
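
To make the setup concrete (the actual parser2.js is behind the link above and is not reproduced here), below is a minimal sketch of the pattern the question describes: an HTTP server that, for each client request, fetches several upstream sites in parallel with a per-request timeout and returns the combined result. The upstream URLs, the 5-second timeout and the use of the async module are illustrative assumptions only.

    // Sketch only: one client request fans out to several upstream sites
    // in parallel; each upstream fetch has its own timeout and the combined
    // result is returned to the client. Node 0.10-era callback style.
    var http = require('http');
    var async = require('async');

    // Hypothetical upstream list; the real sites are not named in the question.
    var upstreams = [
      'http://example.com/feed1',
      'http://example.com/feed2',
      'http://example.com/feed3',
      'http://example.com/feed4'
    ];

    function fetch(url, cb) {
      var finished = false;
      function done(err, body) {          // guard against calling cb twice
        if (finished) return;
        finished = true;
        cb(err, body);
      }
      var req = http.get(url, function (res) {
        var chunks = [];
        res.on('data', function (chunk) { chunks.push(chunk); });
        res.on('end', function () { done(null, Buffer.concat(chunks).toString()); });
      });
      // Abort upstream requests that hang so their sockets are released.
      req.setTimeout(5000, function () { done(null, 'TIMEOUT ' + url); req.abort(); });
      req.on('error', function (err) { done(null, 'ERROR ' + url + ': ' + err.message); });
    }

    // Note: on Node 0.10 http.globalAgent.maxSockets defaults to 5 per host,
    // so heavy fan-out queues outgoing requests behind it; left at the default here.
    http.createServer(function (clientReq, clientRes) {
      async.map(upstreams, fetch, function (err, results) {
        // Sorting/merging of the results is application-specific and omitted.
        clientRes.writeHead(200, { 'Content-Type': 'text/plain' });
        clientRes.end(results.join('\n'));
      });
    }).listen(8080);

At 50 client requests per second this pattern opens on the order of 200 outgoing requests per second (50 x 4 upstreams), which is the load profile described above.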

  • I hope you know that async != fork? In the sense that var _async = require('async'); function a(cb) { var c = 0; for (var i = 0; i < 100000000; i++) { c = i * 2; } console.log("A end"); cb(null, c); } function b(cb) { console.log("B end"); cb(null, "BBB"); } _async.parallel([a, b], function (err, results) { console.log(err, results); }); will output: A end, B end, null [199999998, 'BBB'] (a cleaned-up runnable version is given after these comments). - zb'
  • But to me everything looks fine; most likely it is the data providers that are slowing you down. Take a "server" of your own that simply serves data without any restrictions (static content will do) and try to load it; most likely there will be no such slowdowns. - zb'
  • So I checked one of your suppliers with a simple script: time curl megayachts.ru/news > /dev/null; for ((a = 0; a < 500; a++)); do echo $(echo -n "$a"; curl -v megayachts.ru/news &> $a.log && echo OK) & done and then $ grep 'left intact' * | wc -l gives 413, meaning 413 of the 500 connections completed ("left intact") and 87 were dropped. So the tip is: cache the data, don't hit the upstream for every sneeze; we don't live in the XXIV century. - zb'
  • megayachts.ru is a random site; here is an example of the real XML: xml.intecppc.com/… The point is that it cannot be cached - Bender2009
  • you need to always get fresh data ... - Bender2009
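
For readability, here is zb's async.parallel demo from the first comment as a runnable script (the same code with whitespace and line breaks restored). It illustrates that async != fork: the CPU-bound task a blocks the single event loop, so b only runs after a has finished.

    // async.parallel does not run synchronous work in parallel:
    // a() blocks the event loop with its counting loop, then b() runs.
    var _async = require('async');

    function a(cb) {
      var c = 0;
      for (var i = 0; i < 100000000; i++) {
        c = i * 2;
      }
      console.log('A end');
      cb(null, c);
    }

    function b(cb) {
      console.log('B end');
      cb(null, 'BBB');
    }

    _async.parallel([a, b], function (err, results) {
      // Prints: null [ 199999998, 'BBB' ] (after "A end" and "B end")
      console.log(err, results);
    });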
