Backend on Node.js.

In one case I need to handle a dropped HTTP connection myself (for example, when cunning Russian hackers saw through the twisted-pair cable run with a "Druzhba" chainsaw, laughing maniacally and shouting blasphemies).
I subscribe with req.on('close', () => { /* do something */ }), test locally, and everything works fine.
But in production Node sits behind nginx, and when the connection between the browser and nginx is broken, the connection between nginx and Node is not closed, so my handler never fires.
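
Schematically the server side looks about like this (a minimal sketch rather than the real code; port 2222 is the one nginx proxies to, and the delayed response is only there to leave time to break the connection):

 const http = require('http');

 const server = http.createServer((req, res) => {
   let closed = false;

   // 'close' fires when the connection is torn down (exact semantics vary a little between Node versions)
   req.on('close', () => {
     closed = true;
     console.log('connection closed');
     // ...cleanup goes here...
   });

   // a deliberately slow response, so there is time to cut the connection while it is still pending
   setTimeout(() => {
     if (!closed) {
       res.end('done');
     }
   }, 30 * 1000);
 });

 // nginx proxies /api/ and /socket.io/ to this port
 server.listen(2222);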

Is it possible to somehow override this behavior of nginx?

nginx.conf:

user www-data;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;

events {
    worker_connections 768;
    # multi_accept on;
}

http {
    ##
    # Basic Settings
    ##

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 204800;

    # set client body size to 2M #
    client_max_body_size 3200M;

    # server_tokens off;

    # server_names_hash_bucket_size 64;
    # server_name_in_redirect off;

    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # SSL Settings
    ##

    ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # Dropping SSLv3, ref: POODLE
    ssl_prefer_server_ciphers on;

    ##
    # Logging Settings
    ##

    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    ##
    # Gzip Settings
    ##

    gzip on;

    # gzip_vary on;
    # gzip_proxied any;
    # gzip_comp_level 6;
    # gzip_buffers 16 8k;
    # gzip_http_version 1.1;
    # gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

    ##
    # Virtual Host Configs
    ##

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}

sites-available/default:

server {
    listen 80 default_server;
    listen [::]:80 default_server;

    root /home/superuser/serverside/static/;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
        proxy_set_header Host $http_host;
        proxy_redirect off;
    }

    location /api/ {
        proxy_pass http://localhost:2222;
        proxy_http_version 1.1;
    }

    location /socket.io/ {
        proxy_pass http://localhost:2222;
        proxy_http_version 1.1;
    }
}

As an example, in the handler I write:

 req.on('close', (e) => {
     // the 'close' event passes no arguments, so e will be undefined
     console.log('close!');
     console.log('e:');
     console.log(e);
 });

In the console I see:
close!
e:
undefined
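
To narrow down where the event gets lost, one can also hang handlers on the response and on the underlying socket and compare what fires locally versus behind nginx (a diagnostic sketch, not a fix):

 // diagnostics only: log every close-related event to see which ones fire behind nginx
 req.on('close', () => console.log('req close'));
 res.on('close', () => console.log('res close'));
 req.socket.on('close', (hadError) => console.log('socket close, hadError =', hadError));
 req.socket.on('error', (err) => console.log('socket error:', err && err.code));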

  • 3
    Post the nginx config, the nginx version, and the close event handler code - nörbörnën
  • @norbornen fair questions, of course, thank you. Honestly, I don't understand the nginx config at all, so I didn't even look there. But now, of course, I'll post it. - muturgan
  • 1
    Just be careful with examples like this: if you ever want to catch a power outage this way, you will be out of luck :) But for the browser being closed it will work fine, yes. Perhaps the proxy_buffering off option will help on the nginx side, but that is not certain (a config sketch along those lines is below, after the comments) - andreymal
  • 2
    No matter what you do: from the Internet's point of view, a disconnect means sending a TCP RST or FIN packet. A power outage or a sawn-through twisted pair does not produce such a packet and does not break the connection: if the power comes back quickly (and the client is a laptop on battery that has not shut down), or the twisted pair is spliced back together quickly, the connection carries on as if nothing had happened and the file transfer continues! If no timeouts are set on either the client or the server, the transfer can be resumed at any time, even days later (a keepalive/timeout sketch on this point is below) - andreymal
  • 1
    As an example of how monstrously resilient connections can be, take a VPN. Suppose a client goes online via VPN and is also sending a file over that VPN, with a static IP address on the VPN side. Evil Russian hackers cut the twisted pair and the file transfer stalls. The clever client gives up on the wired Internet, plugs in a Yota USB modem, reconnects to the VPN server, the route to the client's IP address is updated - and the file transfer continues over mobile Internet without a single disconnect :) Advanced routers (I personally used a Keenetic) do all of the above automatically. - andreymal
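
Following up on andreymal's proxy_buffering hint, here is a sketch of what the proxied location could look like. proxy_buffering and proxy_ignore_client_abort are standard nginx directives, but whether they actually make the close event reach Node in this setup is an assumption, not a verified fix:

 location /api/ {
     proxy_pass http://localhost:2222;
     proxy_http_version 1.1;

     # pass the upstream response through unbuffered, so nginx notices a gone client sooner
     proxy_buffering off;

     # "off" is already the default: nginx should then close the upstream connection when the client aborts
     proxy_ignore_client_abort off;
 }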

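On the point that a physical break produces no RST or FIN: the usual remedy is timeouts or TCP keepalive, so that a silently dead peer is eventually detected. A minimal Node-side sketch (the numbers are arbitrary; note that behind nginx the peer of the Node socket is nginx itself, so for the browser leg the equivalent timeouts would live on the nginx side):

 const http = require('http');

 const server = http.createServer((req, res) => {
   // ...request handling as above...
   res.end('ok');
 });

 server.on('connection', (socket) => {
   // probe idle connections with TCP keepalive; how quickly a dead peer is declared depends on OS settings
   socket.setKeepAlive(true, 30 * 1000);

   // drop connections that have been completely silent for too long
   socket.setTimeout(2 * 60 * 1000, () => socket.destroy());
 });

 server.listen(2222);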