I want some pages on the site to be served by koa.js under Node.js, and others by Django under Python. For example:

GET /nodejs must be answered by Node.js, and GET /python must be answered by Python.

Can this be done and how?

  • Would a third server, an nginx proxy in front, suit you? - andreymal
  • A port, by its nature, still has a single owner; you cannot attach a whole pack of listeners to it. But a proxy can, although your scheme smacks of exquisite perversion. - user207618

2 answers

You cannot hang two servers on one port. You can, however, set up a reverse proxy that forwards requests to Node.js and Python; Node.js and Python then each listen on their own port, to which the proxy forwards. I would make the main server in Node.js and have it forward to Python. If the system is *nix, you can even do it over Unix sockets, which is faster and puts less load on the CPU.

  • Thanks, exactly what I need. Could you say more about Unix sockets? How do I work with them? - pank
  • There is documentation in Node: nodejs.org/api/net.html#net_server_listen_path_backlog_callback. I'm not particularly familiar with Python, but a quick search says it's supported there too. In short, it works almost the same as a network connection, except that instead of a port you specify a special virtual file in the filesystem that works as a buffer, so latency is very low and the CPU isn't loaded with TCP/IP packet processing. - pnp2000
  • Node is multi-threaded, while Python here is most likely assumed to be single-threaded. Are you sure it's worth proxying through Node like that? - Igor
  • Node is single-threaded - pnp2000

In production, "application servers" (the thing that directly hosts the application together with its interpreter) are usually not exposed directly. They are usually placed behind a "reverse proxy".

Why?

  • Because, in practice, a considerable share of requests to an application are simply requests for files, with no additional logic or checks. It's fine if the application server is largely written in, say, C and can handle such requests without even touching the interpreter. But usually, to provide a good, configurable interface, such a server is tightly coupled to the interpreter and runs much slower than something purpose-built in C or something comparable. This, however, depends on the implementation of the specific server.

    Some application servers are simply not designed to be exposed without protection. Not because they are easy to hack, but because a malicious client can bring them down. I described such scenarios in one of my earlier answers.

  • Also so that you can start several processes of the same application and spread requests between them. This is called load balancing. When the load grows large, these processes can even be spread across different machines.

  • Also for the sake of caching frequent requests that get identical responses. Though configuring that rarely goes smoothly.

  • Also for the sake of protection. The web servers typically used in the balancer role are themselves extremely resistant to load, and can be hardened further by configuration or by adding extra modules that inspect requests and (possibly) ban offenders.

In this role I mostly encounter:

  • nginx (the popular option; can do a lot, with many additional modules)
  • Varnish (positioned more as a caching proxy; not very popular, but fast)
  • HAProxy (mainly a load balancer, used in many large projects)

... and the typical case: the proxy listens on the required port (80?), and on receiving a request it does one of the following:

  • ... serves it itself (from the cache, from the file system, or according to rules in the config)
  • ... forwards the request to another HTTP server (again, according to the rules in the config file: it talks to the other web server over a Unix socket or a network connection)...
    • To the application server the request usually looks as if it came from 127.0.0.1 (from the connection's point of view); this is usually compensated for by adding an X-Forwarded-For header with the real client address to the forwarded request.
    • ... then receives the response from the other server and relays it back to the client.
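For the original question, this typical case can be sketched as an nginx config. Everything concrete here is an assumption: the static root, the Koa port, and the Django socket path are placeholders, not values from the question.

```nginx
server {
    listen 80;

    # Serve static files directly, without touching either backend.
    location /static/ {
        root /var/www/example;
    }

    # Requests under /nodejs go to the Koa app on a local TCP port.
    location /nodejs {
        proxy_pass http://127.0.0.1:3001;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    # Requests under /python go to Django over a Unix socket.
    location /python {
        proxy_pass http://unix:/tmp/django.sock;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

With something like this, both applications sit behind one port 80, each listening on its own port or socket, exactly as described above.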