Given:

1) weak server

2) 10,000 computers

3) material on each computer (for example, 2-3 hours of video)

To find:

A reliable way to transfer the material from all of the machines to the server.

There is no time limit.

How best to implement this?

My experience suggests that HTTP will not work here: with a transfer this large, everything simply falls over.

I think the best option is to use plain TCP. In any case, I would be glad to hear the opinion of more experienced colleagues. =)
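
To make the idea concrete, here is a minimal sketch of the plain-TCP approach I have in mind; the host, port, chunk size and output file naming are placeholders, not a finished design:

```python
# Minimal raw-TCP sketch: the server accepts one upload at a time and appends
# the incoming byte stream to a per-client file; each machine runs send_file()
# to push its material. Host, port and file naming are placeholders.
import socket

HOST, PORT = "0.0.0.0", 9000          # placeholder endpoint
CHUNK = 64 * 1024                     # read/send in 64 KiB pieces

def run_server():
    srv = socket.create_server((HOST, PORT))
    while True:
        conn, addr = srv.accept()     # one client at a time, no parallel load
        with conn, open(f"upload_{addr[0]}.bin", "ab") as out:
            while data := conn.recv(CHUNK):
                out.write(data)       # an empty recv() means the client is done

def send_file(server_ip, path):
    """Run on each of the 10,000 machines to stream one file to the server."""
    with socket.create_connection((server_ip, PORT)) as sock, open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            sock.sendall(chunk)

if __name__ == "__main__":
    run_server()
```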

  • 6
    You have set mutually exclusive requirements: “high load” and at the same time “no time limit”. Since there is no time limit, you can connect to the machines one at a time and download from each at even 2400 bits per second over a modem (a sketch of this sequential pull is below the comments). - gbg
  • 2
    2-3 hours of video is at least a gigabyte (OK, suppose the video is of very poor quality and it is 250 MB). The total amount of data is then 10 terabytes (or 2.5 terabytes). On a 1-gigabit network that is about 100 MB per second, i.e. roughly 28 hours (about 7 for the poor-quality case), and the disk still has to keep up... Did you do the math at all? (the arithmetic is checked in a short snippet below the comments) - KoVadim
  • 2
    It is strange that TCP suits you but HTTP does not (HTTP runs over that same TCP). And how does FTP look to you? - Vladimir Gamalyan
  • 3
    Use the torrent-tracker model: let the videos be downloaded from clients that have already fetched them. The bottleneck will be the network route. - Dmitry Chistik
  • 2
    > It is strange that TCP suits you but HTTP does not >>> With TCP you can work with everyone at once, a little at a time, whereas an HTTP connection does not let go until everything has been downloaded. - hitcode
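
For what it is worth, KoVadim's estimate checks out; a quick back-of-the-envelope calculation under the same assumptions (1 GB or 250 MB per machine, roughly 100 MB/s of effective throughput on a 1-gigabit link):

```python
# Back-of-the-envelope check of the volume and transfer-time estimate above.
machines = 10_000
link_mb_per_s = 100                   # ~1 Gbit/s of effective throughput

for size_gb in (1.0, 0.25):           # "at least a gigabyte" vs. a 250 MB file
    total_tb = machines * size_gb / 1000
    hours = machines * size_gb * 1000 / link_mb_per_s / 3600
    print(f"{size_gb:.2f} GB each -> {total_tb:.1f} TB total, ~{hours:.0f} h at {link_mb_per_s} MB/s")
```

Output: roughly 10 TB and ~28 hours for 1 GB per machine, or 2.5 TB and ~7 hours for 250 MB, before disk and protocol overhead.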

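And here is a minimal sketch of the sequential pull that gbg describes: the weak server fetches from one machine at a time over plain HTTP, resuming an interrupted download with a Range request. The machine list, URL layout and output file names are assumptions; each machine is presumed to expose its file through some small HTTP file server:

```python
# Sequential pull sketch: fetch from one machine at a time, resume with Range.
# Machine addresses, the /video.mp4 URL and output names are assumptions.
import os
import urllib.request

MACHINES = [f"192.168.0.{i}" for i in range(1, 11)]   # assumed address list
CHUNK = 1024 * 1024                                    # 1 MiB per read

def pull(host, out_path):
    offset = os.path.getsize(out_path) if os.path.exists(out_path) else 0
    req = urllib.request.Request(f"http://{host}/video.mp4",
                                 headers={"Range": f"bytes={offset}-"})
    with urllib.request.urlopen(req) as resp:
        # 206 means the Range was honoured; otherwise start the local file
        # over instead of appending, because the server ignored the header.
        mode = "ab" if resp.status == 206 else "wb"
        with open(out_path, mode) as out:
            while chunk := resp.read(CHUNK):
                out.write(chunk)

for host in MACHINES:                  # strictly one machine at a time
    try:
        pull(host, f"{host}.mp4")
    except OSError as err:             # note the failure and retry on a later pass
        print(f"{host}: {err}")
```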