Hello. I have a Cubieboard 1 mini-computer, and the goal is to stream video from a webcam connected to it via USB.

The options I considered:

  1. Install Debian, install ffmpeg, and stream from there.
  2. Install Android and use any streaming application (I used IP Webcam).

I thought that the first option would be better, because:

  1. An OS without a graphical shell consumes fewer resources.
  2. There are no extra applications eating resources on top of that.

But it turned out the other way around: the Android stream worked almost perfectly (about 0.2 s of delay) at a good resolution.

On Debian, even at 320x240, there were terrible FPS drops, a delay of about 2 s, and the CPU was 100% loaded. At 640x480 it did not work at all.

Maybe I'm misunderstanding something, but how is this possible?

  • What kind of Debian have you used? - alexis031182
  • @alexis031182, here's the OS info: Linux Cubian 3.4.79-sun4i #28 PREEMPT Fri Oct 10 03:17:31 CST 2014 armv7l GNU/Linux | No LSB modules are available. | Distributor ID: Debian | Description: Debian GNU/Linux 8.0 (jessie) | Release: 8.0 | Codename: jessie - Mr_Epic
  • Have you tried installing Debian as described in the link below or just Cubian? wiki.debian.org/InstallingDebianOn/Allwinner - alexis031182
  • Only Cubian, since it is specifically adapted for the Cubieboard - Mr_Epic
  • If you are confident that the system is set up properly and works correctly, then the obvious next step is to test ffmpeg itself. Try broadcasting not from the camera but from some video file. Check which codec the webcam uses, and try using it (or a similar one) with ffmpeg. It is quite possible that the codec ffmpeg uses to encode the video is what "eats" all the available resources. - alexis031182

3 Answers

A hardware codec is the answer to your question. Take a camera with hardware MJPEG and use mjpg-streamer. A router with a 200 MHz processor streamed from a Logitech camera while staying at only 7-10% load. Meanwhile, ARM chips have their own hardware codec, which is what Android uses.

Debian cannot use these capabilities for several reasons: license purity, and the lack of a developer to write such a module and package it (for Ubuntu as well, since Ubuntu takes packages from the Debian repository).

    Delays, also known as lags, can occur for many reasons. In your case they are:

    • a transcoding problem. The device is weak and struggles to cope in real time.
    • ffmpeg's own buffering (it falls at least one frame behind: the picture goes into a buffer first, then gets sent).
    • the network itself. No comment needed here.
    • the player. Players also like to buffer, because network data rarely arrives in an even flow, and each small pause adds delay. Then you either have to speed up playback or accept the delay (which accumulates). A small buffer solves the problem.
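The player-side buffering above can often be reduced directly. A sketch with VLC's command-line option (the URL and port are assumptions for illustration; `--network-caching` is VLC's real network-cache option, in milliseconds):

```shell
# Play a UDP stream with the network cache reduced from the default
# (~1000 ms) to zero: expect more stutter, but noticeably less delay.
# udp://@:8080 tells VLC to listen on local UDP port 8080 (assumed port).
vlc --network-caching=0 udp://@:8080
```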

    Now let's turn to the commands.

    ffmpeg -fflags nobuffer -f v4l2 -r 25 -s 640x480 -i /dev/video0 -c:v copy -f sdl - 

    A little clarification is needed here. -f v4l2 -r 25 -s 640x480 -i /dev/video0 - this part is simple: it selects the input format, frame rate, resolution, and source. Pick these according to personal preference and the camera's capabilities.

    -fflags nobuffer - tells ffmpeg not to buffer (as an input option it should come before -i). It may help reduce the delay a little.

    -c:v copy is the important part. It says: "don't try to transcode the video, just copy it."

    -f sdl - - this will not work without a GUI. This option means "output to the screen". It is good for tests, but not for real use. Since you want minimal delay, you can try UDP instead (you would then need something like -f mpegts udp://127.0.0.1:8080) or another convenient protocol.
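Putting the pieces above together, a low-latency sketch might look like this (the camera path, address, and port are assumptions; `-input_format mjpeg` is ffmpeg's real v4l2 option for requesting hardware-compressed frames, which only works if the camera actually delivers MJPEG):

```shell
# Sender (on the board): grab MJPEG from the camera and pass it through
# without transcoding, wrapped in MPEG-TS over UDP. No re-encoding means
# almost no CPU load on the weak ARM chip.
ffmpeg -fflags nobuffer -f v4l2 -input_format mjpeg -r 25 -s 640x480 \
    -i /dev/video0 -c:v copy -f mpegts udp://192.168.1.10:8080

# Receiver (on the viewing machine, assumed address must match above):
# play with minimal probing and no extra buffering.
ffplay -fflags nobuffer -probesize 32 udp://192.168.1.10:8080
```

Whether `-c:v copy` is possible depends entirely on the camera emitting a stream the container can carry; if it only produces raw YUV frames, ffmpeg is forced to encode, and the CPU problem returns.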

    • Did I understand correctly that this command (ffmpeg -f v4l2 ...) should go into the configuration (/etc/ffserver.conf), rather than be run directly in the console? - Mr_Epic
    • If I understand correctly, ffserver is started first, and then ffmpeg is simply pointed at the already-running ffserver. A little more detail. - KoVadim
    • It looks like this: I run ffserver (with the stream configured in it), then launch ffmpeg, telling it where to send the data, and ffserver re-encodes and serves the stream. Maybe something needs to be configured in it? Here is what happens: joxi.ru/QY2LWBqTEPyym6.jpg - Mr_Epic
    • You can shave off at least one second right away: in the VLC you are using, on the network tab (in Open Media), click "show more options" and set the buffer size to zero (the default is one second). With re-encoding through ffserver the situation is worse - ffmpeg.org/pipermail/ffmpeg-user/2013-March/014172.html - it seems the fix is not there and will not be. - KoVadim
    • Here is what the terminal shows: ffmpeg on the left, ffserver on the right: joxi.ru/qVrwwNQC4eYarX.jpg As for the device not coping - I would think so too, but everything works fine on Android, and that is exactly the question. - Mr_Epic

    Try mjpg-streamer instead of ffmpeg; with it, it is easier to pick a suitable fps and resolution. Also, the DNS 1302w camera is not the best choice for such tasks. It is possible that Android simply implements reading from cheap Chinese cameras better. Comparing kernel versions would not hurt either.
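For reference, a typical mjpg-streamer invocation looks roughly like this (the device path, port, and web-root path are assumptions; `input_uvc.so` and `output_http.so` are the project's standard input/output plugins):

```shell
# Stream hardware-compressed MJPEG straight from a UVC camera over HTTP.
# Because the camera compresses the frames itself, the board does no
# transcoding, which keeps CPU load low even on weak ARM hardware.
mjpg_streamer \
    -i "input_uvc.so -d /dev/video0 -r 640x480 -f 25" \
    -o "output_http.so -p 8080 -w /usr/share/mjpg-streamer/www"
```

The MJPEG stream is then typically reachable at http://<board-ip>:8080/?action=stream, which can be opened directly in a browser or VLC.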