I apologize in advance if the question seems unprofessional. I have been given the task of managing video output over HDMI using a Raspberry Pi. I use OpenMAX for playback and Python as the programming language: with subprocess.Popen I launch omxplayer with the required parameters and control the player by sending commands through its pipe. The part that is causing difficulties is displaying an HTML page as an additional layer on top of the video being played — in effect, a second video stream that broadcasts the rendered HTML file. In which direction should I think? Is it possible to somehow turn the output of a program (for example, a web browser) into a video stream? Perhaps RTSP, or something else that omxplayer could use as a source for display.
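For reference, here is a minimal sketch of the control scheme described above: omxplayer is started with subprocess.Popen and driven by writing its single-key commands (e.g. 'p' to toggle pause, 'q' to quit) to its stdin pipe. The video path and the exact options are placeholders, not part of the original setup.

```python
import subprocess

# Launch omxplayer as a child process; "-o hdmi" routes audio to HDMI.
# The video path is a placeholder for illustration only.
player = subprocess.Popen(
    ["omxplayer", "-o", "hdmi", "/home/pi/video.mp4"],
    stdin=subprocess.PIPE,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)

def send(key: str) -> None:
    """Write a one-key control command to omxplayer's stdin."""
    player.stdin.write(key.encode())
    player.stdin.flush()

send("p")   # toggle pause/resume
send("q")   # ask the player to quit
player.wait()
```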

  • Setting aside how you are trying to implement it now, could you describe what you are trying to achieve (in terms of the final result)? What is the context? - jfs
  • I need to overlay an HTML page on top of the video and output the combined result as a video stream over HDMI. - Isay
  • That is already written in your question; it describes how you are trying to do it. Knowing the context might make it possible to suggest a better solution. - jfs
  • Actually, the task is to connect the Raspberry Pi to any device that has an HDMI port and use it to play the video. Going through OpenMAX is necessary for hardware acceleration, since the video will be HD. On top of it there will be a rather complicated template with a lot of dynamic data inserted into it (the device's X-rays), so laying it out as a web page seems the most convenient approach to me. - Isay
  • I would try digging in the direction of enabling hardware acceleration in the browser (chrome://gpu) for HTML5 video — then it is easy to put other HTML on top. I don't know whether this is possible at the moment. The alternative, screencasting a video stream from another machine, seems too complicated. - jfs
