I apologize in advance if the question seems unprofessional. I have been given the task of controlling video output over HDMI from a Raspberry Pi. I use OpenMAX as the playback tool and Python as the programming language: I start omxplayer with the necessary parameters via subprocess.Popen and control the player by writing commands to its pipe. The part that is causing difficulty is displaying an HTML page as an additional layer on top of the video being played; in effect, I need a second video stream that would broadcast the rendered HTML file. In which direction should I be thinking? Is it possible to somehow turn the output of a program (for example, a web browser) into a video stream? Perhaps RTSP, or something else that omxplayer could use as a source.
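For reference, a minimal sketch of the control setup described above, assuming omxplayer's standard single-key stdin commands ('p' to pause/resume, 'q' to quit); the file path and options are placeholders:

```python
import subprocess

# Start omxplayer with HDMI audio output; the video path is a placeholder.
player = subprocess.Popen(
    ["omxplayer", "-o", "hdmi", "/home/pi/video.mp4"],
    stdin=subprocess.PIPE,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)

def send_key(key):
    """Write a single-key command to omxplayer's stdin (e.g. 'p' pause, 'q' quit)."""
    player.stdin.write(key.encode())
    player.stdin.flush()

send_key("p")   # pause/resume playback
send_key("q")   # quit the player
player.wait()
```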
Consider playing the video as HTML5 video in a browser with GPU acceleration (check chrome://gpu); then it is easy to put another HTML layer on top. I don't know if this is possible now. The alternative, screencasting as a video stream from another machine, seems too complicated. - jfs