Hello, I am using Nvidia NVENC (all in C++) on a Windows 10 machine to encode raw video files to H.264 on the GPU, for the purpose of live-streaming to another device. I would like to use Live555 for my RTSP server & client, to send my H.264 stream of frames as UDP packets over RTSP/RTP.
I've looked at testH264VideoStreamer.cpp and at the FAQ on the wiki, and it seems to be the program I should start from. The FAQ mentions changing test.264 to read from stdin and then piping my encoder's output into it. I'm a little lost as to how to proceed, having never done this before, and am seeking some help.

The NVENC encoder takes a frame, encodes it to H.264, and keeps it in memory before flushing it and repeating, so I would need to hand the bitstream over before the frame is flushed, so that it gets sent over RTSP. The code I'm working from used to store the frames in a file, but I would like openRTSP to get the frames directly from video memory, to spare the CPU as much work as possible. Two rough sketches of what I have in mind are below.
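If I follow the FAQ route, my understanding is that the only source change in testH264VideoStreamer.cpp is the input file name, since live555 treats the special name "stdin" as standard input (and, if I read InputFile.cpp correctly, it already puts stdin into binary mode on Windows). Something like:

// In testH264VideoStreamer.cpp, read the H.264 byte stream from standard
// input instead of from the file "test.264":
char const* inputFileName = "stdin";

and then on the command line something like "MyNvencEncoder.exe | testH264VideoStreamer.exe" (MyNvencEncoder.exe being my own encoder program). But that still goes through a pipe rather than straight out of video memory.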
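For getting the frames straight out of memory, I've seen liveMedia/DeviceSource.cpp mentioned as the model to follow: I would write my own FramedSource subclass that copies each encoded frame into live555's buffer instead of writing it to a file. Here is a rough sketch of what I think that looks like; NVENCFrameSource and getEncodedFrame() are just my placeholder names for the glue around my encoder's output buffer:

// NVENCFrameSource.hh -- rough sketch, modeled on liveMedia's DeviceSource.cpp.
// "NVENCFrameSource" and "getEncodedFrame()" are my placeholder names.
#include "FramedSource.hh"
#include "GroupsockHelper.hh" // for gettimeofday()
#include <string.h>           // for memmove()

class NVENCFrameSource: public FramedSource {
public:
  static NVENCFrameSource* createNew(UsageEnvironment& env) {
    return new NVENCFrameSource(env);
  }

protected:
  NVENCFrameSource(UsageEnvironment& env): FramedSource(env) {}

private:
  // live555 calls this whenever it wants the next encoded frame/NAL unit:
  virtual void doGetNextFrame() {
    unsigned char const* data; unsigned size;
    if (!getEncodedFrame(data, size)) { // placeholder: fetch NVENC's output buffer
      handleClosure();                  // encoder finished -> signal end of stream
      return;
    }

    if (size > fMaxSize) {              // downstream buffer too small: truncate
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = size - fMaxSize;
    } else {
      fFrameSize = size;
      fNumTruncatedBytes = 0;
    }
    gettimeofday(&fPresentationTime, NULL);
    memmove(fTo, data, fFrameSize);     // copy out of the encoder's bitstream buffer
    FramedSource::afterGetting(this);   // tell the downstream object a frame is ready
  }

  // My glue code around the NVENC output buffer; returns false when there
  // are no more frames.
  bool getEncodedFrame(unsigned char const*& data, unsigned& size);
};

Is this the right direction? I gather from DeviceSource.cpp that a real version should not block in doGetNextFrame(), but should instead use an EventTriggerId and TaskScheduler::triggerEvent() so frames are delivered asynchronously when the encoder signals that an output buffer is ready. I also assume I would feed this source through an H264VideoStreamDiscreteFramer (with the 0x00 0x00 0x00 0x01 start codes stripped) rather than the H264VideoStreamFramer that testH264VideoStreamer uses -- is that correct?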
Please let me know, I would really appreciate some help here.

Philippe