Hi,

I have a USB camera that I can stream as MJPEG using a JPEGVideoSource subclass. It all works nicely and the frames are streamed and received OK. However, I would like to limit the frame rate of the stream, which currently runs at 20+ fps. That is far too high for this case: the use case is a surveillance camera that grabs a large overview image, not video conferencing. I looked through the FAQ and the examples but didn't find anything about this.
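To make the question concrete, below is roughly the direction I have been guessing at: delaying delivery in my source's doGetNextFrame() with scheduleDelayedTask() and setting fDurationInMicroseconds to the nominal frame period. I don't know whether that is the intended way to do it, and the class name ThrottledCameraJPEGSource and the helper grabJPEGFrameFromCamera() are just placeholders for my own camera code, not anything from the library:

#include "JPEGVideoSource.hh"
#include <sys/time.h>

class ThrottledCameraJPEGSource: public JPEGVideoSource {
public:
  static ThrottledCameraJPEGSource* createNew(UsageEnvironment& env,
                                              unsigned framesPerSecond) {
    return new ThrottledCameraJPEGSource(env, framesPerSecond);
  }

protected:
  ThrottledCameraJPEGSource(UsageEnvironment& env, unsigned framesPerSecond)
    : JPEGVideoSource(env), fFrameIntervalUs(1000000/framesPerSecond) {}

  virtual void doGetNextFrame() {
    // Instead of delivering a frame as soon as the sink asks for one, wait
    // one frame interval before grabbing and delivering the next frame.
    envir().taskScheduler().scheduleDelayedTask(fFrameIntervalUs,
                                                deliverFrame, this);
  }

  // The usual JPEGVideoSource parameters; these values depend on what the
  // camera produces (width and height are in units of 8 pixels).
  virtual u_int8_t type()    { return 1; }
  virtual u_int8_t qFactor() { return 75; }
  virtual u_int8_t width()   { return 80; }  // 640 pixels wide
  virtual u_int8_t height()  { return 60; }  // 480 pixels high

private:
  static void deliverFrame(void* clientData) {
    ((ThrottledCameraJPEGSource*)clientData)->deliverFrame1();
  }

  void deliverFrame1() {
    // grabJPEGFrameFromCamera() is my own code: it copies at most maxSize
    // bytes of one JPEG frame into 'to' and returns the actual frame size.
    fFrameSize = grabJPEGFrameFromCamera(fTo, fMaxSize);
    gettimeofday(&fPresentationTime, NULL);
    fDurationInMicroseconds = fFrameIntervalUs; // nominal frame period
    FramedSource::afterGetting(this);
  }

  unsigned grabJPEGFrameFromCamera(unsigned char* to, unsigned maxSize); // defined elsewhere

  unsigned fFrameIntervalUs;
};

Is setting fDurationInMicroseconds like this enough for the sink, or do I also need to pace the camera grabbing itself in some other way?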
Also, is there some way to know when all clients have disconnected from an RTSP source, so that I could stop grabbing and encoding frames? It seems to happen automatically for my H264 stream, but the MJPEG stream keeps encoding forever. Both streams are handled by an OnDemandServerMediaSubsession subclass that creates the source and the sink.

I hope these questions make sense, and I'd be grateful for any hints.

Best regards,
   Jan Ekholm

--
Jan Ekholm
jan.ekh...@d-pointer.com
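P.S. To make the second question a bit more concrete, here is where I would guess the hooks are, based on reading FramedSource.hh. I'm not at all sure this is the intended mechanism, and stopCamera() is just a hypothetical helper of mine that pauses grabbing and encoding:

// Additions to the ThrottledCameraJPEGSource class sketched earlier:

  virtual void doStopGettingFrames() {
    // My assumption: when a client's sink stops playing, it calls
    // stopGettingFrames() on its source, which ends up here -- so this
    // might be the place to pause the camera?
    stopCamera();
  }

  virtual ~ThrottledCameraJPEGSource() {
    // If the subsession doesn't reuse the first source, I'd expect the
    // source itself to be closed when the last client goes away, so stop
    // the camera here as well.
    stopCamera();
  }

If there is a better place to hook in at the OnDemandServerMediaSubsession level instead, I'd love to know.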