Hi,

I am working on an RTSP (RTP-over-TCP) H.264 streaming application fed by a live hardware encoder. Based on testOnDemandRTSPServer, I have implemented MyDeviceSource, MyH264VideoStreamFramer, H264DeviceMediaSubsession and MyH264App. In general the application works and I get the live stream. The intended viewer is VLC on a remote machine in the same LAN, where bandwidth is a non-issue. I've run into two obstacles that have been giving me a hard time:

1. "Glitches" in the video output. Every small number of frames the video "glitches": I get block artifacts and frames that seem to jump back in time. From experimenting with the presentation times, I suspect they are related. My approach to fPresentationTime is to set it to gettimeofday() at the start of every GOP (i.e. on the SPS/PPS frames) and to increment it by 1000000/FPS for every subsequent encoded frame (a simplified sketch is pasted below my signature). The encoder encodes frame by frame only. When I use openRTSP as the client, the video file is saved perfectly, with no glitches.

2. The streaming server (running Linux 2.6) is also used for other CPU-intensive processes; this is a requirement, and the machine runs practically constantly at 100% CPU usage. However, it uses almost no network resources. Under this load the video glitches become overwhelming, the client seems to have a very hard time handling the stream, and it sometimes even disconnects.

So the question is: how and where is Live555 sensitive to CPU load, and why does that load have such a catastrophic effect on the video output given that network usage is low? Could this also be related to (1)?

Thanks,

Yaron Levy
Emza Visual Sense Ltd.
Research & Development
Email: ya...@emza-vs.com
Tel: (972)-52-2968679
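P.S. For reference, here is a simplified sketch of the timestamping logic described in (1). The helper name, the isGOPStart flag and the fps parameter are illustrative only; the real code lives inside MyH264VideoStreamFramer and operates on the fPresentationTime member inherited from FramedSource.

#include <sys/time.h>

// At every GOP start (SPS/PPS) the presentation time is resynced to the
// wall clock; every other frame advances it by one nominal frame duration.
static void updatePresentationTime(struct timeval& fPresentationTime,
                                   bool isGOPStart, unsigned fps) {
  if (isGOPStart) {
    gettimeofday(&fPresentationTime, NULL);      // resync to wall clock
  } else {
    fPresentationTime.tv_usec += 1000000 / fps;  // advance one frame period
    if (fPresentationTime.tv_usec >= 1000000) {  // carry microseconds over
      fPresentationTime.tv_usec -= 1000000;
      ++fPresentationTime.tv_sec;
    }
  }
}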