Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-05 Thread Ross Finlayson
> Sorry, I meant fLastPlayTime

That is a member variable used internally by the implementation of the “ByteStreamFileSource” class, and should not be something that you need to concern yourself with. But presumably - if you’re using a “ByteStreamFileSource” - you’re feeding it into a “H264VideoStreamFramer” […]
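(For readers following the thread: the chain being described here is the one used by the bundled “H264VideoFileServerMediaSubsession”. Below is a rough sketch of that arrangement; the class name “MyH264Subsession”, the file-name member, and the bitrate estimate are placeholders rather than anything taken from this thread, and the bundled class additionally handles the SPS/PPS lines for the SDP description, which is omitted here.)

#include "liveMedia.hh"

// Hedged sketch of an on-demand H.264 subsession, modeled on the bundled
// H264VideoFileServerMediaSubsession. The ByteStreamFileSource delivers raw
// bytes; the H264VideoStreamFramer parses them into NAL units and fills in
// presentation times itself.
class MyH264Subsession : public OnDemandServerMediaSubsession {
public:
  MyH264Subsession(UsageEnvironment& env, char const* fileName)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/),
      fFileName(fileName) {} // a real subsession would copy this string

protected:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500; // kbps; a rough placeholder estimate

    ByteStreamFileSource* fileSource =
        ByteStreamFileSource::createNew(envir(), fFileName);
    if (fileSource == NULL) return NULL;

    // Wrap the byte-stream source in the framer:
    return H264VideoStreamFramer::createNew(envir(), fileSource);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                       rtpPayloadTypeIfDynamic);
  }

private:
  char const* fFileName;
};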

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-05 Thread Daniel Yacouboff
Sorry, I meant fLastPlayTime

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-05 Thread Ross Finlayson
> However, my video is raw H.264, and I don't guarantee that on each recv() call I get exactly one complete frame (I might get more than one frame, or a partial one), so 1) how can I calculate the preferredFrameSize if it's H.264 video, as it depends on the movement level in the frame, and […]
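(A common way to deal with the missing frame alignment is to buffer the received bytes and only pass complete NAL units downstream, for example to an “H264VideoStreamDiscreteFramer”, which expects NAL units without their start codes. Alternatively, the raw byte stream can be handed to an “H264VideoStreamFramer”, which does this parsing itself, in which case no preferred frame size has to be calculated at all; in “ByteStreamFileSource” it is only an optional hint, defaulting to 0. The helper below is purely illustrative, is not part of live555, and glosses over some Annex-B corner cases such as trailing zero bytes.)

#include <cstdint>
#include <vector>

// Illustrative helper: given a growing buffer of raw Annex-B H.264 bytes (as
// returned by recv(), with no frame alignment), extract the NAL units that are
// already complete. A NAL unit is only known to be complete once the *next*
// start code has arrived, so the tail of the buffer is always kept for the
// next recv() call. Start codes are stripped from the returned units.
std::vector<std::vector<uint8_t>> extractCompleteNALUnits(std::vector<uint8_t>& buf) {
  // Record where each "00 00 01" start-code pattern begins:
  std::vector<size_t> codePos;
  for (size_t i = 0; i + 3 <= buf.size(); ++i) {
    if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
      codePos.push_back(i);
      i += 2;
    }
  }

  std::vector<std::vector<uint8_t>> nals;
  if (codePos.size() < 2) return nals; // nothing bounded on both sides yet

  for (size_t n = 0; n + 1 < codePos.size(); ++n) {
    size_t begin = codePos[n] + 3;     // first byte after this start code
    size_t end   = codePos[n + 1];     // beginning of the next start code
    if (end > begin && buf[end - 1] == 0) --end; // drop the extra 00 of a 4-byte start code
    nals.emplace_back(buf.begin() + begin, buf.begin() + end);
  }

  // Keep everything from the last start code onward (possibly incomplete):
  buf.erase(buf.begin(), buf.begin() + codePos.back());
  return nals;
}

Each returned unit can then be copied into fTo (truncating to fMaxSize if necessary) and delivered via FramedSource::afterGetting().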

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-05 Thread Daniel Yacouboff
> Should both streams at least have the same start value in presentation time?
> If they’re intended to be played at the same time, then yes.
[…]

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-05 Thread Ross Finlayson
> Should both streams at least have the same start value in presentation time?

If they’re intended to be played at the same time, then yes.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
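(A minimal illustration of “the same start value”: take one gettimeofday() reading as the shared origin and advance each stream’s presentation time by its own frame duration. Everything here apart from struct timeval and gettimeofday() is hypothetical; for live sources, stamping each frame with gettimeofday() at the moment it is captured, as in the DeviceSource.cpp template, gives the two streams a common time base even more directly.)

#include <sys/time.h>

// Hypothetical per-stream clock: both instances are started from the same
// wall-clock origin, so frames meant to be played together get equal
// presentation times.
struct StreamClock {
  struct timeval next; // presentation time for the next frame

  void start(struct timeval const& origin) { next = origin; }

  // Return the stamp for the current frame, then advance by its duration:
  struct timeval stamp(unsigned durationMicroseconds) {
    struct timeval t = next;
    next.tv_usec += durationMicroseconds;
    next.tv_sec  += next.tv_usec / 1000000;
    next.tv_usec %= 1000000;
    return t;
  }
};

int main() {
  struct timeval origin;
  gettimeofday(&origin, NULL);      // one shared start value...

  StreamClock video, audio;
  video.start(origin);              // ...used by both streams
  audio.start(origin);

  struct timeval v0 = video.stamp(33333); // ~30 fps video frame
  struct timeval a0 = audio.stamp(20000); // a 20 ms audio frame
  (void)v0; (void)a0; // in a real source these would be assigned to fPresentationTime
  return 0;
}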

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-05 Thread Daniel Yacouboff
> Can you please give me a head start on how to correctly calculate the presentation times? For instance, do I need to have the same time value for an audio frame and a video frame that should be played together?
> Yes. […]

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-04 Thread Ross Finlayson
> Can you please give me a head start on how to correctly calculate the presentation times? For instance, do I need to have the same time value for an audio frame and a video frame that should be played together?

Yes.

> Should I use a shared presentation time variable for both streams?

[…]

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-04 Thread Daniel Yacouboff
> What am I doing wrong? I’ve read a lot about it and it seems my presentation times are not ok, but how can I calculate them properly?
> Yes, the problem seems to be the presentation times (i.e., the values of the “fPresentationTime” member variable) […]

Re: [Live-devel] Live video and audio streaming using one RTSPServer

2015-11-04 Thread Ross Finlayson
> What am I doing wrong? I’ve read a lot about it and it seems my presentation times are not ok, but how can I calculate them properly?

Yes, the problem seems to be the presentation times (i.e., the values of the “fPresentationTime” member variable) that you’re setting for both your audio and video […]
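(For context, “fPresentationTime” is the struct timeval member of “FramedSource” that every data source must fill in before completing delivery of a frame. The sketch below is modeled on the “DeviceSource.cpp” template that ships with live555; the class name and the fBuffer/fBufferBytesUsed members are placeholders standing in for “one frame received from the other service”, not anything from this thread.)

#include "liveMedia.hh"
#include <sys/time.h>
#include <cstring>

// Hedged sketch of a live source's delivery path. The line that matters for
// this thread is the gettimeofday() call: stamping every frame with the
// wall-clock capture time, in both the audio and the video source, gives the
// two streams a common time base without sharing any variable.
class MyLiveSource : public FramedSource {
public:
  MyLiveSource(UsageEnvironment& env)
    : FramedSource(env), fBufferBytesUsed(0) {}

protected:
  virtual void doGetNextFrame() {
    // A real source would arrange to be called back when new data arrives
    // (e.g. via the task scheduler); here we deliver whatever is buffered:
    if (fBufferBytesUsed > 0) deliverFrame();
  }

private:
  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // downstream object isn't ready yet

    if (fBufferBytesUsed > fMaxSize) {      // never write more than fMaxSize bytes to fTo
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = fBufferBytesUsed - fMaxSize;
    } else {
      fFrameSize = fBufferBytesUsed;
      fNumTruncatedBytes = 0;
    }

    // Presentation time = wall-clock time at which this frame was captured:
    gettimeofday(&fPresentationTime, NULL);

    memmove(fTo, fBuffer, fFrameSize);
    fBufferBytesUsed = 0;

    FramedSource::afterGetting(this);       // hand the frame downstream
  }

  unsigned char fBuffer[100000]; // placeholder buffer holding one received frame
  unsigned fBufferBytesUsed;
};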

[Live-devel] Live video and audio streaming using one RTSPServer

2015-11-04 Thread Daniel Yacouboff
Hello there,

I've been working on your library for a while now, in order to implement an RTSP server streaming live audio and video. The audio and video packets are received via TCP from another service, each type of packet (audio/video) on its own socket. What I did was create a new subclass […]
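(The overall wiring described above, sketched along the lines of the bundled testOnDemandRTSPServer example. “MyH264Subsession” and “MyAudioSubsession” are hypothetical OnDemandServerMediaSubsession subclasses wrapping the sources that read from the two TCP sockets, so treat this as an outline of the setup rather than something compilable as-is.)

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // One ServerMediaSession containing both the video and the audio subsession,
  // so a single rtsp:// URL serves the combined stream:
  ServerMediaSession* sms = ServerMediaSession::createNew(
      *env, "live", "live", "Live audio+video relayed from a TCP service");
  sms->addSubsession(new MyH264Subsession(*env));  // hypothetical video subsession
  sms->addSubsession(new MyAudioSubsession(*env)); // hypothetical audio subsession
  rtspServer->addServerMediaSession(sms);

  *env << "Play this stream using the URL: " << rtspServer->rtspURL(sms) << "\n";

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

Each subsession's createNewStreamSource() would then return the corresponding TCP-fed source (wrapped in the appropriate framer for H.264), along the lines of the sketches earlier on this page.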