Re: [Live-devel] fNumChannels hardcoded

2013-10-30 Thread Ross Finlayson
> a=tool:LIVE555 Streaming Media v2011.12.23
Why are you using such an old version of the "LIVE555 Streaming Media" code? You should upgrade to the latest version (the only version that we support).
Ross Finlayson, Live Networks, Inc. http://www.live555.com/

Re: [Live-devel] True push DeviceSource

2013-10-30 Thread Ross Finlayson
> I tried to use openRTSP.exe on the output with option -F to save the audio and video and I found that the audio is getting saved just fine, though I was not able to play back the video in the saved file
Note - the saved video file is just an H.264 Elementary Stream file. If you rename it to have …

Re: [Live-devel] RTSP Interleaved Streams

2013-10-30 Thread Ross Finlayson
> Do I need to change how I set up the unicast OnDemandServerMediaSubsession if the client may request interleaved streams?
No. The server will serve this automatically, iff the client requests it.
> The version of Live555 that is in use is not up-to-date as our customer is using an old …
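For illustration, a minimal client-side sketch of how interleaved (RTP-over-TCP) streaming gets requested with live555. The position of the streamUsingTCP argument in RTSPClient::sendSetupCommand() is assumed from the public interface, and the handler below is hypothetical:

  #include "liveMedia.hh"

  // Hypothetical response handler, in the style of testRTSPClient:
  void continueAfterSETUP(RTSPClient* rtspClient, int resultCode, char* resultString) {
    // ... check resultCode, then continue with the next SETUP or with PLAY ...
    delete[] resultString;
  }

  void setupSubsessionInterleaved(RTSPClient* rtspClient, MediaSubsession& subsession) {
    // Passing True for streamUsingTCP makes the SETUP request ask for
    // "RTP/AVP/TCP;interleaved=...", i.e. the stream is carried over the RTSP
    // TCP connection itself; the server then serves it that way automatically.
    rtspClient->sendSetupCommand(subsession, continueAfterSETUP,
                                 False /*streamOutgoing*/,
                                 True /*streamUsingTCP*/);
  }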

[Live-devel] fNumChannels hardcoded

2013-10-30 Thread ssingh
Hi, I found that when I was checking my streaming using ffmpeg, ffmpeg displays the channels as 1 and the audio codec as (null). I am using AC3RTPSink in live555. I tried to debug the live555 source and found that fNumChannels is not set while creating the object (calling the base class Audi…
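For illustration only, a sketch of how a sink subclass could forward the real channel count; it assumes the AudioRTPSink base-class constructor takes the channel count as an optional trailing argument (defaulting to 1), and the class name is hypothetical:

  #include "liveMedia.hh"

  class MyAC3AudioRTPSink: public AudioRTPSink {
  protected:
    MyAC3AudioRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
                      unsigned char rtpPayloadFormat,
                      unsigned rtpTimestampFrequency,
                      unsigned numChannels)
      // Forwarding numChannels here (instead of letting it default to 1)
      // sets fNumChannels, so the advertised channel count matches the stream:
      : AudioRTPSink(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency,
                     "AC3", numChannels) {}
  };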

Re: [Live-devel] True push DeviceSource

2013-10-30 Thread ssingh
Hi, I tried to use openRTSP.exe on the output with option -F to save the audio and video, and I found that the audio is getting saved just fine, though I was not able to play back the video in the saved file; but at least I know that my audio is being streamed. But I am not sure why VLC is not renderin…

Re: [Live-devel] RTSP Interleaved Streams

2013-10-30 Thread Bob Bischan
Piers, I have not yet found a way to get VLC to request RTSP interleaved data, so I am limited in how to test this at present. VLC does support RTSP-over-TCP. This option may be configured from the GUI or from the VLC config file. GUI: Tools > Preferences > Input/Codecs > Demuxers > RTP/RTSP > Use …

[Live-devel] RTSP Interleaved Streams

2013-10-30 Thread Piers Hawksley
Hi Ross, One of our customers is trying to get RTSP interleaved streams from our live555 server. I have posted the 178KB Wireshark trace online at http://www.hawksley42.co.uk/amg-dump.pcapng. Frame 177 has a block of 112 bytes at the end which does not appear to be valid data. Do I need to …

Re: [Live-devel] MPEG2 Transport Stream Packet ID

2013-10-30 Thread Ross Finlayson
As you noted, to simplify the code - when constructing a new Transport Stream - we reuse the PES packet's "stream id" to also be the Transport Stream "PID". This has worked OK for everyone so far, so I wasn't planning on changing this, unless there's a compelling reason to do so...
Ross Finlayson

[Live-devel] MPEG2 Transport Stream Packet ID

2013-10-30 Thread Piers Hawksley
Hi Ross, Can the MPEG2 transport stream packet ID be set? The code appears (in MPEG2TransportStreamFromESSource::addNewVideoSource) to set this to 0xE0. Changing this causes MPEG2TransportStreamMultiplexor::handleNewBuffer to fail the test 'if ((stream_id&0xF0) == 0xE0) { // video', and thus set 'str…
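For reference, the quoted test restated as a standalone helper (paraphrased, not the exact library code). PES stream_ids 0xE0-0xEF denote video, which is why any other value assigned in addNewVideoSource() stops matching the video branch:

  #include <cstdint>

  // True only for PES stream_ids 0xE0..0xEF (video); the same value is also
  // reused by the multiplexor as the Transport Stream PID.
  bool isVideoStreamId(uint8_t stream_id) {
    return (stream_id & 0xF0) == 0xE0;
  }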

Re: [Live-devel] patch for Amino STB - JMACX

2013-10-30 Thread Ross Finlayson
Thanks for the note. However, I won't be adding these changes to the supplied "LIVE555 Streaming Media" code, unless I am contacted *directly* by 'Amino Corporation' (not just by some intermediary), explaining why they continue to violate established Internet standards, and why (in spite of thi…

Re: [Live-devel] Matroska BANK_SIZE overflow

2013-10-30 Thread Ross Finlayson
> I reach this limit parsing an MKV file with an H264 stream.
So please put this file on a (publicly-accessible) web server, and send us the URL, so we can download and test it for ourselves.
Ross Finlayson, Live Networks, Inc. http://www.live555.com/

Re: [Live-devel] Regarding normal play time (NPT) in Live555 Media Server

2013-10-30 Thread Ross Finlayson
> Is it possible to get the normal play time for each PLAY request from the RTSP Client without Range header information included in the client request?
I'm not sure I understand your question. The client should always be able to get the 'normal play time' by calling "MediaSubsession::getNormalPlayTime()"…
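For illustration, a minimal client-side sketch of that call; the exact signature of MediaSubsession::getNormalPlayTime() is assumed from the public interface, and the helper name is hypothetical:

  #include <cstdio>
  #include "liveMedia.hh"

  // Maps a frame's (RTCP-synchronized) presentation time to normal play time,
  // in seconds, for the given subsession:
  void printNPT(MediaSubsession& subsession, struct timeval const& presentationTime) {
    double npt = subsession.getNormalPlayTime(presentationTime);
    fprintf(stderr, "NPT: %.3f s\n", npt);
  }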

[Live-devel] Matroska BANK_SIZE overflow

2013-10-30 Thread PROMONET Michel
Hi Ross, Searching the mailing list, I saw some discussion about BANK_SIZE; the last one I found seems to say that it is no longer needed, at least for H264 parsing. I reach this limit parsing an MKV file with an H264 stream. It's annoying that parsing a file makes the process abort…
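One workaround sometimes suggested (an assumption here, not an official fix) is to enlarge the BANK_SIZE compile-time constant in liveMedia/StreamParser.cpp and rebuild, trading extra memory for parsing headroom:

  // In liveMedia/StreamParser.cpp; the value shown is an example, not the shipped one:
  #define BANK_SIZE 300000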

Re: [Live-devel] True push DeviceSource

2013-10-30 Thread ssingh
Hi, Thanks for the guidance; I tried it. Now I can see the packets being transmitted in testRTSPClient and also in VLC. VLC shows packets being decoded for both audio and video. But still, somehow video is running smoothly but audio just comes and goes. It's like something is choking the audio. It…

[Live-devel] Regarding normal play time (NPT) in Live555 Media Server

2013-10-30 Thread Nambirajan M
Hi Ross, Is it possible to get the normal play time for each PLAY request from the RTSP Client without Range header information included in the client request? We checked the code in handleCmd_PLAY in RTSPServer.cpp. In this function, if the Range header is not in the client request, the Live5…

Re: [Live-devel] Frame Buffer

2013-10-30 Thread Ross Finlayson
> So my concern and my question was more about receiving/sending frames than receiving/sending packets, so I was asking if inside livemedia there is any buffer that stores packets, sorts them, and compiles them into frames.
Sort of. First, our RTP reception code automatically ensures that …
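For illustration, a sketch of the frame-level delivery being described, modeled on the DummySink in testRTSPClient: the receiver asks its source for the next complete frame and is called back once that frame has been reassembled from RTP packets. The buffer size and class name are arbitrary:

  #include "liveMedia.hh"

  #define FRAME_BUFFER_SIZE 100000  // arbitrary; must be large enough for your biggest frame

  class FrameConsumer {
  public:
    FrameConsumer(FramedSource* source) : fSource(source) {}

    void requestNextFrame() {
      // Each completed request delivers one whole frame (e.g. one H.264 NAL
      // unit), already reassembled and ordered by the RTP reception code:
      fSource->getNextFrame(fBuffer, FRAME_BUFFER_SIZE,
                            afterGettingFrame, this,
                            onSourceClosure, this);
    }

  private:
    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned /*durationInMicroseconds*/) {
      FrameConsumer* self = (FrameConsumer*)clientData;
      // ... hand (self->fBuffer, frameSize, presentationTime) to the decoder ...
      self->requestNextFrame();  // then ask for the next frame
    }

    static void onSourceClosure(void* /*clientData*/) {}

    FramedSource* fSource;
    unsigned char fBuffer[FRAME_BUFFER_SIZE];
  };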

Re: [Live-devel] Frame Buffer

2013-10-30 Thread David Cassany Viladomat
Thanks Ross, I will have a look at VLC as you suggest. Sorry if I was not clear enough: in my app (written in plain C) we work at the frame level (compressed or not; we have already wrapped libavcodec, so we work at the AVFrame level rather than at the NAL unit level for H264) in order to transform the image and r…