Hi.
Thanks for the explanation.
2009/8/25 Jerry Johns
> The LiveMedia library uses the magical world of select(), which allows
> multiple file descriptors (or in this case, sockets) to be monitored
> for incoming/outgoing data.
>
> By blocking on this select() call, an event l
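For readers who haven't used select() before, here is a minimal, generic sketch
of the idea. It is not the library's actual code; runEventLoopOnce,
handleIncomingData and the 'sockets' vector are just illustrative names.

// Generic select()-based socket monitoring sketch (POSIX sockets assumed):
// block until one of the watched sockets is readable, then handle it.
#include <sys/select.h>
#include <vector>

void runEventLoopOnce(const std::vector<int>& sockets) {
  fd_set readSet;
  FD_ZERO(&readSet);
  int maxFd = -1;
  for (int s : sockets) {
    FD_SET(s, &readSet);
    if (s > maxFd) maxFd = s;
  }
  if (maxFd < 0) return; // nothing to watch

  // Block here until at least one socket has incoming data.
  if (select(maxFd + 1, &readSet, nullptr, nullptr, nullptr) <= 0) return;

  for (int s : sockets) {
    if (FD_ISSET(s, &readSet)) {
      // handleIncomingData(s);  // hypothetical per-socket handler
    }
  }
}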
Hi.
I worked out a basic RTSP client, using MPlayer code as an example.
Still, I can't find a way to break down the received data into packets, and to
get both the encoded MP4 data from a packet and/or the packet details (such
as the frame resolution, whether the packet is a key frame, etc.).
Is th
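For what it's worth, the usual way to get at the received data with live555 is
to write a MediaSink subclass and let the library hand you complete frames.
The sketch below is modeled loosely on the DummySink in the testRTSPClient
demo; the class name FrameSink and the 100000-byte buffer are illustrative
choices, not part of the library. Note that details such as the frame
resolution are not carried in the RTP packets themselves: if the SDP
advertises them they are available from the MediaSubsession (e.g.
videoWidth()/videoHeight()), otherwise they have to be parsed from the codec's
own configuration data.

#include "liveMedia.hh"

// Illustrative receiving sink, modeled on testRTSPClient's DummySink.
class FrameSink: public MediaSink {
public:
  static FrameSink* createNew(UsageEnvironment& env, MediaSubsession& subsession) {
    return new FrameSink(env, subsession);
  }

private:
  FrameSink(UsageEnvironment& env, MediaSubsession& subsession)
    : MediaSink(env), fSubsession(subsession) {
    fBuffer = new u_int8_t[100000]; // illustrative maximum frame size
  }
  virtual ~FrameSink() { delete[] fBuffer; }

  // Static trampoline matching the callback signature that getNextFrame() expects:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned /*durationInMicroseconds*/) {
    FrameSink* sink = (FrameSink*)clientData;
    // fBuffer now holds one complete encoded frame of 'frameSize' bytes,
    // stamped with 'presentationTime' (derived from RTP timestamps + RTCP).
    // Resolution, if the SDP carried it, is available via
    // sink->fSubsession.videoWidth() / videoHeight().
    sink->envir() << "Received a frame of " << frameSize << " bytes ("
                  << numTruncatedBytes << " truncated)\n";
    sink->continuePlaying(); // ask the source for the next frame
  }

  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    fSource->getNextFrame(fBuffer, 100000, afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }

  MediaSubsession& fSubsession;
  u_int8_t* fBuffer;
};

As in the demo, you would attach one such sink per subsession after SETUP,
e.g. subsession->sink = FrameSink::createNew(...) followed by
subsession->sink->startPlaying(*subsession->readSource(), ...).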
Hi Ross.
You don't do this. Please everybody, read the FAQ!
>
> The "LIVE555 Streaming Media" software uses an event loop (a single thread
> of control) for concurrency, instead of using multiple threads.
>
Sorry for barging into this thread - an architecture question, if I may. I
experimented w
Hi.
I read the FAQ, but I'm still not 100% sure about this:
"
*Longer answer:* More than one thread can still use this code, if only one
thread runs the library code proper, and the other thread(s) communicate
with the library only by setting global 'flag' variables. (For one possible
use of this t
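Concretely, the pattern the FAQ describes is the 'watch variable' that
doEventLoop() takes: one thread runs the library's event loop, and other
threads only ever write that flag. A minimal sketch follows; doEventLoop() and
the BasicUsageEnvironment calls are the library's real API, while the thread
wrapper and stop function are illustrative, and the variable name follows the
testRTSPClient demo.

#include "BasicUsageEnvironment.hh"

char eventLoopWatchVariable = 0; // written by other threads, read by the loop

void runLive555Thread() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // ... create RTSP clients / sinks here, from this thread only ...

  // Blocks until some other thread sets eventLoopWatchVariable to a
  // non-zero value; that is the only interaction the other threads
  // have with the library.
  env->taskScheduler().doEventLoop(&eventLoopWatchVariable);

  env->reclaim();
  delete scheduler;
}

// Called from any other thread when the application wants the loop to exit:
void requestEventLoopExit() {
  eventLoopWatchVariable = 1;
}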
Hi.
Is it possible to access the RTP packet header extension (with application
data), which comes right after the CSRC list, via the live555 library?
I mean, does live555 allow going down to that depth when working with RTP
packets?
Thanks in advance!
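For reference, here is where that extension lives in a raw RTP packet
(RFC 3550, section 5.3.1). This is a generic parsing sketch, not a live555 API
call (whether the library itself exposes the extension depends on the version
you use), and findRtpHeaderExtension is just an illustrative helper.

#include <stdint.h>
#include <stddef.h>

// Returns a pointer to the extension data (after the 4-byte extension header)
// and its length in bytes, or NULL if the packet carries no extension.
const uint8_t* findRtpHeaderExtension(const uint8_t* pkt, size_t len,
                                      size_t* extLenBytes) {
  if (len < 12) return NULL;             // fixed RTP header is 12 bytes
  if ((pkt[0] & 0x10) == 0) return NULL; // X bit not set: no extension present
  size_t csrcCount = pkt[0] & 0x0F;      // CC field
  size_t offset = 12 + 4 * csrcCount;    // skip fixed header + CSRC list
  if (len < offset + 4) return NULL;
  // 16-bit profile-defined id, then 16-bit length counted in 32-bit words:
  size_t words = ((size_t)pkt[offset + 2] << 8) | pkt[offset + 3];
  *extLenBytes = 4 * words;
  if (len < offset + 4 + *extLenBytes) return NULL;
  return pkt + offset + 4;
}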
Hi.
What is the most recommended solution in such a case, then?
Keep the video as an elementary stream, but the audio as MP3, for example, and
sync between them?
Regards.
2009/5/27 Ross Finlayson
> 1) Just to verify, does it mean that elementary stream files (m4e or m4v)
>> are unable to contain the PTS/DTS times
Hi.
Second, it's important to realize that the ".mov" (or ".mp4") file format is
> badly designed, and is poorly suited for recording live input streams (like
> these). One basic problem with the file format is that it records
> audio/video data using sample/frame *durations*, rather than timesta
Hi.
Unfortunately not, because the 'raw' (i.e., audio and video Elementary
> Stream) files would not contain any of the 'presentation time' information
> that we get from RTP/RTCP. Therefore, when you later combine them into a
> single audio+video file, you won't be able to get A/V synchronizatio
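One illustrative workaround (not something openRTSP does for you): when
writing the raw elementary-stream files, also record each frame's presentation
time in a small sidecar index, so that a later muxing step can restore A/V
sync. logFramePresentationTime below is a hypothetical helper you would call
from a sink's afterGettingFrame(), using the parameters live555 already
provides.

#include <stdio.h>
#include <sys/time.h>

void logFramePresentationTime(FILE* indexFile, unsigned long byteOffset,
                              unsigned frameSize,
                              struct timeval presentationTime) {
  // One line per frame: offset into the .m4e file, frame size, and the
  // presentation time (seconds.microseconds) reported by the library.
  fprintf(indexFile, "%lu %u %ld.%06ld\n",
          byteOffset, frameSize,
          (long)presentationTime.tv_sec, (long)presentationTime.tv_usec);
}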
Hi.
Thanks for the replies.
No; it's a media *server*, not a 'reflector'.
>
Any idea what it would take to add such functionality? Any place I can
start?
3) For some reason, the M4E files I create via openRTSP are not seekable,
> at least via VLC. Are there any known limitations for players tr
Hi.
I tried to search for this info in the documentation and in the mailing lists,
but couldn't find any.
1) Can the media server reflect RTP packets, like the Darwin Streaming Server
does, for example?
2) Can the media server wrap RTSP in HTTP, again like DSS does?
3) For some reason, the M4E files I create via openRTSP