> I am writing an application to stream live Audio and Video via RTSP,
> but am having some trouble synchronizing the two output streams.

This problem is usually caused by not setting "fPresentationTime"
properly in your media source objects. For both audio and video,
"fPresentationTime" should be set to an accurate wall-clock time
(i.e., the result of calling "gettimeofday()"), taken from the same
clock for both streams; RTCP then uses these presentation times to
synchronize the streams at the receiver.
Dear readers,
I am writing an application to stream live Audio and Video via RTSP,
but am having some trouble synchronizing the two output streams.
When streaming just an audio stream by itself, the output is great.
(Note: I am using an AudioInputDevice and encoding it to an MP3
stream.)
The problem only appears once the video stream is added alongside it.
> I use live555 as a media server to send H.264 data as a live stream,
> not a file stream. When the number of clients is less than sixteen, it
> works well. But if the number of clients is more than sixteen, the task
> scheduler cannot keep up in real time.

These sorts of scaling problems are often caused by doing too much work
per client inside the single-threaded event loop. For example, if each
client gets its own input source, you encode (and read) the data once
per client. If every client can receive the same stream, make sure your
"OnDemandServerMediaSubsession" subclass passes "reuseFirstSource" as
True, so a single input source is shared by all connected clients.
I use live555 as a media server to send H.264 data as a live stream,
not a file stream. When the number of clients is less than sixteen, it
works well. But if the number of clients is more than sixteen, the task
scheduler cannot keep up in real time.
So I want to know whether doEventLoop() can handle this many clients in
real time.