synchronized with the video, once you later combine them in a
> single “ServerMediaSession”.
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> ___
> live-devel mailing list
> live-devel@lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>
>
--
Xingjun Chen
M.S. Degree *Earned*
Electrical and Computer Engineering
University of Arizona, Tucson AZ 85721
Hi Ross and all other developers,
I recently wrote my own "DeviceSource" as the video source: it opens the
read side of a pipe and reads the video stream from there. The write side of
the pipe is fed by my OMX callback function, which keeps pushing raw
H.264 byte-stream data frame by frame; meanwhile
> in network byte order).
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
>
>
>
--
Xingjun Chen
arm-none-linux-gnueabi-gcc for the C code, and arm-none-linux-gnueabi-g++
for the C++ code.
--
Xingjun Chen