Hi! I'm trying to build a streaming application.
I have an application that uses OpenCV to capture frames from any
connected camera. This runs in a thread that feeds a circular buffer
(guarded by semaphores). I then read from that buffer, do some
processing, and feed a second circular buffer that stores the frames
(in OpenCV's image representation). Now I want to read each frame
from that second buffer, encode it into a JPEG image (this can be
done with libjpeg or with OpenCV, so no problem there), and stream
the resulting JPEG data (a memory buffer, not a file).
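(For reference, a minimal sketch of the in-memory encoding step using
OpenCV's cv::imencode. The quality value and the helper name
encodeFrameToJpeg are illustrative, and the cv::IMWRITE_JPEG_QUALITY
constant assumes OpenCV 3.x or later; older releases spell it
CV_IMWRITE_JPEG_QUALITY.)

#include <opencv2/opencv.hpp>
#include <vector>

// Encode one captured frame into an in-memory JPEG buffer (no file on disk).
std::vector<unsigned char> encodeFrameToJpeg(const cv::Mat& frame, int quality = 80)
{
    std::vector<unsigned char> jpegBuf;
    std::vector<int> params;
    params.push_back(cv::IMWRITE_JPEG_QUALITY);  // assumes OpenCV 3.x+ constant name
    params.push_back(quality);

    // cv::imencode compresses the frame and writes the JPEG bytes into jpegBuf.
    if (!cv::imencode(".jpg", frame, jpegBuf, params)) {
        jpegBuf.clear();  // encoding failed; hand back an empty buffer
    }
    return jpegBuf;
}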
Can this be done with live555? (first question)
I have read the Elphel code and I am trying to understand how the
library works. Do I need to create a VideoSource subclass (or
something similar), or is there something I can use to "push" these
JPEG images for the streaming part?
Is there a guideline or example where I can find information on how
to do this?
Yes, see http://www.live555.com/liveMedia/faq.html#liveInput
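(The "liveInput" FAQ entry describes modeling your input on
liveMedia's DeviceSource: a FramedSource subclass whose
doGetNextFrame() copies one frame into fTo and then calls
afterGetting(). Below is a rough sketch under that assumption;
popJpegFrame() is a hypothetical stand-in for the circular buffer
described above, and for RTP/JPEG streaming specifically you would
subclass JPEGVideoSource instead, supplying type(), qFactor(),
width() and height(), as the Elphel example does.)

#include "FramedSource.hh"
#include <sys/time.h>
#include <cstring>
#include <vector>

// Hypothetical accessor for the application's JPEG circular buffer;
// assumed to block until a compressed frame is available, and to
// return an empty vector when the capture thread shuts down.
extern std::vector<unsigned char> popJpegFrame();

// Sketch of a FramedSource subclass following the DeviceSource
// pattern referenced by the "liveInput" FAQ entry.
class MemoryJPEGSource : public FramedSource {
public:
    static MemoryJPEGSource* createNew(UsageEnvironment& env) {
        return new MemoryJPEGSource(env);
    }

protected:
    MemoryJPEGSource(UsageEnvironment& env) : FramedSource(env) {}

private:
    virtual void doGetNextFrame() {
        // Pull one already-encoded JPEG frame out of the circular buffer.
        std::vector<unsigned char> jpeg = popJpegFrame();
        if (jpeg.empty()) {            // no more data: signal end of stream
            handleClosure(this);
            return;
        }

        // Copy at most fMaxSize bytes into the buffer the sink gave us (fTo).
        if (jpeg.size() > fMaxSize) {
            fFrameSize = fMaxSize;
            fNumTruncatedBytes = jpeg.size() - fMaxSize;
        } else {
            fFrameSize = jpeg.size();
            fNumTruncatedBytes = 0;
        }
        memcpy(fTo, &jpeg[0], fFrameSize);

        // Timestamp the frame and tell live555 the data is ready.
        gettimeofday(&fPresentationTime, NULL);
        FramedSource::afterGetting(this);
    }
};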
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/