Eventually I want to write a codec; the codec itself is actually the easiest part of getting something running, and it already works in MATLAB for still frames.

I figure the first step is to work out how to integrate into someone else's framework. I've looked at GStreamer, and it seems simple enough, but I just can't seem to get a new element working there. So I'm looking into the LIVE555 framework, mostly because I want to encode and send data over RTP, which appears to be well supported here.

My code takes a YUV image and filters it (this works in MATLAB); I then hope to encode the result and send it via RTP to a client that does the reverse.

For a simple proof of concept, I would like to just take in a video stream, increment the chrominance plane, and write that back out to a file. If I can do that much in source code, I know I can implement the rest of my codec.

Is there an example of this? I've started down the path of using the MPEG1or2VideoRTPSink* files as a reference. Is this the right path to take, or is there a better way?
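For what it's worth, the pattern I'm imagining is a subclass of liveMedia's FramedFilter that sits between the input source and the sink, modifying each frame as it passes through. A hedged skeleton (names like ChromaShiftFilter are my own invention; the FramedFilter/FramedSource members are from the liveMedia headers, and this won't compile without them):

```cpp
#include "FramedFilter.hh"  // from live555's liveMedia library

// Hypothetical filter: receives frames from an upstream source,
// modifies them in place, and passes them downstream unchanged in size.
class ChromaShiftFilter : public FramedFilter {
public:
    static ChromaShiftFilter* createNew(UsageEnvironment& env,
                                        FramedSource* inputSource) {
        return new ChromaShiftFilter(env, inputSource);
    }

protected:
    ChromaShiftFilter(UsageEnvironment& env, FramedSource* inputSource)
        : FramedFilter(env, inputSource) {}

private:
    virtual void doGetNextFrame() {
        // Ask the upstream source to deliver a frame into our caller's buffer:
        fInputSource->getNextFrame(fTo, fMaxSize,
                                   afterGettingFrame, this,
                                   FramedSource::handleClosure, this);
    }

    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned durationInMicroseconds) {
        ChromaShiftFilter* filter = (ChromaShiftFilter*)clientData;
        // Modify the frame data (fTo, frameSize bytes) in place here,
        // e.g. increment the chrominance plane, then complete delivery:
        filter->fFrameSize = frameSize;
        filter->fNumTruncatedBytes = numTruncatedBytes;
        filter->fPresentationTime = presentationTime;
        filter->fDurationInMicroseconds = durationInMicroseconds;
        FramedSource::afterGetting(filter);
    }
};
```

If this is roughly how a filter is supposed to be wired up, pointing me at an existing example in the source tree would be a big help.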
_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel