> The architecture we are building is based on a Helix Media Server that will 
> be used to broadcast the stream to PCs, phones, etc. 
> On the iPhone side, we're building an app that will access the camera 
> and then, perhaps using the Live555 libraries, stream it to the Helix Media 
> Server (H.264/AAC over RTP). A good example of what we are trying to achieve 
> is the Livu app (http://stevemcfarlin.com/livu/index.html).
> 
> I've compiled the Live555 libraries for the iPhone and I've tried the test 
> programs (particularly testH264VideoStreamer and testMPEG4VideoToDarwin). 
> testMPEG4VideoToDarwin seems to be very close to what we want to do, but I was 
> wondering if someone had some advice on how to achieve this goal. 
> Are we on the right path or not?

Yes, you probably are, provided that the protocol that the "Helix Media Server" 
uses to receive incoming data is the same as (or at least similar to) the 
protocol used by the "Darwin Streaming Server".

Personally, I dislike the idea of clients 'pushing' data into servers (see 
<http://lists.live555.com/pipermail/live-devel/2011-August/013783.html> and 
<http://lists.live555.com/pipermail/live-devel/2011-August/013785.html>), but I 
recognize that there are legacy servers out there that support that model.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
