Hi,

I can run the live555ProxyServer, which, if I understand correctly, is a 
unicast/multicast receiver and a unicast streamer.

I'm hoping I might get a few pointers on how to set up a proxy server that 
acts as a multicast streamer.
I aim to receive a single live camera source via unicast, then stream that 
data back out via multicast.

With testH264VideoStreamer I have the multicast streaming part tested and 
working.
The issue I'm having is altering that example to take a live RTSP source URL 
rather than an input file.

I've read a few posts here that talk about subclassing DeviceSource.
I haven't done this yet; could you explain why it might be required,
rather than just relying on RTPSource as the example clients do?
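
For reference, from reading DeviceSource.cpp in the liveMedia directory, this 
is my current (untested) understanding of what such a subclass would look 
like; the class and member names here are my own invention, not live555 API:

#include "FramedSource.hh"
#include <string.h>

// My untested sketch of the DeviceSource pattern, modelled on
// liveMedia/DeviceSource.cpp. It holds no queue; it just copies whatever
// frame is handed to it while the downstream object is waiting.
class LiveCameraSource: public FramedSource {
public:
  static LiveCameraSource* createNew(UsageEnvironment& env) {
    return new LiveCameraSource(env);
  }

  // To be called from the receiving side (e.g. DummySink::afterGettingFrame())
  // each time a new frame arrives from the camera:
  void deliverFrame(u_int8_t const* data, unsigned size,
                    struct timeval presentationTime) {
    // If the downstream object hasn't asked for data yet, the frame is dropped:
    if (!isCurrentlyAwaitingData()) return;

    if (size > fMaxSize) { // truncate if the reader's buffer is too small
      fNumTruncatedBytes = size - fMaxSize;
      size = fMaxSize;
    }
    memmove(fTo, data, size);
    fFrameSize = size;
    fPresentationTime = presentationTime;

    // Tell the downstream object that a frame is now available:
    FramedSource::afterGetting(this);
  }

private:
  LiveCameraSource(UsageEnvironment& env): FramedSource(env) {}

  virtual void doGetNextFrame() {
    // Nothing to do here in this sketch: we simply wait until deliverFrame()
    // is next called with data from the camera.
  }
};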

This is how I think the multicast proxy should work:

Camera -> Source_1 (RTPSource) -> Source_2 (filter) -> Sink_1
    ::::: I should be able to use testRTSPClient or equivalent here (see the
          snippet just below).
              -> Magic ->
Source_3 (built from Sink_1) -> Source_4 (filter) -> Sink_2
    ::::: I should be able to use test*Streamer here.
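
For the first leg, the wiring I'm relying on is the existing code in 
testRTSPClient's continueAfterSETUP() (as I read it), where readSource() 
yields the RTPSource and DummySink is my Sink_1:

// From testRTSPClient's continueAfterSETUP(), as I read it:
scs.subsession->sink
  = DummySink::createNew(env, *scs.subsession, rtspClient->url());
// ...
scs.subsession->sink->startPlaying(*(scs.subsession->readSource()),
                                   subsessionAfterPlaying, scs.subsession);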

I think I'm currently falling over as I attempt the Magic in the middle:
passing the data received from the camera (Sink_1) to the streamer (Source_3).

This magic must happen somewhere within the live555ProxyServer code; are you 
able to point me at the relevant function, or explain the process if I have 
misunderstood?

When outputting to file via the debugFileSink line, I do get output, but it 
isn't playable in VLC, which suggests I'm either missing header data or the 
output is just nonsense.
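
One guess on the file side: H264VideoFileSink::createNew() takes an optional 
sPropParameterSetsStr argument that I'm currently not passing, so presumably 
no SPS/PPS ever get written ahead of the frames. Something like this (with 
'subsession' being the MediaSubsession I'm receiving from) might be what's 
missing:

// Pass the stream's SPS/PPS (from the SDP "sprop-parameter-sets" attribute)
// so that the file sink can write them before the frame data; without them
// the raw .264 file carries no decoder configuration:
debugFileSink
  = H264VideoFileSink::createNew(*env, "test.264",
                                 subsession.fmtp_spropparametersets());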

Here is what I've tried thus far:

Following how I think the system should work, I've modified Sink_1, which in 
testRTSPClient corresponds to 'class DummySink'.
In particular, I altered the afterGettingFrame() function so that it now 
contains:



// ...

    // Get the DummySink bytes into a Source object.
    ByteStreamMemoryBufferSource* memSource
      = ByteStreamMemoryBufferSource::createNew(envir(), fReceiveBuffer, frameSize,
                                                False, 0, durationInMicroseconds);
    if (memSource == NULL) {
      envir() << "Unable to load from memory as a byte-stream source\n";
      exit(1);
    }
    FramedSource* videoES = memSource;

    // Apply an H.264 filter to the FramedSource.
    // I've tried each of these, and also passing memSource into the sink's
    // startPlaying() directly:
    //H264VideoStreamFramer* videoSource
    //  = H264VideoStreamFramer::createNew(envir(), videoES);
    H264VideoStreamDiscreteFramer* videoSource
      = H264VideoStreamDiscreteFramer::createNew(envir(), videoES);
    // I also tried H264VideoRTPSource, but I don't believe that is correct here,
    // as the DummySink class isn't a subclass of RTPSink, so I can't pass in a
    // valid groupsock.

    // Output to file or streamer:
#if CREATE_DEBUG_FILE
    // Output to a file on disk, to confirm I'm creating valid frames:
    debugFileSink->startPlaying(*videoSource, afterPlaying, &cd);
#else
    // Sink to be used as the streamer's source input:
    sink_1_converter_sink->startPlaying(*videoSource, streamerAfterPlaying, &cd);
#endif

// ...

My thinking here is that, having just received a frame at the DummySink, I 
can pass it along to the streamer (or my debug file) straight away.
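
What I'm now wondering is whether the Magic should instead be a single 
long-lived bridge source that DummySink pushes each frame into, rather than 
constructing a new ByteStreamMemoryBufferSource per frame. Using my 
hypothetical LiveCameraSource sketch from above (bridgeSource being a global 
pointer to it, created once at startup), afterGettingFrame() would reduce to 
something like:

void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned /*durationInMicroseconds*/) {
  // Hand the just-received frame (one NAL unit, no start code) to the
  // long-lived bridge source that the streamer side is already playing from:
  bridgeSource->deliverFrame(fReceiveBuffer, frameSize, presentationTime);

  // Then ask for the next frame, as DummySink normally does:
  continuePlaying();
}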

In my main function I have:

//...
debugFileSink = H264VideoFileSink::createNew(*env, "test.264");
// Only one openURL() call, keeping things as simple as possible:
openURL(*env, argv[0], <url string>);
//...


The rest of the code in my main function is basically the main function from 
testH264VideoStreamer, with sink_1_converter_sink passed to the 
PassiveServerMediaSubsession.
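
If that bridge idea is right, I imagine the streamer side in main() would 
stay essentially testH264VideoStreamer's main(), with the bridge source (plus 
a discrete framer, since it delivers one NAL unit at a time) in place of the 
file source, replacing sink_1_converter_sink entirely. A sketch of my 
understanding, not working code:

// Multicast destination, ports and TTL, as in testH264VideoStreamer:
struct in_addr destinationAddress;
destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);

const Port rtpPort(18888);
const Port rtcpPort(18889);
const unsigned char ttl = 255;

Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
rtpGroupsock.multicastSendOnly(); // we're a SSM source
Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
rtcpGroupsock.multicastSendOnly();

RTPSink* videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);
// (RTCPInstance and the RTSP server with PassiveServerMediaSubsession are
// then set up exactly as in testH264VideoStreamer's main().)

// Instead of ByteStreamFileSource -> H264VideoStreamFramer, feed the sink
// from the bridge via a discrete framer (bridgeSource is the same global
// LiveCameraSource* used in afterGettingFrame above):
bridgeSource = LiveCameraSource::createNew(*env);
H264VideoStreamDiscreteFramer* framer
  = H264VideoStreamDiscreteFramer::createNew(*env, bridgeSource);
videoSink->startPlaying(*framer, NULL, NULL);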

Am I on the right track?

Any help is greatly appreciated.

Mark Hinchcliffe
Solutions Architect
Scientific Games
