Hi,

I'm trying to stream Opus-encoded live audio from my server via live555, and I'm 
a bit lost as to how to implement it. I tried to orient myself using the live555 
sample code and the way streaming of Ogg files is done, but at some point I lose 
track of the code. Is there a sample I can use as a kind of guidance?

So far, I have tried an approach similar to what I already have for streaming 
live H.264/H.265 video: I derive my AudioSubsession class from 
OnDemandServerMediaSubsession, whose createNewStreamSource() creates a custom 
Opus-specific source derived from FramedSource. I also overrode the MIMEtype() 
method to return "audio/OPUS". I'm not sure, though, how to fill in the data in 
my doGetNextFrame() override. So far, I copy only a single available Opus packet 
to fTo, set fFrameSize accordingly, and then return; otherwise, multiple packets 
could not be delimited later on, since there is no such thing as a frame start 
code in Opus, as far as I know.

I'm not sure whether this will work, but currently I'm stuck on an unclear 
point at an even earlier stage: when my client tries to connect, I can see in 
the server code that RTSPServer tries to get an SDP description from the 
session instance (in RTSPServer::RTSPClientConnection::handleCmd_DESCRIBE), 
which returns NULL and causes the server to return a 404. I'm not sure how and 
where to implement delivering a correct SDP for my case, or whether this would 
happen automatically, just as it does for H.264/H.265 video, as long as I use 
the right base classes.
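
For reference, my subsession class currently looks roughly like the sketch 
below. I'm assuming (but am not sure) that the SDP lines for the DESCRIBE 
response would be generated from the RTPSink returned by createNewRTPSink(), 
as they are for H.264/H.265; the SimpleRTPSink parameters are my best guess 
for Opus (48 kHz RTP clock, 2 channels, payload format name "OPUS"):

#include "OnDemandServerMediaSubsession.hh"
#include "SimpleRTPSink.hh"
#include "OpusLiveSource.hh" // my own source class from the sketch above

class OpusAudioSubsession : public OnDemandServerMediaSubsession {
public:
  static OpusAudioSubsession* createNew(UsageEnvironment& env,
                                        Boolean reuseFirstSource) {
    return new OpusAudioSubsession(env, reuseFirstSource);
  }

protected:
  OpusAudioSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 64; // kbps; rough estimate for my encoder settings
    return OpusLiveSource::createNew(envir());
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    // One Opus packet per RTP packet, so no extra framing should be needed.
    return SimpleRTPSink::createNew(envir(), rtpGroupsock,
                                    rtpPayloadTypeIfDynamic,
                                    48000, "audio", "OPUS",
                                    2 /*numChannels*/,
                                    False /*allowMultipleFramesPerPacket*/);
  }
};

In particular, I'm not sure whether SimpleRTPSink with these parameters is the 
right choice for Opus at all, or whether a dedicated sink class is needed.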

Thanks,
Roland
