If you look at the ONVIF Device Test Tool (which you can only acquire if your company has a paying ONVIF membership), there is a test for backchannel using RTP/RTSP/TCP and UDP. You can find it under the Diagnostics tab > Real Time Streaming > Audio Backchannel > Unicast > Backchannel - G711 (RTP/RTSP/TCP || UDP).

The test sets up everything required to send audio and then plays a 10-second audio file. This tool is almost mandatory if you plan on integrating a backchannel into your code, not only because you can see which SOAP data is sent to the camera, but also because so many cameras don't work properly that you need to check support up front to be able to rule out glitches on the camera side.

On my two cameras I can play backchannel audio with UDP but not with TCP: no audio comes out of the speaker even though it is sent over the network. I wonder, if what you say is true, why they implemented a TCP test for backchannel. You could say that UDP uses a separate socket so that case is different, but for TCP? ONVIF is the standard defining the SOAP messages etc.; the protocol for communication is still RTSP.

Note that ONVIF backchannel uses the RTSP Require header to check for backchannel support on the camera side. Unfortunately, some cameras reject the Require header, so you need a workaround for that too. Try sending Require: someRandomString; if you get an OK reply, the camera is not checking the Require field at all.
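
In case it is useful, this is the kind of quick probe I mean: send a DESCRIBE with a bogus Require header over a plain TCP socket and look at the status line. This is an untested sketch; the camera IP, port, and URL are placeholders. Per the RTSP spec, a camera that really checks Require should answer "551 Option not supported" here.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
  // Connect to the camera's RTSP port (placeholder address):
  int fd = socket(AF_INET, SOCK_STREAM, 0);
  sockaddr_in cam{};
  cam.sin_family = AF_INET;
  cam.sin_port = htons(554);
  inet_pton(AF_INET, "192.168.1.64", &cam.sin_addr);
  if (connect(fd, (sockaddr*)&cam, sizeof cam) < 0) { perror("connect"); return 1; }

  // DESCRIBE with a Require value no camera can possibly support:
  const char* req =
      "DESCRIBE rtsp://192.168.1.64/onvif/media RTSP/1.0\r\n"
      "CSeq: 1\r\n"
      "Require: someRandomString\r\n"
      "\r\n";
  send(fd, req, strlen(req), 0);

  // A "200 OK" status line means the camera ignores the Require field;
  // "551 Option not supported" means it actually checks it.
  char buf[4096];
  ssize_t n = recv(fd, buf, sizeof buf - 1, 0);
  if (n > 0) { buf[n] = '\0'; printf("%s\n", buf); }
  close(fd);
  return 0;
}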


On 02-Jun-16 at 11:45 PM, Ross Finlayson wrote:
There’s really nothing in the RTSP specification (note, btw, that ‘ONVIF’ is 
not the RTSP specification) that provides for a RTSP server receiving media at 
the same time that it sends it.  And as people have noted, there’s no support 
in our RTSP server implementation for this.  So I suggest that you don’t use 
RTSP (and/or our RTSP server or client implementation) at all.

However, it’s possible (and quite easy) to use the “LIVE555 Streaming Media” 
code to implement ‘full-duplex’ media (and by ‘full-duplex’, I presume you mean 
using the same socket for both sending and receiving media).  To do this, just 
create a single “Groupsock” object for the socket, but use this *one* 
“Groupsock” object when creating both a “RTPSink” (subclass) object (for 
transmitting), and a “RTPSource” (subclass) object (for receiving).

Similarly, for RTCP, you can either create a new socket (on even port #+1) and 
“Groupsock”, and create a single “RTCPInstance” for this, or else have this 
(single) “RTCPInstance” use the same socket and “Groupsock” that you used for 
RTP.
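
For illustration, a rough and untested sketch of that setup, assuming G.711/PCMU audio, the "Simple" RTP classes, and a hypothetical unicast peer (the address, ports, and CNAME are placeholders, and the Groupsock constructor arguments differ slightly between LIVE555 versions):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // One socket (one "Groupsock") used for both sending and receiving RTP:
  struct in_addr peerAddr;
  peerAddr.s_addr = our_inet_addr("192.168.1.64"); // placeholder peer address
  const unsigned char ttl = 255;
  Groupsock rtpGroupsock(*env, peerAddr, Port(5004), ttl);

  // G.711 mu-law as an example: payload type 0, 8 kHz, mono.
  RTPSink* rtpSink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
                                              0, 8000, "audio", "PCMU", 1);
  RTPSource* rtpSource = SimpleRTPSource::createNew(*env, &rtpGroupsock,
                                                    0, 8000, "audio/PCMU");

  // RTCP: here on a second Groupsock at port+1; alternatively reuse rtpGroupsock.
  Groupsock rtcpGroupsock(*env, peerAddr, Port(5005), ttl);
  unsigned char const* cname = (unsigned char const*)"fullduplex-demo"; // placeholder
  RTCPInstance* rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock,
                                               64 /*kbps, estimated session bandwidth*/,
                                               cname, rtpSink, rtpSource);
  // (The RTCPInstance runs automatically once created.)

  // From here: call rtpSink->startPlaying(yourAudioSource, ...) to transmit,
  // and hand rtpSource to a MediaSink of your own to consume received audio.
  env->taskScheduler().doEventLoop();
  return 0;
}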


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/


_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel

