Re: [Live-devel] Streaming G711 audio

2024-11-16 Thread d.gorde...@ngslab.ru

It works! Thanks a lot!



On Nov 15, 2024, at 2:53 AM, d.gorde...@ngslab.ru wrote:

Hi,

Now I have a streamer that streams video with AAC audio:

 audio_sink_ = MPEG4GenericRTPSink::createNew(*env_, 
audio_rtp_socket_.get(), payload,
   16000, "audio", "AAC-hbr", "", 1);

and it works well.

I also need to stream G711U or G711A audio. How can I do that?

Try the following:
SimpleRTPSink::createNew(*env_, audio_rtp_socket_.get(), payload, 8000,
"audio", "PCMU", 1);
for G711U.  (For G711A, replace "PCMU" with "PCMA".)
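
[Editor's note, not from the original thread: G.711 has static RTP payload type assignments in RFC 3551: PCMU is type 0 and PCMA is type 8, both with an 8000 Hz RTP clock. A hypothetical helper for choosing the values passed to SimpleRTPSink::createNew might look like this; the function name is illustrative and is not a LIVE555 API.]

```cpp
#include <cassert>
#include <cstring>

// Hypothetical helper: static RTP payload types for G.711 (RFC 3551).
// Returns -1 for codec names it does not know about.
int g711PayloadType(const char* codec) {
    if (std::strcmp(codec, "PCMU") == 0) return 0; // G.711 mu-law
    if (std::strcmp(codec, "PCMA") == 0) return 8; // G.711 A-law
    return -1;
}

// Both G.711 variants use an 8 kHz RTP timestamp clock.
const unsigned kG711ClockHz = 8000;
```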


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/


___
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel






[Live-devel] Streaming G711 audio

2024-11-14 Thread d.gorde...@ngslab.ru

Hi,

Now I have a streamer that streams video with AAC audio:

    audio_sink_ = MPEG4GenericRTPSink::createNew(*env_, audio_rtp_socket_.get(), payload,
                                                 16000, "audio", "AAC-hbr", "", 1);

and it works well.

I also need to stream G711U or G711A audio. How can I do that? I see there are
some classes derived from AudioRTPSink, but I am not sure which one I should use.



___
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel


Re: [Live-devel] Receiving slice-based frames

2025-03-05 Thread d.gorde...@ngslab.ru

It works, thanks a lot!



On Feb 22, 2025, at 1:10 AM, d.gorde...@ngslab.ru wrote:

Hello

I need to receive slice-based frames from a video server. As you know, these
frames arrive in slices; the last slice (packet) arrives with Mark=True in the
RTP header.

I have a class which receives the frames:

class VideoSink final : public MediaSink {}

It has an afterGettingFrame() method which receives the frames, and it works well
for ordinary frames. After getting the frames I save them in an archive and later
send them to a receiver. When I send them, I need to set the marker flag to tell
the receiver what each unit is: an intermediate slice, a last slice, or an
ordinary frame. But I know nothing about the marker bit, because I receive only
the video payload.

So, how can I get this flag, or is there another way to know which slice has
arrived?

In general, it's the job of the decoder to figure out how to render the 
incoming NAL units - which includes deciding when a received slice is the last 
slice in a video frame.

However, as a hint, you can use the value of the RTP packet's 'M' (i.e., 'marker') bit, 
which is (supposed to be) set for the last RTP packet of an 'access unit' (i.e., video 
frame).  I.e., you can call "RTPSource::curPacketMarkerBit()" to test this.
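
[Editor's note, not from the original thread: within LIVE555 the call above is all that is needed, but as a self-contained illustration of what that bit is, the 'M' bit is the most significant bit of the second byte of the fixed RTP header defined in RFC 3550. A receiver looking at raw packets could test it like this:]

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Returns true if the RTP marker ('M') bit is set in a raw RTP packet.
// Per RFC 3550, byte 1 of the fixed header holds M in its top bit,
// followed by the 7-bit payload type.
bool rtpMarkerBit(const uint8_t* packet, size_t len) {
    if (len < 12) return false; // shorter than the fixed RTP header
    return (packet[1] & 0x80) != 0;
}
```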


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/


___
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel






[Live-devel] Receiving slice-based frames

2025-02-21 Thread d.gorde...@ngslab.ru

Hello

I need to receive slice-based frames from a video server. As you know, these
frames arrive in slices; the last slice (packet) arrives with Mark=True in the
RTP header.

I have a class which receives the frames:

class VideoSink final : public MediaSink {}

It has an afterGettingFrame() method which receives the frames, and it works well
for ordinary frames. After getting the frames I save them in an archive and later
send them to a receiver. When I send them, I need to set the marker flag to tell
the receiver what each unit is: an intermediate slice, a last slice, or an
ordinary frame. But I know nothing about the marker bit, because I receive only
the video payload.


So, how can I get this flag, or is there another way to know which slice has
arrived?



___
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel


[Live-devel] Troubles with playing backward

2025-07-01 Thread d.gorde...@ngslab.ru
I recorded a number of frames and want to play them backward (forward playback
works). The problems: either the video does not play at all, or it stops after
10-20 seconds and runs at half speed.


Here is how I do the playing. I read the archive backward and find the previous
I-frame. Then I read the 2 frames stored before it, which are the SPS and PPS.
Then I build an SEI, give all of these frames the same timestamp (equal to the
I-frame's timestamp), set gettimeofday(&fPresentationTime, NULL), and send them.


What I tried: merging these frames into a single frame. In that case Wireshark
shows only SEI and FU-A Start:PPS, and I am not sure what that means. Does this
frame contain the PPS and SPS alongside the I-frame?


Why did I merge them? Because the RTP timestamps shown in Wireshark are
different. I set them to the same value in my code, but they come out different
anyway! When I play forward they are the same, which is fine; does backward
playing change something inside Live555? The RTP timestamps also keep increasing.
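
[Editor's note, not from the original thread: the backward-reading step described above can be sketched as follows, using a hypothetical Frame record with a NAL type tag; none of these names are LIVE555 types. The scan walks from the end of the archive back to the previous I-frame and includes the SPS/PPS stored just before it:]

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Hypothetical archive record (not a LIVE555 type).
enum class NalType { SPS, PPS, SEI, IFrame, PFrame };
struct Frame { NalType type; };

// Scan backward from index 'from' (exclusive) to the previous I-frame.
// Returns the index range [first, second] covering the SPS/PPS that
// precede the I-frame plus the I-frame itself, or {-1, -1} if none exists.
std::pair<int, int> previousIFrameGroup(const std::vector<Frame>& archive, int from) {
    for (int i = from - 1; i >= 0; --i) {
        if (archive[i].type == NalType::IFrame) {
            int first = i;
            // The SPS and PPS were written immediately before the I-frame.
            while (first > 0 && (archive[first - 1].type == NalType::SPS ||
                                 archive[first - 1].type == NalType::PPS))
                --first;
            return {first, i};
        }
    }
    return {-1, -1};
}
```

To play backward, each returned group would be sent forward-in-decode-order while the presentation times assigned to successive groups decrease.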


___
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel