[Live-devel] about DeviceSource for multi channel encoder

2012-07-07 Thread reply2010
Hi, experts,

I modified DeviceSource.cpp to stream from a live encoder, but I find that triggerEvent() does not end up calling the deliverFrame0() function that I registered with
  eventTriggerId = env.taskScheduler().createEventTrigger(deliverFrame0);
What am I doing wrong?

Because my encoder has multiple channels, I had to change the static member variable eventTriggerId to a non-static member.

Here is my code:
//***
#include "DeviceSource.hh"
#include <GroupsockHelper.hh> // for "gettimeofday()"

DeviceSource*
DeviceSource::createNew(UsageEnvironment& env /*,
   DeviceParameters params*/) {
  return new DeviceSource(env/*, params*/);
}
// for non-static eventTriggerId:
//EventTriggerId DeviceSource::eventTriggerId = 0;
//unsigned DeviceSource::referenceCount = 0;


DeviceSource::DeviceSource(UsageEnvironment& env/*,
  DeviceParameters params*/)
  : FramedSource(env)/*, fParams(params)*/ {
//
/* 
 if (referenceCount == 0) {
// Any global initialization of the device would be done here:
//%%% TO BE WRITTEN %%%
  }
  ++referenceCount;
*/
//***

  // Any instance-specific initialization of the device would be done here:
  //%%% TO BE WRITTEN %%%

  // We arrange here for our "deliverFrame" member function to be called
  // whenever the next frame of data becomes available from the device.
  //
  // If the device can be accessed as a readable socket, then one easy way to do this is using a call to
  // envir().taskScheduler().turnOnBackgroundReadHandling( ... )
  // (See examples of this call in the "liveMedia" directory.)
  //
  // If, however, the device *cannot* be accessed as a readable socket, then instead we can implement it using 'event triggers':
  // Create an 'event trigger' for this device (if it hasn't already been done):
  if (eventTriggerId == 0) {
    eventTriggerId = env.taskScheduler().createEventTrigger(deliverFrame0);
  }
}

DeviceSource::~DeviceSource() {
  // Any instance-specific 'destruction' (i.e., resetting) of the device would be done here:
  //%%% TO BE WRITTEN %%%

  //*
  /*--referenceCount;
  if (referenceCount == 0) {*/
    // Any global 'destruction' (i.e., resetting) of the device would be done here:
    //%%% TO BE WRITTEN %%%

    // Reclaim our 'event trigger':
    envir().taskScheduler().deleteEventTrigger(eventTriggerId);
    eventTriggerId = 0;
  //* }
}

void DeviceSource::doGetNextFrame() {

}

void DeviceSource::deliverFrame0(void* clientData) {

 ((DeviceSource*)clientData)->deliverFrame();
  return;
}

void DeviceSource::deliverFrame() {
  // This function is called when new frame data is available from the device.
  // We deliver this data by copying it to the 'downstream' object, using the following parameters (class members):
  // 'in' parameters (these should *not* be modified by this function):
  //   fTo: The frame data is copied to this address.
  //     (Note that the variable "fTo" is *not* modified.  Instead,
  //     the frame data is copied to the address pointed to by "fTo".)
  //   fMaxSize: This is the maximum number of bytes that can be copied
  //     (If the actual frame is larger than this, then it should
  //     be truncated, and "fNumTruncatedBytes" set accordingly.)
  // 'out' parameters (these are modified by this function):
  //   fFrameSize: Should be set to the delivered frame size (<= fMaxSize).
  //   fNumTruncatedBytes: Should be set iff the delivered frame would have been
  //     bigger than "fMaxSize", in which case it's set to the number of bytes
  //     that have been omitted.
  //   fPresentationTime: Should be set to the frame's presentation time
  //     (seconds, microseconds).  This time must be aligned with 'wall-clock time' - i.e., the time that you would get
  //     by calling "gettimeofday()".
  //   fDurationInMicroseconds: Should be set to the frame's duration, if known.
  //     If, however, the device is a 'live source' (e.g., encoded from a camera or microphone), then we probably don't need
  //     to set this variable, because - in this case - data will never arrive 'early'.
  // Note the code below.

//***
  if (!isCurrentlyAwaitingData()) {return;} // we're not ready for the data yet

  // Deliver the data here:
  if (newFrameSize > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = newFrameSize - fMaxSize;
  } else {
    fFrameSize = newFrameSize;
  }
  //gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead.
  // If the device is *not* a 'live source' (e.g., it comes instead from a file or buffer), then set "fDurationInMicroseconds" here.
  memmove(fTo, newFrameDataStart, fFrameSize);
  requestData = true;

  // After delivering the data, inform the reader that it is now available:
  FramedSource::afterGetting(this);
}
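
For illustration, here is a rough sketch (not part of the original post) of how each channel's encoder thread might signal its own DeviceSource instance, assuming that eventTriggerId is now a non-static member created in the constructor above, and that a new member function signalNewFrameData() is declared in DeviceSource.hh. The key point is that triggerEvent() must be passed that particular instance's trigger ID together with its 'this' pointer, so that deliverFrame0() is later invoked on the correct channel's object:

// Hypothetical helper (declared in DeviceSource.hh as "void signalNewFrameData();"),
// called from the encoder's capture thread whenever a new frame for this channel is ready.
void DeviceSource::signalNewFrameData() {
  // triggerEvent() may be called from a thread other than the one running the
  // event loop; it schedules deliverFrame0(this) to run later, inside the
  // event-loop thread.
  envir().taskScheduler().triggerEvent(eventTriggerId, this);
}

The encoder callback for channel N would then copy the new frame into that channel's buffer and call something like channelSource[N]->signalNewFrameData() (channelSource being a hypothetical array of per-channel DeviceSource pointers).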

[Live-devel] testOnDemandRTSPServer for pipeline

2012-09-10 Thread reply2010
Hi, everyone,

I replaced the video file with a pipe (FIFO) as the input to testOnDemandRTSPServer.cpp, but I find that I cannot play the stream in an RTSP client. However, I can play it with testH264VideoStreamer.cpp using the same approach.

I studied the code in the two .cpp files and found that the "start playing" code is similar in both: each ends up calling startPlaying() on an H264VideoRTPSink. Yet in practice I can see video in the RTSP client when I use testH264VideoStreamer.cpp, and I see nothing when I use testOnDemandRTSPServer.cpp.

I want to stream from a pipe with testOnDemandRTSPServer. Could any expert shed some light on why the one case plays but not the other, and how to revise the code in testOnDemandRTSPServer.cpp?

Any comments are welcome. Thanks a lot.
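
For reference, a minimal sketch (not from the original message) of how testOnDemandRTSPServer.cpp's existing H.264-file branch might be pointed at a named pipe instead of a regular file. The stream name and the FIFO path "/tmp/h264fifo" are hypothetical; the important detail is reuseFirstSource = True, because a pipe can only be read once, so every client must share the single reader:

// Follows the same pattern that testOnDemandRTSPServer.cpp already uses for its
// other streams; "env" and "rtspServer" are the variables defined in that file.
char const* streamName = "pipeStream";        // hypothetical stream name
char const* inputFileName = "/tmp/h264fifo";  // hypothetical named pipe carrying an H.264 elementary stream
Boolean reuseFirstSource = True;              // share one reader across all clients

ServerMediaSession* sms
  = ServerMediaSession::createNew(*env, streamName, streamName,
                                  "Session streamed from a pipe");
sms->addSubsession(H264VideoFileServerMediaSubsession
                   ::createNew(*env, inputFileName, reuseFirstSource));
rtspServer->addServerMediaSession(sms);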



[Live-devel] how to play rtsp://xxx.xxx.xxx.xxx:8554/filename.h264

2012-09-12 Thread reply2010
Hi everyone,

I want to play any file in a directory using live555's testOnDemandRTSPServer. However, there are many files in this directory, and the number of files keeps growing. I would have to add a great many H264VideoFileServerMediaSubsession objects to testOnDemandRTSPServer.cpp in order to be able to play any of them. Could anyone tell me how to play an arbitrary file without adding a subsession for each one? I would like to know whether I can request a stream with an RTSP URL that includes the file name, such as rtsp://xxx.xxx.xxx.xxx:8554/filename.h264.

Any comment would be appreciated.


Re: [Live-devel] how to play rtsp://xxx.xxx.xxx.xxx:8554/filename.h264

2012-09-12 Thread reply2010
Dear Ross Finlayson,

Thanks. I have tested live555MediaServer, and it works well. However, I find that I cannot use 'trick play' with H.264. Can live555MediaServer do trick play on an .h264 file? As far as I know, .ts files support trick play; I would like to know whether .h264 files do as well. Thank you again.

At 2012-09-12 21:08:49, "Ross Finlayson" wrote:

  I want to play any file in a directory using live555's testOnDemandRTSPServer. However, there are many files in this directory, and the number of files keeps growing. I would have to add a great many H264VideoFileServerMediaSubsession objects to testOnDemandRTSPServer.cpp in order to be able to play any of them. Could anyone tell me how to play an arbitrary file without adding a subsession for each one?


I suggest using the "LIVE555 Media Server" (see <http://www.live555.com/mediaServer/>), instead of "testOnDemandRTSPServer".  The "LIVE555 Media Server" will automatically stream any (valid) file that's in its directory, without the programmer having to explicitly add a "ServerMediaSubsession".


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
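
For reference, the "LIVE555 Media Server" achieves this by subclassing RTSPServer and overriding lookupServerMediaSession(), so that a ServerMediaSession is created on demand from whatever stream name appears in the client's URL. Below is a rough sketch of that technique (not code from this thread); "MyDynamicRTSPServer" is a hypothetical subclass, and the synchronous signature shown is the one used by older live555 releases - newer releases use an asynchronous variant, so check your RTSPServer.hh:

// Hypothetical subclass: the override creates a session the first time a
// stream name is requested, here only for H.264 elementary-stream files.
ServerMediaSession* MyDynamicRTSPServer
::lookupServerMediaSession(char const* streamName) {
  // Reuse a session if one was already created for this stream name:
  ServerMediaSession* sms = RTSPServer::lookupServerMediaSession(streamName);
  if (sms != NULL) return sms;

  // Otherwise, create one on the fly - provided a file of that name exists:
  FILE* fid = fopen(streamName, "rb");
  if (fid == NULL) return NULL; // no such file => the client gets an error
  fclose(fid);

  sms = ServerMediaSession::createNew(envir(), streamName, streamName,
                                      "Session streamed on demand");
  Boolean reuseFirstSource = True;
  sms->addSubsession(H264VideoFileServerMediaSubsession
                     ::createNew(envir(), streamName, reuseFirstSource));
  addServerMediaSession(sms);
  return sms;
}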



[Live-devel] rtsp server for live audio

2012-10-25 Thread reply2010
Hi, respected live555 experts,

I have two questions:

1. Can live555's RTSP server, or another application in live555, stream live audio from a source such as a PC microphone? I want to stream live audio from my PC's microphone; the audio format could be AMR, AAC, G.711, etc.

2. The openRTSP documentation says the following:

Extracting a single stream (to 'stdout')
To record only the audio stream from a session, use the "-a" command-line option. (Similarly, to record only the video stream, use the "-v" option.) In this case, the output audio (or video) stream will be written to 'stdout', rather than to a file.

I want to know what 'stdout' is here: a file, a pipe, or a device?

Thanks a lot. Any reply would be appreciated.

Best regards,

jegger


Re: [Live-devel] rtsp server for live audio

2012-10-26 Thread reply2010
Hi,

I found that WindowsAudioInputDevice may be a way to stream live audio, but I could not find any documentation or examples showing how to use it. Could anyone give me an example of using WindowsAudioInputDevice?

Thanks
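
For reference, a minimal sketch (not from this thread) of how WindowsAudioInputDevice is typically used: the library supplies the implementation behind liveMedia's abstract AudioInputDevice interface, whose static createNew() returns a FramedSource that reads PCM from a sound-card input. The input port number and audio parameters below are assumptions for illustration:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

// Sketch only: input port 0, 16-bit, mono, 8 kHz are assumed values.
// Linking against the WindowsAudioInputDevice library provides the concrete
// implementation behind AudioInputDevice::createNew(); its static
// getPortNames() function can be used to enumerate the available input ports.
int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  int inputPortNumber = 0;             // assumed: first capture device
  unsigned char bitsPerSample = 16;
  unsigned char numChannels = 1;
  unsigned samplingFrequency = 8000;
  AudioInputDevice* micSource
    = AudioInputDevice::createNew(*env, inputPortNumber, bitsPerSample,
                                  numChannels, samplingFrequency);
  if (micSource == NULL) {
    *env << "Failed to open the audio input device\n";
    return 1;
  }

  // "micSource" is a FramedSource delivering raw PCM; from here it would be
  // fed to an appropriate RTPSink (for example via a uLawFromPCMAudioSource
  // filter for G.711), following the pattern in testWAVAudioStreamer.cpp.
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}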



 


___
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel