On Mon, 3 Jan 2011 08:16:51 -0200 Gustavo Sverzut Barbieri
<[email protected]> said:

> On Mon, Jan 3, 2011 at 6:26 AM, Carsten Haitzler <[email protected]> wrote:
> > On Fri, 31 Dec 2010 01:31:46 +0100 Stefano Sabatini <[email protected]>
> > said:
> >
> >> On Wednesday 2010-12-29 19:57:53 +0100, Nicolas wrote:
> >> > Hello,
> >> >
> >> > Here's a new version of EvasVideoSink, feel free to test it.
> >> > https://sourceforge.net/projects/evasvideosink/files/
> >> >
> >> > I also created a widget for Elementary.
> >> > To test it, you need the latest svn version of Elementary, follow the
> >> > instructions listed in the README file for compilation.
> >> >
> >> > Why create an Elementary widget for EvasVideoSink and not Emotion?
> >> >
> >> > Emotion is intended to be a complete wrapper library over xine, gstreamer
> >> > and vlc, but the problem with this kind of project is that it is never
> >> > complete!
> >> > Take pidgin as an example: it supports all the protocols, but most of
> >> > them are incomplete. That is why I chose GStreamer and intend to focus
> >> > solely on it.
> >> > Please note, I'm not criticizing anyone; it's just my opinion!
> >>
> >> And if you have time to kill and want to try something new, you could
> >> try to write a sink for libavfilter (the FFmpeg A/V filtering library).
> >> I recently wrote an SDL sink (check the ffmpeg-devel archive) and even
> >> started to work on an ecore/evas sink, but got stuck at some point. If
> >> you're interested I can let you see my unfinished work and provide help
> >> (and eventually push it to the FFmpeg repo). From what I can see, the
> >> libavfilter variant should be much simpler.
> >
> > i'm a little bit curious - how is this really much different from emotion,
> > where emotion wraps up gst and xine in an abstracted api and also provides
> > the sinks needed for both to display in the evas object emotion
> > creates... :) well i know the difference is that you've just done the
> > sink bit and no wrapping/abstracting - but i'm a tad curious as to "why"
> > when emotion already provided that? :) (if you were missing controls for
> > the video or audio stream, then emotion could always have the controls
> > added to its api...) :)
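
(for reference, a rough sketch of what that abstracted api looks like from the
app side - not from the original mail; the file name, window size and the
"gstreamer" module choice are just placeholders:)

  #include <Ecore.h>
  #include <Ecore_Evas.h>
  #include <Emotion.h>

  int
  main(void)
  {
     Ecore_Evas *ee;
     Evas_Object *vid;

     ecore_evas_init();
     ee = ecore_evas_new(NULL, 0, 0, 640, 360, NULL);

     vid = emotion_object_add(ecore_evas_get(ee)); /* the evas object emotion creates */
     emotion_object_init(vid, "gstreamer");        /* or "xine" - the backend is abstracted away */
     emotion_object_file_set(vid, "movie.ogv");    /* placeholder path */
     emotion_object_play_set(vid, EINA_TRUE);

     evas_object_resize(vid, 640, 360);
     evas_object_show(vid);
     ecore_evas_show(ee);

     ecore_main_loop_begin();
     ecore_evas_shutdown();
     return 0;
  }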
> 
> emotion relies on some primitives that are not always reliable
> (although they should be), like the gstreamer backend's use of decodebin.
> Lots of systems need special pipelines to play media at all, and some need
> special pipelines to play certain formats. Try it on OMAP platforms and
> check for yourself: most don't work (the pandaboard does not, and the
> n800 through n900 did not either, AFAIR).
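
(to illustrate the point above - the first launch line is roughly what a
generic decodebin-based backend builds, while some embedded platforms end up
needing an explicit pipeline wired to the vendor's hardware decoder; this is
gstreamer 0.10-era code written for this discussion, and the element names in
the commented second pipeline are placeholders:)

  #include <gst/gst.h>

  int
  main(int argc, char **argv)
  {
     GstElement *pipeline;
     GError *err = NULL;

     gst_init(&argc, &argv);

     /* generic path: let decodebin pick the decoders automatically */
     pipeline = gst_parse_launch(
        "filesrc location=movie.ogv ! decodebin2 ! "
        "ffmpegcolorspace ! autovideosink", &err);

     /* what some embedded platforms need instead: an explicit pipeline
      * using the vendor's DSP decoder, something like
      *   filesrc ! qtdemux ! <vendor-dsp-h264-decoder> ! <vendor-video-sink>
      * (actual element names differ per platform) */

     if (!pipeline)
       {
          g_printerr("failed to build pipeline: %s\n",
                     err ? err->message : "unknown");
          return 1;
       }

     gst_element_set_state(pipeline, GST_STATE_PLAYING);
     g_main_loop_run(g_main_loop_new(NULL, FALSE)); /* run until killed */

     gst_element_set_state(pipeline, GST_STATE_NULL);
     gst_object_unref(pipeline);
     return 0;
  }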

in which case... emotion simply needs to be able to set up that pipeline itself -
the result is the same: output of video data as yuv (well, if an evas sink is
involved). why should the person using gst have to "specially set up" that
pipeline per platform - why should they need to know? i'm a bit surprised gst
was set up this way and doesn't "just work" (i.e. use the dsp and accelerate the
decode anyway). i suspect a design or implementation flaw here on the gst+omap
side - but i don't know the details, i just smell something wrong. but... again
- why can't emotion just set that pipeline up properly? why push that work into
each and every media player app? :)
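
something like the sketch below is all it would take inside emotion's gstreamer
module - the per-platform pipeline knowledge lives in one place instead of in
every app (everything here is hypothetical: the platform key, the element names
and the lookup itself are made up for illustration):

  #include <stdio.h>
  #include <string.h>

  /* hypothetical helper inside emotion's gstreamer backend: pick a launch
   * string per platform so apps never have to care */
  static const char *
  _pipeline_for(const char *platform, const char *file)
  {
     static char buf[1024];

     if (platform && !strcmp(platform, "omap"))
       /* placeholder element names - a real table could come from a config
        * file or from probing the gst registry for hw decoders */
       snprintf(buf, sizeof(buf),
                "filesrc location=%s ! qtdemux ! omap_hw_dec ! omap_video_sink",
                file);
     else
       snprintf(buf, sizeof(buf),
                "filesrc location=%s ! decodebin2 ! ffmpegcolorspace ! autovideosink",
                file);
     return buf;
  }

  int
  main(void)
  {
     puts(_pipeline_for("omap", "movie.mp4")); /* platform-specific path */
     puts(_pipeline_for(NULL, "movie.ogv"));   /* generic decodebin path */
     return 0;
  }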

-- 
------------- Codito, ergo sum - "I code, therefore I am" --------------
The Rasterman (Carsten Haitzler)    [email protected]

