On Wed, Jun 10, 2015 at 3:45 PM, Joe Witt <[email protected]> wrote:
> Andrew
>
> Awesome and thanks for sharing that!
>
> Ricky,
>
> Definitely with you in seeing these as valuable tools for the toolbox.  The
> rub on supporting these languages is that they require folks with a solid
> understanding of the fundamentals of those languages and with knowledge of
> NiFi.  So it is just something that will take time, but I am confident we'll
> get there.
>
> The other thing that is a bit of a bummer with these languages is that they
> tend to come with very large dependency trees or sizes.  The NiFi build is
> already too big in my opinion (150+ MB).  I really would like to see us
> move to a registry construct whereby folks can indicate which
> processors/extensions they want activated on their system.  There are lots
> of challenges with that, so I'm guessing it won't be soon.
>
> Russell
>
> What you're proposing simply makes sense and is a good idea, so it is
> easy to be welcoming - but thanks.  Surely there are going to be some
> gotchas, but let's work through them.  Ultimately this is about the power
> of the JVM - not Java.  You know Clojure and we know NiFi.  Do you know
> whether, in the case of Clojure, one could do something like Andrew did
> with his Scala example?
>

Sure: both languages have excellent Java interop. I took a look at Andrew's
code and saw where he extended a Java class, used Java annotations, overrode
Java methods, etc. The pattern looks very much like some Clojure work I did
recently to implement a Hive GenericUDF in the Hadoop stack, e.g.:

https://github.com/russellwhitaker/uap-clj-hiveudf/blob/master/src/uap_clj/udf/hive/generic/device.clj

Yep, NiFi work can be done in Clojure.
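
For instance, here's an untested sketch of roughly what a minimal pass-through
processor might look like via gen-class, assuming the standard nifi-api classes
are on the classpath. All of the names here (namespace, class, relationship)
are hypothetical, and I've skipped the @Tags/@CapabilityDescription annotations
for brevity:

```clojure
(ns example.nifi.passthrough
  "Hypothetical sketch: a minimal NiFi processor written in Clojure.
   AOT-compiled via gen-class so NiFi sees a concrete named class."
  (:gen-class
   :name example.nifi.PassthroughProcessor
   :extends org.apache.nifi.processor.AbstractProcessor)
  (:import [org.apache.nifi.processor ProcessContext ProcessSession
            Relationship Relationship$Builder]))

(def rel-success
  (.. (Relationship$Builder.)
      (name "success")
      (description "All FlowFiles are routed here")
      (build)))

(defn -getRelationships [this]
  ;; Clojure sets implement java.util.Set, so this satisfies the API.
  #{rel-success})

(defn -onTrigger [this ^ProcessContext context ^ProcessSession session]
  (when-let [flowfile (.get session)]
    ;; No transformation here; just route the FlowFile onward.
    (.transfer session flowfile rel-success)))
```

You'd still need to AOT-compile the namespace and (I believe) list the class in
META-INF/services so NiFi's ServiceLoader discovery picks it up, same as a Java
processor packaged in a NAR.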

> The dynamic loading aspect these could provide, as Ricky points out, could
> be hugely cool when combined with NiFi's existing capabilities.
>

Indeed!
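
And Clojure is a particularly good fit for the dynamic-loading angle, since it
compiles at runtime. A toy sketch, pure Clojure with no NiFi dependency, where
the transform source stands in for code that might be read from a file or a
registry:

```clojure
(ns example.dynamic
  "Toy sketch of runtime-loaded logic: the transform function is
   compiled from a source string at runtime.")

(defn load-transform
  "Compile a one-argument transform function from Clojure source."
  [source]
  (eval (read-string source)))

;; The source string is a stand-in for externally supplied logic.
(def upper-fn
  (load-transform "(fn [^String s] (.toUpperCase s))"))

;; (upper-fn "hello") ;=> "HELLO"
```

So in principle a processor could pick up new transformation logic without a
rebuild or redeploy, though sandboxing/classloader questions would need thought.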

> I'll start a Jira for these in a bit if they're not already there (and
> Ricky's Python and Groovy and Andrew's Scala).
>

Looking forward to that Joe!
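
In the meantime, for the "java -cp" route through ExecuteStreamCommand I
mentioned: that processor pipes FlowFile content to the command's stdin and
captures its stdout, so the Clojure side just needs a filter-style entry point.
An untested sketch (AOT-compiled via :gen-class; the transform is a placeholder
for real business logic):

```clojure
(ns example.stream-filter
  "Sketch of an entry point for NiFi's ExecuteStreamCommand:
   FlowFile content arrives on stdin, output goes to stdout."
  (:gen-class))

(defn transform
  "Placeholder business logic; swap in real ingestion transforms."
  [^String line]
  (.toUpperCase line))

(defn -main [& _args]
  ;; Stream line by line so large FlowFiles aren't held in memory.
  (doseq [line (line-seq (java.io.BufferedReader. *in*))]
    (println (transform line)))
  (flush))
```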

Cheers, Russell

> Thanks
>
> Joe
> On Jun 10, 2015 10:44 AM, "Andrew Hulbert" <[email protected]> wrote:
>
>> Russell,
>>
>> I've had good success building 3-4 processors in Scala so far, such as
>> this one...but I haven't used too many advanced Scala features in them at
>> the moment. Mostly I'm just mimicking Java style, but it does work.
>>
>>
>> https://github.com/jahhulbert-ccri/geomesa-nifi/blob/master/nifi-geomesa-nifi-processors/src/main/scala/org/locationtech/geomesa/nifi/GeoMesaIngestProcessor.scala
>>
>> -Andrew
>>
>> On 06/10/2015 12:14 AM, Joe Witt wrote:
>>
>>> Russell,
>>>
>>> This sounds like a great idea.  I'm no Clojure expert, but it seems
>>> like it would be quite reasonable to build processors in Clojure.
>>> We've also discussed doing something similar in Scala.  Basically, for
>>> languages that run on the JVM, we should have a good chance of
>>> providing a nice developer experience.
>>>
>>> It is quite common for folks in the earlier stages of learning NiFi to
>>> utilize ExecuteStreamCommand, ExecuteProcess, etc. types of
>>> processors.  This is a nice and gradual transition model.  But we do
>>> like to help folks avoid having to make those external calls over time.
>>>
>>> Would you be interested in collaborating on putting together some nice
>>> examples or talking about how we could support Clojure nicely?
>>>
>>> Thanks
>>> Joe
>>>
>>> On Tue, Jun 9, 2015 at 3:27 PM, Russell Whitaker
>>> <[email protected]> wrote:
>>>
>>>> At work, I've found the sweet spot for Clojure programming in our Hadoop
>>>> data processing stack: writing Hive UDFs (user-defined functions) which
>>>> get distributed to HDFS, registered ("add jar ..."), and invoked as
>>>> needed by users.  It's been a real treat to avoid having to write Java
>>>> (which I can do, and have done much of, in a past life) but still
>>>> interoperate on the JVM.
>>>>
>>>> Now we're adding NiFi as a generalized data ingestion system into our
>>>> Hadoop processing clusters, with various sources and (mostly) PutHDFS
>>>> targets (hoping to do PutS3 in future), and we're wondering how we might
>>>> carry over our team's emerging development pattern: coding in Clojure
>>>> for the JVM and plugging into an otherwise "pure Java" framework; i.e.
>>>> we'd like to explore doing under NiFi what we've done under Hadoop.
>>>>
>>>> So far, the only option that really comes to mind is invoking "java
>>>> -cp <my_clojure_lib>" in the context of an ExecuteStreamCommand
>>>> processor, which is fine as things go - and it's how we're likely to
>>>> quickly prototype new core libs for business-specific ingestion logic
>>>> in NiFi - but I'm wondering if there's been some as-yet-undiscussed
>>>> thinking on this matter in the NiFi community.
>>>>
>>>> Thanks, R
>>>>
>>>> --
>>>> Russell Whitaker
>>>> http://twitter.com/OrthoNormalRuss
>>>> http://github.com/russellwhitaker
>>>>
>>>
>>



-- 
Russell Whitaker
http://twitter.com/OrthoNormalRuss
http://www.linkedin.com/pub/russell-whitaker/0/b86/329
