[ https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15449635#comment-15449635 ]

Marcin Ziemiński commented on PIO-30:
-------------------------------------

I think that having two separate branches might be quite problematic and 
harder to maintain than having a cross build configured. With a cross build 
you will get two different versions built: one compatible with Scala 2.10 and 
the other with 2.11. I am almost done with a PR for this, and in terms of code 
differences there is only one short file split between Scala-specific source 
directories. If you want to run an sbt command for both versions at once, you 
just prefix it with '+', e.g. sbt "+test" will run the tests first for Scala 
2.10 and then for 2.11. 
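The '+' behavior described above can be sketched in a build.sbt; the exact
patch-level version numbers below are assumptions, picked only to match the
Scala versions discussed in this issue:

```scala
// build.sbt -- minimal cross-build sketch (version numbers are assumptions,
// chosen to match the Scala versions discussed in this issue)
scalaVersion := "2.10.6"                      // default version for plain sbt commands
crossScalaVersions := Seq("2.10.6", "2.11.8") // versions targeted by '+'-prefixed commands
```

With this in place, sbt "+test" runs the test task once per entry in
crossScalaVersions, while a plain sbt test uses only scalaVersion.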

> Cross build for different versions of scala and spark
> -----------------------------------------------------
>
>                 Key: PIO-30
>                 URL: https://issues.apache.org/jira/browse/PIO-30
>             Project: PredictionIO
>          Issue Type: Improvement
>            Reporter: Marcin Ziemiński
>
> The present versions of Scala (2.10) and Spark (1.4) are quite old. Spark 
> 2.0.0 brings many performance improvements and features that people will 
> definitely want to use in their templates. I am also aware that the past 
> cannot be ignored, and simply dropping 1.x might not be an option for other 
> users. 
> I propose setting up a cross build in sbt: one build with Scala 2.10 and 
> Spark 1.6, and a separate one with Scala 2.11 and Spark 2.0. Most of the 
> files, including the API, will be shared between versions. The problematic 
> ones will be split between additional source directories: 
> src/main/scala-2.10/ and src/main/scala-2.11/. The dockerized tests should 
> also take the two versions into consideration.
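One way the Scala/Spark pairing proposed above could be expressed is by
switching the Spark dependency on the Scala binary version. This is a sketch,
not PredictionIO's actual build; the artifact and version numbers are
assumptions based on the pairing named in the issue:

```scala
// build.sbt -- hypothetical per-version Spark dependency, pairing
// Scala 2.10 with Spark 1.6 and Scala 2.11 with Spark 2.0 as proposed
libraryDependencies += {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 10)) => "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"
    case _             => "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
  }
}
```

Recent sbt releases already include src/main/scala-2.10/ and
src/main/scala-2.11/ in the default source directories when cross-building,
so the version-specific files should need no extra configuration.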



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
