[
https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15893838#comment-15893838
]
ASF GitHub Bot commented on PIO-30:
-----------------------------------
Github user chanlee514 commented on the issue:
https://github.com/apache/incubator-predictionio/pull/345
Hi @shimamoto, thanks for pointing that out. I'll make changes to enable
`-Dbuild.profile=scala-2.11 -Dspark.version=1.6.0`, which is a simple matter of
handling the build args in `build.sbt`.
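The arg handling described above could look roughly like this in `build.sbt` (a sketch only; the property names come from the command line quoted above, but the default versions and dependency wiring are illustrative assumptions, not the final implementation):

```scala
// build.sbt (sketch): select versions from -D system properties.
val buildProfile = sys.props.getOrElse("build.profile", "scala-2.10")

scalaVersion := (buildProfile match {
  case "scala-2.11" => "2.11.8"
  case _            => "2.10.6"
})

// Allow -Dspark.version to override the profile's default Spark version.
val sparkVersion = sys.props.getOrElse(
  "spark.version",
  if (buildProfile == "scala-2.11") "2.0.2" else "1.6.3")

libraryDependencies +=
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
```

With this in place, `sbt -Dbuild.profile=scala-2.11 -Dspark.version=1.6.0 compile` would build Scala 2.11 against Spark 1.6.0.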
Also, I think it may be a good idea to rename the build profiles to
**spark1.6** and **spark2.0**. The main intention of the cross-build is to
maintain backward compatibility while upgrading to Spark 2.0 (which brings many
advantages, but requires Scala 2.11 or later).
And lastly, `install.sh` and the README files have not yet been updated
to reflect these changes; I will work on that soon.
> Cross build for different versions of scala and spark
> -----------------------------------------------------
>
> Key: PIO-30
> URL: https://issues.apache.org/jira/browse/PIO-30
> Project: PredictionIO
> Issue Type: Improvement
> Reporter: Marcin Ziemiński
> Assignee: Chan
> Fix For: 0.11.0
>
>
> The present version of Scala is 2.10 and Spark is 1.4, which is quite old.
> Spark 2.0.0 brings many performance improvements and features that people
> will definitely want to use in their templates. I am also aware that the
> past cannot be ignored, and simply dropping 1.x might not be an option for
> other users.
> I propose setting up a cross-build in sbt: one build with Scala 2.10 and
> Spark 1.6, and a separate one with Scala 2.11 and Spark 2.0. Most of the
> files, including the API, will be shared between versions. The problematic
> ones will be split into version-specific source directories:
> src/main/scala-2.10/ and src/main/scala-2.11/. The dockerized tests should
> also take the two versions into consideration.
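The cross-build proposed above could be expressed in sbt roughly as follows (a sketch under assumptions: the exact patch versions, and whether the source-directory setting is needed, depend on the sbt release in use):

```scala
// build.sbt (sketch): cross-build for Scala 2.10/Spark 1.6 and
// Scala 2.11/Spark 2.0. Version numbers are illustrative assumptions.
crossScalaVersions := Seq("2.10.6", "2.11.8")

// Pick the Spark version matching the Scala binary version being built.
libraryDependencies += {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 10)) =>
      "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"
    case _ =>
      "org.apache.spark" %% "spark-core" % "2.0.2" % "provided"
  }
}

// Version-specific sources go in src/main/scala-2.10/ and src/main/scala-2.11/.
// Recent sbt releases pick these directories up automatically; older ones need:
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / s"scala-${scalaBinaryVersion.value}"
```

Running `sbt +test` would then compile and test against both Scala/Spark pairs in turn.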
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)