Priya,

My comments on your proposal:
-A module is the smallest unit of development that is managed independently
by a team. The CI capability should be provided at the module level.
-I like the approach of doing CI only during the development phase of the
module. Once it reaches maturity, CI should only run before publishing a
new version.
-I don't understand
-In general, only changes in the tracked module should be taken into
account. I would not track development changes in its dependencies.
-CI should use the latest public minor versions of the main module's
dependencies (and includes), as defined in the module dependencies. In
fact, CI should scan for updates and apply all available updates (using
selenium/mmc). But only changesets in the main module should be pushed.
-In some cases more than one module can be configured to be continuously
integrated. It makes sense when those modules are:
  -developed by the same team (of course)
  -developed at the same time
  -tightly integrated with each other (e.g. a pack and its components)
  -not published yet (so no public/stable release available)
-I don't understand the "triggered targets are install.source and
package.module". In my opinion the automated flow should be:
  -1st, packaging: in an instance with the module installed via mercurial:
    -update to the latest module revision (if more than one module, for
each of them)
    -smartbuild -Dlocal=no
    -package.module (if more than one, for each of them, starting with the
most internal dependencies)
  -2nd, installation: in an instance with the required dependencies
installed, using selenium&mmc, install the just-packaged obx files (if more
than one, for each of them, starting with the most internal dependencies)
  -3rd, update: in an instance with the previous public version (if any)
installed:
    -scan for updates and apply all of them (using selenium&mmc)
    -install the just-packaged new obx (using selenium&mmc)
  -4th, client/organization level: if the module includes datasets at
client/organization level, apply the module during initial
client/organization setup (using selenium&mmc) in the instances used in the
2nd and 3rd steps
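The "most internal dependencies first" ordering used in the packaging and installation steps above is just a topological sort of the module dependency graph. A minimal sketch (the module ids below are hypothetical examples, not real Openbravo modules):

```python
def package_order(deps):
    """Order modules so every dependency precedes its dependents
    ("most internal dependencies first"). deps maps a module id to
    the list of module ids it depends on."""
    order, seen = [], set()

    def visit(mod):
        if mod in seen:
            return
        seen.add(mod)
        for dep in deps.get(mod, []):
            visit(dep)  # handle dependencies before the module itself
        order.append(mod)

    for mod in deps:
        visit(mod)
    return order

# Hypothetical example: a pack depending on two components.
deps = {
    "org.example.pack": ["org.example.compA", "org.example.compB"],
    "org.example.compB": ["org.example.compA"],
}
print(package_order(deps))
# → ['org.example.compA', 'org.example.compB', 'org.example.pack']
```

The same ordering applies to both package.module runs and the obx installation sequence.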
-In a few months it will be possible to publish in "testing mode" in the
CR, which will greatly simplify this continuous integration flow. It will
then be possible to test publishing the obx file in the CR and
installing/updating from the CR (without the need to declare the required
dependencies to be installed in the testing instance).
-It should be possible for a module to declare additional automated tests
to run after installation/update. QA should train the development teams on
how to create selenium tests and how to deploy them in the CI environment.

Hope it helps.

Ismael


-----Original Message-----
From: Priya MuthuKumar [mailto:[email protected]]
Sent: Friday, April 16, 2010 11:39
To: [email protected]
Subject: [Openbravo-development] Integrate the modules development
testing to continuous integration


Hi,

This mail is to get your feedback and ideas on integrating module
development into continuous integration (CI): just as pi development is
tracked/compiled/tested on builds.openbravo.com, modules in the development
phase will be continuously compiled and, on success, an obx will be
generated for testing.

The ultimate goal is to have a job on "builds.openbravo.com", trigger it
periodically, and report the results to the appropriate team.

Here is our proposal on how to integrate modules into CI and make them
available for the appropriate team to track.

* Have the ERP source in the Hudson local workspace to compile and build
the obx; the pi branch will be polled periodically and the workspace
updated to the latest changeset.
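That polling/update step boils down to a pull plus an update in the workspace. A sketch, assuming Mercurial is used as described (the workspace path and repository URL below are hypothetical); the helper only returns the command lines, so the flow is easy to inspect or hand to a job runner:

```python
def workspace_update_commands(workspace, branch_url):
    """Commands a Hudson job could run to sync the local ERP workspace
    with the latest changeset of the polled branch (sketch only)."""
    return [
        f"hg pull -R {workspace} {branch_url}",  # fetch new changesets
        f"hg update -R {workspace} -r tip",      # move workspace to latest
    ]

# Hypothetical workspace path and branch URL:
for cmd in workspace_update_commands("/var/hudson/erp", "https://example.org/erp/devel/pi"):
    print(cmd)
```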

* The main module (in the development phase) and the dependent/included
modules are cloned to <path to workspace>/modules/

* Monitor the modules' source repos (main module plus dependent and
included modules) for new changesets; if any module in the list has a new
changeset, trigger a compile and build the obx for the main module. The
targets triggered for every build are:
     * install.source
     * package.module -Dmodule=<org.openbravo.module>
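Those two per-build targets could be driven by a small wrapper like the following sketch (the dry_run flag keeps it inspectable without an Ant installation; the module id in the example is hypothetical):

```python
import subprocess

def build_commands(module):
    """The ant targets triggered for every build, per the proposal."""
    return [
        ["ant", "install.source"],
        ["ant", "package.module", f"-Dmodule={module}"],
    ]

def run_build(module, dry_run=True):
    """Run the build; in dry-run mode just return the command lines."""
    lines = []
    for cmd in build_commands(module):
        if not dry_run:
            subprocess.run(cmd, check=True)  # fail the job on a non-zero exit
        lines.append(" ".join(cmd))
    return lines

print(run_build("org.openbravo.example"))  # hypothetical module id
# → ['ant install.source', 'ant package.module -Dmodule=org.openbravo.example']
```

With check=True, a failing target raises immediately, which is the point at which the failure notification below would be sent.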

* On failure, an e-mail notification will be sent to the appropriate team.

We would also like to know how you would like the jobs to be organized; the
options are:

1) One job for each team (erp_devel_mods-loc, erp_devel_mods-platform,
erp_devel_mods-eng-dev ...), configured for the module your team is
currently working on and modified when the team takes up another new
module.

2) One job for each active module (erp_devel_mod-advpaymentmngt,
erp_devel_mod-generictreereport, erp_devel_mod-uiselector ...), removing
the job when the module has no more development activity and creating a new
job for the next active module.

Any comments/suggestions are welcome.

Regards
Priya Muthukumar

------------------------------------------------------------------------------
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
_______________________________________________
Openbravo-development mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/openbravo-development

