You're awesome Jens!

-Dan

On Fri, May 17, 2019 at 9:48 AM Jens Deppe <jde...@pivotal.io> wrote:

> I fixed the first two and started ignoring
> AvailablePortHelperIntegrationTest for Windows (Windows has a default
> active port range wildly different from Linux's, which makes this test
> very flaky, and since it's just testing test code it seemed reasonable
> to ignore it).
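A rough sketch of the portability issue described above (illustrative only, not Geode's actual AvailablePortHelper): Linux typically hands out ephemeral ports from 32768-60999, while Windows defaults to 49152-65535, so a test that bakes assumptions about the active range into its expectations will behave differently per platform. Binding to port 0 sidesteps the problem by letting the OS choose any free port:

```java
import java.net.ServerSocket;

public class EphemeralPortSketch {
    public static void main(String[] args) throws Exception {
        // Port 0 asks the OS for any free ephemeral port, so the code
        // makes no assumption about the platform's dynamic port range
        // (Linux default: 32768-60999; Windows default: 49152-65535).
        try (ServerSocket socket = new ServerSocket(0)) {
            System.out.println("OS-assigned port: " + socket.getLocalPort());
        }
    }
}
```

A test written this way never hard-codes a range, so it cannot go red just because Windows draws its dynamic ports from a different window than Linux.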
>
> LauncherMemberMXBeanIntegrationTest still needs some love, and I can
> look at that since I've worked on it in the past.
>
> --Jens
>
> On Fri, May 17, 2019 at 9:33 AM Dan Smith <dsm...@pivotal.io> wrote:
>
> > Looking at the metrics for the Windows jobs, it looks like the Windows
> > tests are mostly red due to a few specific tests. The acceptance and gfsh
> > distributed test jobs seem to be ok; it's just the unit tests and
> > integration tests that have problems. It also looks like
> > ExportConfigCommandTest stopped failing recently. Anyone want to take the
> > credit?
> >
> > I'd be in favor of putting them in the main pipeline but not the PR
> > pipeline, since they take so long. Based on the metrics, it looks like we
> > are pretty close to getting these jobs green. Anyone want to volunteer to
> > fix the remaining failures (or see if they have been fixed recently)?
> >
> > -Dan
> >
> >
> >
> > https://concourse.apachegeode-ci.info/teams/main/pipelines/apache-develop-metrics/jobs/GeodeWindowsUnitTestOpenJDK8Metrics/builds/209
> >
> > CreateDiskStoreCommandTest:  2 failures
> > ExportConfigCommandTest:  34 failures
> >
> >
> >
> > https://concourse.apachegeode-ci.info/teams/main/pipelines/apache-develop-metrics/jobs/GeodeWindowsIntegrationTestOpenJDK8Metrics/builds/209
> >
> > LauncherMemberMXBeanIntegrationTest:  4 failures
> > AvailablePortHelperIntegrationTest:  17 failures
> >
> >
> >
> >
> > On Thu, May 16, 2019 at 5:32 PM Owen Nichols <onich...@pivotal.io> wrote:
> >
> > > I’ve created a PR for this: https://github.com/apache/geode/pull/3597
> > >
> > > > On May 16, 2019, at 3:06 PM, Blake Bender <bben...@pivotal.io> wrote:
> > > >
> > > > +1 this needs to happen.  I hope that doesn't cause too much pain
> > > > for the dev team, but the native client team has a hard requirement
> > > > that all our stuff work properly on Windows at all times, and it
> > > > causes trouble when random builds of the server break us on Windows.
> > > >
> > > > I would hesitate to run these per-commit if they're taking that
> > > > long, but daily is a thing that can easily happen.
> > > >
> > > > On Thu, May 16, 2019 at 2:23 PM Bruce Schuchardt <bschucha...@pivotal.io>
> > > > wrote:
> > > >
> > > >> big +1, as long as artifacts of failed runs can be downloaded
> > > >>
> > > >> On 5/15/19 6:28 PM, Owen Nichols wrote:
> > > >>> For a very long time we’ve had Windows tests in the main pipeline
> > > >>> (hidden away, not in the default view), but the pipeline proceeds to
> > > >>> publish regardless of whether Windows tests fail or even run at all.
> > > >>>
> > > >>> Now seems like a good time to review whether to:
> > > >>> a) treat Windows tests as first-class tests and prevent the pipeline
> > > >>> from proceeding if any test fails on Windows
> > > >>> b) keep as-is
> > > >>> c) change Windows tests to trigger only once a week rather than on
> > > >>> every commit, if they are going to remain "informational only"
> > > >>>
> > > >>> One disadvantage to making Windows tests gating is that they
> > > >>> currently take much longer to run (around 5 hours, vs 2 hours for
> > > >>> Linux tests).
> > > >>
> > >
> > >
> >
>