Re: Weekly Cassandra Status

2017-03-27 Thread Dikang Gu
Jeff, thanks for the summary!

I will take a look at the token jira, https://issues.apache.org/
jira/browse/CASSANDRA-13348, since I was working on that recently.

--Dikang.

On Sun, Mar 26, 2017 at 3:35 PM, Jeff Jirsa  wrote:

> Email stuff:
> - We've moved github pull requests notifications from dev@ to pr@ - if you
> want to see when github issues are opened or updated, subscribe to the new
> list (send email to pr-subscr...@cassandra.apache.org ), or check out the
> archives at  https://lists.apache.org/list.html?p...@cassandra.apache.org
>
> We have some new JIRAs from new contributors. Would be great if someone
> found the time to review and commit:
>
> - https://issues.apache.org/jira/browse/CASSANDRA-13358
> - https://issues.apache.org/jira/browse/CASSANDRA-13357
> - https://issues.apache.org/jira/browse/CASSANDRA-13356
>
> There are also two trivial typo changes in GH PRs:
> - https://github.com/apache/cassandra/pull/101
> - https://github.com/apache/cassandra/pull/102
>
> These are trivial to commit, and I'll likely do it "soon", but there's an
> open question about process: we've historically avoided using GH PRs, but I
> think ultimately it's better if we can find a way to accept these patches,
> especially with new contributors. I'm not aware of what the standard is
> for whether or not we need to make a JIRA for tracking and update CHANGES -
> if any of the members of the PMC or committers feel like chiming in on
> procedure here, that'd be great. In the recent past, I've made a trivial
> JIRA just to follow the process, but it feels pretty silly ( GH PR #99 /
> https://github.com/apache/cassandra/commit/091e5fbe418004fd04390a0b1a3486167360
> + https://issues.apache.org/jira/browse/CASSANDRA-13349 ).
>
> A pretty ugly duplicate-token blocker got registered this week: committers
> with free cycles may want to investigate (likely related to the new vnode
> allocation code in 3.0+):
>
> https://issues.apache.org/jira/browse/CASSANDRA-13348
>
> Here's a smattering of a few other patches (some new contributors, some
> old, but only patches from non-committers) that have a patch available and
> no reviewer:
>
> - https://issues.apache.org/jira/browse/CASSANDRA-13369 - CQL grammar issue
> - https://issues.apache.org/jira/browse/CASSANDRA-13374 - non-ASCII dashes in docs
> - https://issues.apache.org/jira/browse/CASSANDRA-13354 - LCS estimated tasks accuracy
> - https://issues.apache.org/jira/browse/CASSANDRA-12748 - GREP_COLOR breaks startup
>
> We had 3 tickets opened to update embedded libraries to modern versions
> - JNA - https://issues.apache.org/jira/browse/CASSANDRA-13300 (committed)
> - Snappy - https://issues.apache.org/jira/browse/CASSANDRA-13336 (still
> open until I check the dtests a bit more closely)
> - JUnit - https://issues.apache.org/jira/browse/CASSANDRA-13360 (committed)
>
> Since we had 3 of those in a short period of time, I've created an umbrella
> ticket for any other libraries we want to upgrade with 4.0 -
> https://issues.apache.org/jira/browse/CASSANDRA-13361 . We don't typically
> upgrade libraries until we have a reason, so if you've been unable to use
> a feature of a bundled library because we were stuck on an old version, the
> time to update it in trunk is now (before 4.0 ships).
>
> Finally, we still have a TON of open "missing unit tests" tickets -
> https://issues.apache.org/jira/browse/CASSANDRA-9012 - new contributors
> may find those to be good places to have an immediate impact on the project.
>
> - Jeff
>



-- 
Dikang


[DISCUSS] Implementing code quality principles, and rules (was: Code quality, principles and rules)

2017-03-27 Thread Nate McCall
I don't want to lose track of the original idea from François, so
let's do this formally in preparation for a vote. Having this all in
place will make transition to new testing infrastructure more
goal-oriented and keep us more focused moving forward.

Does anybody have specific feedback/discussion points on the following
(awesome, IMO) proposal:

Principles:

1. Tests always pass. This is the starting point. If we don't care
about test failures, then we should stop writing tests. A recurring
failing test carries no signal and is better deleted.
2. The code is tested.

Assuming we can align on these principles, here is a proposal for
their implementation.

Rules:

1. Each new release passes all tests (no flakiness).
2. If a patch has a failing test (test touching the same code path),
the code or test should be fixed prior to being accepted.
3. Bug fixes should have one test that fails prior to the fix and
passes after fix.
4. New code should have at least 90% test coverage.
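As a hypothetical illustration of rule 3 (the class and the bug below are invented for illustration, not taken from the Cassandra codebase): the point is that the same test demonstrably fails against the buggy code and passes against the fix.

```java
// Hypothetical illustration of rule 3: a regression test that fails
// against the buggy code and passes against the fix.
public class MidpointRegression {

    // Buggy version: (low + high) overflows int for large inputs.
    static int midpointBuggy(int low, int high) {
        return (low + high) / 2;
    }

    // Fixed version: rewritten to avoid the overflow.
    static int midpointFixed(int low, int high) {
        return low + (high - low) / 2;
    }

    public static void main(String[] args) {
        int low = 2_000_000_000, high = 2_100_000_000;
        // The regression test pins the bug down: against the buggy method
        // it fails (overflow gives a negative midpoint); against the fix
        // it passes.
        System.out.println(midpointBuggy(low, high)); // negative: overflow
        System.out.println(midpointFixed(low, high)); // 2050000000
    }
}
```

In practice this would be a JUnit test committed alongside the fix, so the fails-before/passes-after behavior is captured in the project's test history rather than only in the review.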


Re: [DISCUSS] Implementing code quality principles, and rules (was: Code quality, principles and rules)

2017-03-27 Thread Josh McKenzie
How do we plan on verifying #4? Also, root-cause to tie back new code that
introduces flaky tests (i.e. passes on commit, fails 5% of the time
thereafter) is a non-trivial pursuit (thinking #2 here), and a pretty
common problem in this environment.
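One crude but workable way to surface that kind of 5%-flaky test is to rerun it many times and count failures. A minimal harness sketch (class and method names invented here, not from any existing tool):

```java
import java.util.function.BooleanSupplier;

// Sketch of a flakiness probe: run a test body many times and count
// failures. A test that fails on some fraction of identical runs is flaky.
public class FlakeProbe {

    static int countFailures(BooleanSupplier test, int runs) {
        int failed = 0;
        for (int i = 0; i < runs; i++) {
            if (!test.getAsBoolean()) {
                failed++;
            }
        }
        return failed;
    }

    public static void main(String[] args) {
        // Deterministic stand-in for a 5%-flaky test: fails every 20th run.
        int[] counter = {0};
        int failed = countFailures(() -> ++counter[0] % 20 != 0, 100);
        System.out.println(failed + "/100 runs failed"); // prints "5/100 runs failed"
    }
}
```

In real CI the loop would rerun the actual test target rather than an in-process supplier, but the idea is the same: a nonzero-but-not-total failure count over identical runs is the signal, and bisecting over commits with that probe is how you tie the flake back to the change that introduced it.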

On Mon, Mar 27, 2017 at 6:51 PM, Nate McCall  wrote:

> I don't want to lose track of the original idea from François, so
> let's do this formally in preparation for a vote. Having this all in
> place will make transition to new testing infrastructure more
> goal-oriented and keep us more focused moving forward.
>
> Does anybody have specific feedback/discussion points on the following
> (awesome, IMO) proposal:
>
> Principles:
>
> 1. Tests always pass. This is the starting point. If we don't care
> about test failures, then we should stop writing tests. A recurring
> failing test carries no signal and is better deleted.
> 2. The code is tested.
>
> Assuming we can align on these principles, here is a proposal for
> their implementation.
>
> Rules:
>
> 1. Each new release passes all tests (no flakiness).
> 2. If a patch has a failing test (test touching the same code path),
> the code or test should be fixed prior to being accepted.
> 3. Bug fixes should have one test that fails prior to the fix and
> passes after fix.
> 4. New code should have at least 90% test coverage.
>


Re: [DISCUSS] Implementing code quality principles, and rules (was: Code quality, principles and rules)

2017-03-27 Thread Edward Capriolo
On Mon, Mar 27, 2017 at 7:03 PM, Josh McKenzie  wrote:

> How do we plan on verifying #4? Also, root-cause to tie back new code that
> introduces flaky tests (i.e. passes on commit, fails 5% of the time
> thereafter) is a non-trivial pursuit (thinking #2 here), and a pretty
> common problem in this environment.
>
> On Mon, Mar 27, 2017 at 6:51 PM, Nate McCall  wrote:
>
> > I don't want to lose track of the original idea from François, so
> > let's do this formally in preparation for a vote. Having this all in
> > place will make transition to new testing infrastructure more
> > goal-oriented and keep us more focused moving forward.
> >
> > Does anybody have specific feedback/discussion points on the following
> > (awesome, IMO) proposal:
> >
> > Principles:
> >
> > 1. Tests always pass. This is the starting point. If we don't care
> > about test failures, then we should stop writing tests. A recurring
> > failing test carries no signal and is better deleted.
> > 2. The code is tested.
> >
> > Assuming we can align on these principles, here is a proposal for
> > their implementation.
> >
> > Rules:
> >
> > 1. Each new release passes all tests (no flakiness).
> > 2. If a patch has a failing test (test touching the same code path),
> > the code or test should be fixed prior to being accepted.
> > 3. Bug fixes should have one test that fails prior to the fix and
> > passes after fix.
> > 4. New code should have at least 90% test coverage.
> >
>

True, #4 is hard to verify in the current state. This was mentioned in a
separate thread: If the code was in submodules, the code coverage tools
should have less work to do because they typically only count coverage for
a module and the tests inside that module. At that point it should be easy
to write a plugin on top of something like this:
http://alvinalexander.com/blog/post/java/sample-cobertura-ant-build-script.

This is also an option:

https://about.sonarqube.com/news/2016/05/02/continuous-analysis-for-oss-projects.html
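For reference, the per-module wiring suggested above looks roughly like the following Cobertura Ant fragment (property names are invented, adapted from the pattern in the linked article, not from Cassandra's build.xml):

```xml
<!-- Load the Cobertura tasks; assumes cobertura.jar and its dependencies
     are on the "cobertura.classpath" path. -->
<taskdef classpathref="cobertura.classpath" resource="tasks.properties"/>

<!-- Instrument the module's compiled classes. -->
<target name="instrument" depends="compile">
  <cobertura-instrument todir="${build.dir}/instrumented">
    <fileset dir="${build.dir}/classes" includes="**/*.class"/>
  </cobertura-instrument>
</target>

<!-- After the tests run against the instrumented classes, emit a report. -->
<target name="coverage-report">
  <cobertura-report format="html" destdir="${reports.dir}/coverage"
                    srcdir="${src.dir}"/>
</target>
```

Cobertura also ships a `cobertura-check` task that can fail the build when coverage rates fall below a chosen threshold, which is what would give rule #4 teeth in CI.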


Re: [DISCUSS] Implementing code quality principles, and rules (was: Code quality, principles and rules)

2017-03-27 Thread Blake Eggleston
In addition to its test coverage problem, the project has a general
testability problem, and I think it would be more effective to introduce some 
testing guidelines and standards that drive incremental improvement of both, 
instead of requiring an arbitrary code coverage metric be hit, which doesn’t 
tell the whole story anyway.

It’s not ready yet, but I’ve been putting together a testing standards document 
for the project since bringing it up in the “Code quality, principles and 
rules” email thread a week or so ago.

On March 27, 2017 at 4:51:31 PM, Edward Capriolo (edlinuxg...@gmail.com) wrote:
On Mon, Mar 27, 2017 at 7:03 PM, Josh McKenzie  wrote:  

> How do we plan on verifying #4? Also, root-cause to tie back new code that  
> introduces flaky tests (i.e. passes on commit, fails 5% of the time  
> thereafter) is a non-trivial pursuit (thinking #2 here), and a pretty  
> common problem in this environment.  
>  
> On Mon, Mar 27, 2017 at 6:51 PM, Nate McCall  wrote:  
>  
> > I don't want to lose track of the original idea from François, so  
> > let's do this formally in preparation for a vote. Having this all in  
> > place will make transition to new testing infrastructure more  
> > goal-oriented and keep us more focused moving forward.  
> >  
> > Does anybody have specific feedback/discussion points on the following  
> > (awesome, IMO) proposal:  
> >  
> > Principles:  
> >  
> > 1. Tests always pass. This is the starting point. If we don't care  
> > about test failures, then we should stop writing tests. A recurring  
> > failing test carries no signal and is better deleted.  
> > 2. The code is tested.  
> >  
> > Assuming we can align on these principles, here is a proposal for  
> > their implementation.  
> >  
> > Rules:  
> >  
> > 1. Each new release passes all tests (no flakiness).
> > 2. If a patch has a failing test (test touching the same code path),  
> > the code or test should be fixed prior to being accepted.  
> > 3. Bug fixes should have one test that fails prior to the fix and
> > passes after fix.  
> > 4. New code should have at least 90% test coverage.  
> >  
>  

True, #4 is hard to verify in the current state. This was mentioned in a
separate thread: If the code was in submodules, the code coverage tools  
should have less work to do because they typically only count coverage for  
a module and the tests inside that module. At that point it should be easy  
to write a plugin on top of something like this:  
http://alvinalexander.com/blog/post/java/sample-cobertura-ant-build-script.  

This is also an option:  

https://about.sonarqube.com/news/2016/05/02/continuous-analysis-for-oss-projects.html