## Overview

  * **Coverage on recent changesets** - A list of recent changesets and
the percent of new lines covered by tests.
https://firefox-code-coverage.herokuapp.com
  * **Coverage addon** - Use it to show coverage on your patches in
Bugzilla -
https://addons.mozilla.org/en-US/firefox/addon/gecko-code-coverage/
  * **Daily coverage aggregates** - by directory and file -
https://codecov.io/gh/mozilla/gecko-dev
  * **Zero coverage reports** -
https://marco-c.github.io/code-coverage-reports/ with more details here:
https://release.mozilla.org/tooling/codecoverage/2018/03/23/code-coverage.html
  * **Not-so-raw coverage artifacts** - If you are interested in the
details, you may want to see the artifacts produced by coverage builds:
https://treeherder.mozilla.org/#/jobs?repo=mozilla-central&filter-searchStr=ccov


## Contact Us

If you want to know more, or want to participate, stop by and say "hi":

  * **Mailing list** - codecover...@mozilla.com
  * **IRC** - #codecoverage  


## Details 

The past six months have been spent working on the minutiae and hard
problems of the CodeCoverage system. There are many platforms, many
suites, a complex build, and a lot of data. Taming that complexity so we
can provide "clean" coverage artifacts is proving arduous. Here are some
examples:

 * **translating filenames** - Our build generates source files, and
copies files to the appropriate directories before compilation. The
filenames reported in coverage artifacts must be mapped back to the
files we see in the tree. Our coverage processing now does this work.
 * **chunk <-> file mapping** - Our coverage is collected by running
hundreds of tasks, each responsible for running some part of a test
suite, called a "chunk". An interesting question is "if we have a
changeset, and we know which files it changed, can we schedule fewer
chunks?" The chunk<->file mapping allows us to answer that question (a
minimal sketch of the lookup appears after this list). It turns out the
answer is "not really".
 * **windows coverage** - If you visit Treeherder you will see we are
collecting coverage on Windows (using clang-cl). This allows us to see
platform-specific lines.
 * **grcov improvements** - grcov is now faster than before, and can
process LLVM coverage output.
 * **jsdcov e10s** - We had to verify that coverage was getting
collected from all processes.
 * **frontend refinements** - There has been continued work on the
frontend to meet Release Management's use cases. Another cohort of
UCOSP students has helped fix bugs and add functionality to the
frontend.
 * **variability analysis** - Coverage at Firefox scale has variability
from many different sources. For instance, running the same test on the
same revision multiple times shows differences from one run to the
next. In other cases, tests fail, which also results in different
coverage. We explored these situations in more detail to better
understand what variability we are dealing with.
 * **jsdcov vs jsvm** - Both collect coverage on JavaScript, yet they
appear to be capturing different coverage; we must look into this more.
 * **bugs!** - Many bugs: timing bugs in the tests, and bugs in the
coverage collection/reporting pipeline.
 * **android experiments** - Android will be given more focus in the
coming year, so we should look at how coverage can help there.
 * **backend improvements** - Work that must be done to deal with data
volume, but is barely visible to the end user. 
 * **backend moved to production** - CodeCoverage has moved to
production, and this took time.
 * **subtracting baselines** - We reduce our coverage artifact size by
subtracting no-test coverage from test coverage (a sketch of this
subtraction also follows the list). In theory, after subtraction, the
coverage represents what is uniquely used by the test, with all of the
standard browser/test startup and shutdown coverage removed. However,
we have found that coverage variability affects these results as well,
and we are now looking into mitigation strategies.
 * **per test coverage** - One of our goals is to collect coverage at
the test level; this can help guide which tests should be run when a
file changes. It may also help us gain a deeper understanding of what
our tests are testing.

## The Future

  * **move off codecov.io** - Our data volume and our use cases do not
fit with codecov.io. We will use our own database to store coverage
artifacts, which will enable us to do more.
  * **per test coverage** - One of our goals is to collect coverage at
the test level; this can help guide which tests should be run when a
file changes. It might also help us understand which tests are
important (e.g. for evaluating whether to disable or fix intermittent
tests).
  * **compare coverage across revisions** - We want to mark up the
coverage so a reasonable comparison can be made between different
revisions. This should mitigate much of our coverage variability:
https://github.com/mozilla/TUID/blob/dev/docs/CodeCoverage%20TUID.md


## Meetings

We have weekly CodeCoverage meetings, and you are welcome to attend:

  * **When** - Held every Friday @ 11:30 EDT (08:30 PDT)
  * **Where** - Kyle's video room
https://v.mozilla.com/flex.html?roomdirect.html&key=huhL8WaTwCwC
  * **Meeting Notes** -
https://docs.google.com/document/d/1TdhaA25S80fxHmuQke9_qU5kKduivHftIuLuh7RXgzY/edit#

 
