Website cleanup

* Removed wikisearch.md as content was moved to README.md of wikisearch repo
* Removed glyphicons files as these are now served by a CDN
* Updated wikisearch links to point to wikisearch mirror on GitHub
* Moved downloads.md to pages/
* Updated related-projects.md


Project: http://git-wip-us.apache.org/repos/asf/accumulo-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/accumulo-website/commit/62b91e01
Tree: http://git-wip-us.apache.org/repos/asf/accumulo-website/tree/62b91e01
Diff: http://git-wip-us.apache.org/repos/asf/accumulo-website/diff/62b91e01

Branch: refs/heads/master
Commit: 62b91e01736804d223a0ed61855dbfaa44059ef3
Parents: a71056c
Author: Mike Walch <mwa...@apache.org>
Authored: Mon Dec 12 15:58:50 2016 -0500
Committer: Mike Walch <mwa...@apache.org>
Committed: Wed Dec 14 09:52:25 2016 -0500

----------------------------------------------------------------------
 contributor/contrib-projects.md          |   6 +-
 downloads/index.md                       | 206 ------------------
 example/wikisearch.md                    | 223 --------------------
 fonts/glyphicons-halflings-regular.eot   | Bin 20127 -> 0 bytes
 fonts/glyphicons-halflings-regular.svg   | 288 --------------------------
 fonts/glyphicons-halflings-regular.ttf   | Bin 45404 -> 0 bytes
 fonts/glyphicons-halflings-regular.woff  | Bin 23424 -> 0 bytes
 fonts/glyphicons-halflings-regular.woff2 | Bin 18028 -> 0 bytes
 pages/downloads.md                       | 207 ++++++++++++++++++
 pages/related-projects.md                |   8 +-
 10 files changed, 214 insertions(+), 724 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/accumulo-website/blob/62b91e01/contributor/contrib-projects.md
----------------------------------------------------------------------
diff --git a/contributor/contrib-projects.md b/contributor/contrib-projects.md
index 360cf0c..b6bca4c 100644
--- a/contributor/contrib-projects.md
+++ b/contributor/contrib-projects.md
@@ -33,8 +33,8 @@ stack. The Wikisearch application provides an example of indexing and querying
 Wikipedia data within Accumulo. It is a great place to start if you want to get
 familiar with good development practices building on Accumulo. 
 
-For details on setting up the application, see the project's README. You can
-also read [an overview and some performance numbers][wikisearch].
+For details on setting up the application, see the project's [README.md][wiki-readme]
+where you can also read an overview and some performance numbers.
 
 The Apache Accumulo Wikisearch Example uses [Git][gitbook] version control
 ([browse][wikisearch-browse]|[checkout][wikisearch-checkout]). It builds with
@@ -74,7 +74,7 @@ codebase][examples-simple] and the related published documentation for versions
 [instamo-browse]: https://git-wip-us.apache.org/repos/asf?p=accumulo-instamo-archetype.git;a=summary
 [instamo-checkout]: https://git-wip-us.apache.org/repos/asf/accumulo-instamo-archetype.git
 [maven-proj]: https://maven.apache.org
-[wikisearch]: example/wikisearch
+[wiki-readme]: https://github.com/apache/accumulo-wikisearch/blob/master/README.md
 [wikisearch-browse]: https://git-wip-us.apache.org/repos/asf?p=accumulo-wikisearch.git;a=summary
 [wikisearch-checkout]: https://git-wip-us.apache.org/repos/asf/accumulo-wikisearch.git
 [bsp-alg]: https://hama.apache.org/hama_bsp_tutorial
 [bsp-alg]: https://hama.apache.org/hama_bsp_tutorial

http://git-wip-us.apache.org/repos/asf/accumulo-website/blob/62b91e01/downloads/index.md
----------------------------------------------------------------------
diff --git a/downloads/index.md b/downloads/index.md
deleted file mode 100644
index cf2c9c3..0000000
--- a/downloads/index.md
+++ /dev/null
@@ -1,206 +0,0 @@
----
-title: Downloads
----
-
-<script type="text/javascript">
-/**
-* Function that tracks a click on an outbound link in Google Analytics.
-* This function takes a valid URL string as an argument, and uses that URL string
-* as the event label.
-*/
-var gaCallback = function(event) {
-  var hrefUrl = event.target.getAttribute('href')
-  if (event.ctrlKey || event.shiftKey || event.metaKey || event.which == 2) {
-    var newWin = true;}
-
-  // $(this) != this
-  var url = window.location.protocol + "//accumulo.apache.org" + $(this).attr("id")
-  if (newWin) {
-    ga('send', 'event', 'outbound', 'click', url, {'nonInteraction': 1});
-    return true;
-  } else {
-    ga('send', 'event', 'outbound', 'click', url, {'hitCallback':
-    function () {window.location.href = hrefUrl;}}, {'nonInteraction': 1});
-    return false;
-  }
-};
-
-$( document ).ready(function() {
-  if (ga.hasOwnProperty('loaded') && ga.loaded === true) {
-    $('.download_external').click(gaCallback);
-  }
-});
-
-var updateLinks = function(mirror) {
-  $('a[link-suffix]').each(function(i, obj) {
-    $(obj).attr('href', mirror.replace(/\/+$/, "") + $(obj).attr('link-suffix'));
-  });
-};
-
-var mirrorsCallback = function(json) {
-  var htmlContent = '<div class="row"><div class="col-md-3"><h5>Select an Apache download mirror:</h5></div>' +
-    '<div class="col-md-5"><select class="form-control" id="apache-mirror-select">';
-  htmlContent += '<optgroup label="Preferred Mirror (based on location)">';
-  htmlContent += '<option selected="selected">' + json.preferred + '</option>';
-  htmlContent += '</optgroup>';
-  htmlContent += '<optgroup label="HTTP Mirrors">';
-  for (var i = 0; i < json.http.length; i++) {
-    htmlContent += '<option>' + json.http[i] + '</option>';
-  }
-  htmlContent += '</optgroup>';
-  htmlContent += '<optgroup label="FTP Mirrors">';
-  for (var i = 0; i < json.ftp.length; i++) {
-    htmlContent += '<option>' + json.ftp[i] + '</option>';
-  }
-  htmlContent += '</optgroup>';
-  htmlContent += '<optgroup label="Backup Mirrors">';
-  for (var i = 0; i < json.backup.length; i++) {
-    htmlContent += '<option>' + json.backup[i] + '</option>';
-  }
-  htmlContent += '</optgroup>';
-  htmlContent += '</select></div></div>';
-
-  $("#mirror_selection").html(htmlContent);
-
-  $( "#apache-mirror-select" ).change(function() {
-    var mirror = $("#apache-mirror-select option:selected").text();
-    updateLinks(mirror);
-  });
-
-  updateLinks(json.preferred);
-};
-
-// get mirrors when page is ready
-var mirrorURL = window.location.protocol + "//accumulo.apache.org/mirrors.cgi"; // http[s]://accumulo.apache.org/mirrors.cgi
-$(function() { $.getJSON(mirrorURL + "?as_json", mirrorsCallback); });
-
-</script>
-
-<div id="mirror_selection"></div>
-
-Be sure to verify your downloads by these [procedures][VERIFY_PROCEDURES] using these [KEYS][GPG_KEYS].
-
-## Current Releases
-
-### 1.8.0 **latest**{: .label .label-primary }
-
-The most recent Apache Accumulo&trade; release is version 1.8.0. See the [release notes][REL_NOTES_18] and [CHANGES][CHANGES_18].
-
-For convenience, [MD5][MD5SUM_18] and [SHA1][SHA1SUM_18] hashes are also available.
-
-{: .table }
-| **Generic Binaries** | [accumulo-1.8.0-bin.tar.gz][BIN_18] | [ASC][ASC_BIN_18] |
-| **Source**           | [accumulo-1.8.0-src.tar.gz][SRC_18] | [ASC][ASC_SRC_18] |
-
-#### 1.8 Documentation
-* [README][README_18]
-* [HTML User Manual][MANUAL_HTML_18]
-* [Examples][EXAMPLES_18]
-* [Javadoc][JAVADOC_18]
-
-
-### 1.7.2
-
-The most recent 1.7.x release of Apache Accumulo&trade; is version 1.7.2. See the [release notes][REL_NOTES_17] and [CHANGES][CHANGES_17].
-
-For convenience, [MD5][MD5SUM_17] and [SHA1][SHA1SUM_17] hashes are also available.
-
-{: .table }
-| **Generic Binaries** | [accumulo-1.7.2-bin.tar.gz][BIN_17] | [ASC][ASC_BIN_17] |
-| **Source**           | [accumulo-1.7.2-src.tar.gz][SRC_17] | [ASC][ASC_SRC_17] |
-
-#### 1.7 Documentation
-* [README][README_17]
-* [HTML User Manual][MANUAL_HTML_17]
-* [Examples][EXAMPLES_17]
-* [Javadoc][JAVADOC_17]
-
-### 1.6.6
-
-The most recent 1.6.x release of Apache Accumulo&trade; is version 1.6.6. See the [release notes][REL_NOTES_16] and [CHANGES][CHANGES_16].
-
-For convenience, [MD5][MD5SUM_16] and [SHA1][SHA1SUM_16] hashes are also available.
-
-{: .table }
-| **Generic Binaries** | [accumulo-1.6.6-bin.tar.gz][BIN_16] | [ASC][ASC_BIN_16] |
-| **Source**           | [accumulo-1.6.6-src.tar.gz][SRC_16] | [ASC][ASC_SRC_16] |
-
-#### 1.6 Documentation
-* [README][README_16]
-* [PDF manual][MANUAL_PDF_16]
-* [html manual][MANUAL_HTML_16]
-* [examples][EXAMPLES_16]
-* [Javadoc][JAVADOC_16]
-
-## Older releases
-
-Older releases can be found in the [archives][ARCHIVES].
-
-
-[VERIFY_PROCEDURES]: https://www.apache.org/info/verification "Verify"
-[GPG_KEYS]: https://www.apache.org/dist/accumulo/KEYS "KEYS"
-[ARCHIVES]: https://archive.apache.org/dist/accumulo
-
-[ASC_BIN_16]: https://www.apache.org/dist/accumulo/1.6.6/accumulo-1.6.6-bin.tar.gz.asc
-[ASC_SRC_16]: https://www.apache.org/dist/accumulo/1.6.6/accumulo-1.6.6-src.tar.gz.asc
-
-[ASC_BIN_17]: https://www.apache.org/dist/accumulo/1.7.2/accumulo-1.7.2-bin.tar.gz.asc
-[ASC_SRC_17]: https://www.apache.org/dist/accumulo/1.7.2/accumulo-1.7.2-src.tar.gz.asc
-
-[ASC_BIN_18]: https://www.apache.org/dist/accumulo/1.8.0/accumulo-1.8.0-bin.tar.gz.asc
-[ASC_SRC_18]: https://www.apache.org/dist/accumulo/1.8.0/accumulo-1.8.0-src.tar.gz.asc
-
-[BIN_16]: https://www.apache.org/dyn/closer.lua/accumulo/1.6.6/accumulo-1.6.6-bin.tar.gz
-{: .download_external link-suffix="/accumulo/1.6.6/accumulo-1.6.6-bin.tar.gz" id="/downloads/accumulo-1.6.6-bin.tar.gz" }
-[SRC_16]: https://www.apache.org/dyn/closer.lua/accumulo/1.6.6/accumulo-1.6.6-src.tar.gz
-{: .download_external link-suffix="/accumulo/1.6.6/accumulo-1.6.6-src.tar.gz" id="/downloads/accumulo-1.6.6-src.tar.gz" }
-
-[BIN_17]: https://www.apache.org/dyn/closer.lua/accumulo/1.7.2/accumulo-1.7.2-bin.tar.gz
-{: .download_external link-suffix="/accumulo/1.7.2/accumulo-1.7.2-bin.tar.gz" id="/downloads/accumulo-1.7.2-bin.tar.gz" }
-[SRC_17]: https://www.apache.org/dyn/closer.lua/accumulo/1.7.2/accumulo-1.7.2-src.tar.gz
-{: .download_external link-suffix="/accumulo/1.7.2/accumulo-1.7.2-src.tar.gz" id="/downloads/accumulo-1.7.2-src.tar.gz" }
-
-[BIN_18]: https://www.apache.org/dyn/closer.lua/accumulo/1.8.0/accumulo-1.8.0-bin.tar.gz
-{: .download_external link-suffix="/accumulo/1.8.0/accumulo-1.8.0-bin.tar.gz" id="/downloads/accumulo-1.8.0-bin.tar.gz" }
-[SRC_18]: https://www.apache.org/dyn/closer.lua/accumulo/1.8.0/accumulo-1.8.0-src.tar.gz
-{: .download_external link-suffix="/accumulo/1.8.0/accumulo-1.8.0-src.tar.gz" id="/downloads/accumulo-1.8.0-src.tar.gz" }
-
-[README_16]: https://git-wip-us.apache.org/repos/asf?p=accumulo.git;a=blob_plain;f=README;hb=rel/1.6.6
-{: .download_external id="/1.6/README" }
-[README_17]: https://github.com/apache/accumulo/blob/rel/1.7.2/README.md
-{: .download_external id="/1.7/README" }
-[README_18]: https://github.com/apache/accumulo/blob/rel/1.8.0/README.md
-{: .download_external id="/1.8/README" }
-
-[JAVADOC_16]: {{ site.baseurl }}/1.6/apidocs/
-{: .download_external id="/1.6/apidocs/" }
-[JAVADOC_17]: {{ site.baseurl }}/1.7/apidocs/
-{: .download_external id="/1.7/apidocs/" }
-[JAVADOC_18]: {{ site.baseurl }}/1.8/apidocs/
-{: .download_external id="/1.8/apidocs/" }
-
-[MANUAL_PDF_16]: https://search.maven.org/remotecontent?filepath=org/apache/accumulo/accumulo-docs/1.6.6/accumulo-docs-1.6.6-user-manual.pdf
-{: .download_external id="/1.6/accumulo_user_manual.pdf" }
-[MANUAL_HTML_16]: {{ site.baseurl }}/1.6/accumulo_user_manual "1.6 user manual"
-[MANUAL_HTML_17]: {{ site.baseurl }}/1.7/accumulo_user_manual "1.7 user manual"
-[MANUAL_HTML_18]: {{ site.baseurl }}/1.8/accumulo_user_manual "1.8 user manual"
-
-[EXAMPLES_16]: {{ site.baseurl }}/1.6/examples "1.6 examples"
-[EXAMPLES_17]: {{ site.baseurl }}/1.7/examples "1.7 examples"
-[EXAMPLES_18]: {{ site.baseurl }}/1.8/examples "1.8 examples"
-
-[CHANGES_16]: https://issues.apache.org/jira/browse/ACCUMULO/fixforversion/12334846 "1.6.6 CHANGES"
-[CHANGES_17]: https://issues.apache.org/jira/browse/ACCUMULO/fixforversion/12333776 "1.7.2 CHANGES"
-[CHANGES_18]: https://issues.apache.org/jira/browse/ACCUMULO/fixforversion/12329879 "1.8.0 CHANGES"
-
-[REL_NOTES_16]: {{ site.baseurl }}/release/accumulo-1.6.6/ "1.6.6 Release Notes"
-[REL_NOTES_17]: {{ site.baseurl }}/release/accumulo-1.7.2/ "1.7.2 Release Notes"
-[REL_NOTES_18]: {{ site.baseurl }}/release/accumulo-1.8.0/ "1.8.0 Release Notes"
-
-[MD5SUM_16]: https://www.apache.org/dist/accumulo/1.6.6/MD5SUM "1.6.6 MD5 file hashes"
-[MD5SUM_17]: https://www.apache.org/dist/accumulo/1.7.2/MD5SUM "1.7.2 MD5 file hashes"
-[MD5SUM_18]: https://www.apache.org/dist/accumulo/1.8.0/MD5SUM "1.8.0 MD5 file hashes"
-
-[SHA1SUM_16]: https://www.apache.org/dist/accumulo/1.6.6/SHA1SUM "1.6.6 SHA1 file hashes"
-[SHA1SUM_17]: https://www.apache.org/dist/accumulo/1.7.2/SHA1SUM "1.7.2 SHA1 file hashes"
-[SHA1SUM_18]: https://www.apache.org/dist/accumulo/1.8.0/SHA1SUM "1.8.0 SHA1 file hashes"

http://git-wip-us.apache.org/repos/asf/accumulo-website/blob/62b91e01/example/wikisearch.md
----------------------------------------------------------------------
diff --git a/example/wikisearch.md b/example/wikisearch.md
deleted file mode 100644
index 51631f0..0000000
--- a/example/wikisearch.md
+++ /dev/null
@@ -1,223 +0,0 @@
----
-title: The wikipedia search example explained, with performance numbers.
----
-
-## Apache Accumulo Query Performance
-
-## Sample Application
-
-Starting with release 1.4, Accumulo includes an example web application that
-provides a flexible, scalable search over the articles of Wikipedia, a widely
-available medium-sized corpus.
-
-The example uses an indexing technique helpful for doing multiple logical tests
-against content. In this case, we can perform a word search on Wikipedia
-articles. The sample application takes advantage of 3 unique capabilities of
-Accumulo:
-
-1. Extensible iterators that operate within the distributed tablet servers of
-   the key-value store
-1. Custom aggregators which can efficiently condense information during the
-   various life-cycles of the log-structured merge tree 
-1. Custom load balancing, which ensures that a table is evenly distributed on
-   all tablet servers
-
-In the example, Accumulo tracks the cardinality of all terms as elements are
-ingested. If the cardinality is small enough, it will track the set of
-documents by term directly. For example:
-
-<style type="text/css">
-table.wiki, table.wiki td, table.wiki th {
-  padding-right: 5px;
-  padding-left: 5px;
-  border: 1px solid black;
-  border-collapse: collapse;
-}
-table.wiki td {
-  text-align: right;
-}
-</style>
-
-{: .wiki }
-| Row (word) | Value (count) | Value (document list)       |
-|------------|--------------:|:----------------------------|
-| Octopus    | 2             | [Document 57, Document 220] |
-| Other      | 172,849       | []                          |
-| Ostrich    | 1             | [Document 901]              |
-
-Searches can be optimized to focus on low-cardinality terms. To create these
-counts, the example installs "aggregators" which are used to combine inserted
-values. The ingester just writes simple "(Octopus, 1, Document 57)" tuples.
-The tablet servers then used the installed aggregators to merge the cells as
-the data is re-written, or queried. This reduces the in-memory locking
-required to update high-cardinality terms, and defers aggregation to a later
-time, where it can be done more efficiently.
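
[Editor's note: the combining step described above can be approximated with Accumulo's stock SummingCombiner. This is a hedged sketch of the technique, not the wikisearch code itself (which installs its own aggregators and also condenses document lists); the `wikiIndex` table name, the `count` column family, and the iterator priority are assumptions for illustration.]

```java
import java.util.Collections;

import org.apache.accumulo.core.client.BatchWriter;
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.IteratorSetting;
import org.apache.accumulo.core.data.Mutation;
import org.apache.accumulo.core.iterators.LongCombiner;
import org.apache.accumulo.core.iterators.user.SummingCombiner;

public class TermCountSketch {
  // Attach a summing combiner so per-document "1" cells are merged at compaction
  // and scan time instead of being locked and updated in memory at ingest.
  static void configureCombiner(Connector conn) throws Exception {
    IteratorSetting setting = new IteratorSetting(10, "termCount", SummingCombiner.class);
    SummingCombiner.setEncodingType(setting, LongCombiner.Type.STRING);
    SummingCombiner.setColumns(setting,
        Collections.singletonList(new IteratorSetting.Column("count")));
    conn.tableOperations().attachIterator("wikiIndex", setting); // table name assumed
  }

  // The ingester only ever writes a bare "1" under the same key; the tablet
  // servers sum them later. (The real example also tracks the small document
  // lists with a custom aggregator, omitted here.)
  static void recordTerm(BatchWriter writer, String term) throws Exception {
    Mutation m = new Mutation(term);   // row = term, e.g. "Octopus"
    m.put("count", "", "1");
    writer.addMutation(m);
  }
}
```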
-
-The example also creates a reverse word index to map each word to the document
-in which it appears. But it does this by choosing an arbitrary partition for
-the document. The article, and the word index for the article are grouped
-together into the same partition. For example:
-
-{: .wiki }
-| Row (partition) | Column Family | Column Qualifier | Value           |
-|-----------------|---------------|------------------|-----------------|
-| 1               | D             | Document 57      | "smart Octopus" |
-| 1               | Word, Octopus | Document 57      |                 |
-| 1               | Word, smart   | Document 57      |                 |
-| ...             |               |                  |                 |
-| 2               | D             | Document 220     | "big Octopus"   |
-| 2               | Word, big     | Document 220     |                 |
-| 2               | Word, Octopus | Document 220     |                 |
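
[Editor's note: a minimal sketch of the partitioned layout shown in the table above, written with plain Mutations; the `wiki` table name and the `d`/`word,<term>` column families follow the illustration rather than the real wikisearch schema.]

```java
import org.apache.accumulo.core.client.BatchWriter;
import org.apache.accumulo.core.client.BatchWriterConfig;
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.data.Mutation;

public class PartitionedIngestSketch {
  // Write a document and its reverse word index into the same partition row,
  // so a query can match terms and fetch the document without a second lookup.
  static void ingest(Connector conn, String partition, String docId, String text)
      throws Exception {
    BatchWriter writer = conn.createBatchWriter("wiki", new BatchWriterConfig()); // table assumed
    Mutation m = new Mutation(partition);      // row = partition id, e.g. "1"
    m.put("d", docId, text);                   // the document itself
    for (String word : text.toLowerCase().split("\\s+")) {
      m.put("word," + word, docId, "");        // index entry colocated with the document
    }
    writer.addMutation(m);
    writer.close();
  }
}
```

For example, `ingest(conn, "1", "Document 57", "smart Octopus")` would reproduce the first partition shown above.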
-
-Of course, there would be large numbers of documents in each partition, and the
-elements of those documents would be interlaced according to their sort order.
-
-By dividing the index space into partitions, the multi-word searches can be
-performed in parallel across all the nodes. Also, by grouping the document
-together with its index, a document can be retrieved without a second request
-from the client. The query "octopus" and "big" will be performed on all the
-servers, but only those partitions for which the low-cardinality term "octopus"
-can be found by using the aggregated reverse index information. The query for a
-document is performed by extensions provided in the example. These extensions
-become part of the tablet server's iterator stack. By cloning the underlying
-iterators, the query extensions can seek to specific words within the index,
-and when it finds a matching document, it can then seek to the document
-location and retrieve the contents.
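
[Editor's note: a bare-bones version of that intersection query can be built from Accumulo's stock IntersectingIterator, which ANDs terms within each partition row. The `wiki` table name, the `word,<term>` family naming, and the empty authorizations are assumptions; the real example layers its own iterator extensions on top to also fetch the matching document.]

```java
import java.util.Collections;
import java.util.Map.Entry;

import org.apache.accumulo.core.client.BatchScanner;
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.IteratorSetting;
import org.apache.accumulo.core.data.Key;
import org.apache.accumulo.core.data.Range;
import org.apache.accumulo.core.data.Value;
import org.apache.accumulo.core.iterators.user.IntersectingIterator;
import org.apache.accumulo.core.security.Authorizations;
import org.apache.hadoop.io.Text;

public class IntersectionQuerySketch {
  // Find documents containing all terms; the iterator performs the AND inside
  // each tablet server, and the BatchScanner covers all partitions in parallel.
  static void query(Connector conn, String... words) throws Exception {
    Text[] terms = new Text[words.length];
    for (int i = 0; i < words.length; i++) {
      terms[i] = new Text("word," + words[i]);  // family naming assumed, as above
    }
    IteratorSetting setting = new IteratorSetting(20, "andTerms", IntersectingIterator.class);
    IntersectingIterator.setColumnFamilies(setting, terms);

    BatchScanner scanner = conn.createBatchScanner("wiki", Authorizations.EMPTY, 8);
    scanner.setRanges(Collections.singleton(new Range()));  // all partitions
    scanner.addScanIterator(setting);
    for (Entry<Key,Value> match : scanner) {
      // the column qualifier of each returned key is a matching document id
      System.out.println(match.getKey().getColumnQualifier());
    }
    scanner.close();
  }
}
```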
-
-We loaded the example on a cluster of 10 servers, each with 12 cores, and 32G
-RAM, 6 500G drives. Accumulo tablet servers were allowed a maximum of 3G of
-working memory, of which 2G was dedicated to caching file data.
-
-Following the instructions in the example, the Wikipedia XML data for articles
-was loaded for English, Spanish and German languages into 10 partitions. The
-data is not partitioned by language: multiple languages were used to get a
-larger set of test data. The data load took around 8 hours, and has not been
-optimized for scale. Once the data was loaded, the content was compacted which
-took about 35 minutes.
-
-The example uses the language-specific tokenizers available from the Apache
-Lucene project for Wikipedia data.
-
-Original files:
-
-{: .wiki }
-| Articles | Compressed size | Filename                               |
-|----------|-----------------|----------------------------------------|
-| 1.3M     | 2.5G            | dewiki-20111120-pages-articles.xml.bz2 |
-| 3.8M     | 7.9G            | enwiki-20111115-pages-articles.xml.bz2 |
-| 0.8M     | 1.4G            | eswiki-20111112-pages-articles.xml.bz2 |
-
-The resulting tables:
-
-    > du -p wiki.*
-          47,325,680,634 [wiki]
-           5,125,169,305 [wikiIndex]
-                     413 [wikiMetadata]
-           5,521,690,682 [wikiReverseIndex]
-
-Roughly a 6:1 increase in size.
-
-We performed the following queries, and repeated the set 5 times. The query
-language is much more expressive than what is shown below. The actual query
-specified that these words were to be found in the body of the article. Regular
-expressions, searches within titles, negative tests, etc are available.
-
-{: .wiki }
-| Query                                   | Sample 1 (seconds) | Sample 2 (seconds) | Sample 3 (seconds) | Sample 4 (seconds) | Sample 5 (seconds) | Matches | Result Size |
-|-----------------------------------------|------|------|------|------|------|--------|-----------|
-| "old" and "man" and "sea"               | 4.07 | 3.79 | 3.65 | 3.85 | 3.67 | 22,956 | 3,830,102 |
-| "paris" and "in" and "the" and "spring" | 3.06 | 3.06 | 2.78 | 3.02 | 2.92 | 10,755 | 1,757,293 |
-| "rubber" and "ducky" and "ernie"        | 0.08 | 0.08 | 0.1  | 0.11 | 0.1  | 6      | 808       |
-| "fast" and ( "furious" or "furriest")   | 1.34 | 1.33 | 1.3  | 1.31 | 1.31 | 2,973  | 493,800   |
-| "slashdot" and "grok"                   | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 14     | 2,371     |
-| "three" and "little" and "pigs"         | 0.92 | 0.91 | 0.9  | 1.08 | 0.88 | 2,742  | 481,531   |
-
-Because the terms are tested together within the tablet server, even fairly
-high-cardinality terms such as "old," "man," and "sea" can be tested
-efficiently, without needing to return to the client, or make distributed calls
-between servers to perform the intersection between terms.
-
-For reference, here are the cardinalities for all the terms in the query
-(remember, this is across all languages loaded):
-
-{: .wiki }
-| Term     | Cardinality |
-|----------|-------------|
-| ducky    | 795         |
-| ernie    | 13,433      |
-| fast     | 166,813     |
-| furious  | 10,535      |
-| furriest | 45          |
-| grok     | 1,168       |
-| in       | 1,884,638   |
-| little   | 320,748     |
-| man      | 548,238     |
-| old      | 720,795     |
-| paris    | 232,464     |
-| pigs     | 8,356       |
-| rubber   | 17,235      |
-| sea      | 247,231     |
-| slashdot | 2,343       |
-| spring   | 125,605     |
-| the      | 3,509,498   |
-| three    | 718,810     |
-
-Accumulo supports caching index information, which is turned on by default, and
-for the non-index blocks of a file, which is not. After turning on data block
-  caching for the wiki table:
-
-{: .wiki }
-| Query                                   | Sample 1 (seconds) | Sample 2 (seconds) | Sample 3 (seconds) | Sample 4 (seconds) | Sample 5 (seconds) |
-|-----------------------------------------|------|------|------|------|------|
-| "old" and "man" and "sea"               | 2.47 | 2.48 | 2.51 | 2.48 | 2.49 |
-| "paris" and "in" and "the" and "spring" | 1.33 | 1.42 | 1.6  | 1.61 | 1.47 |
-| "rubber" and "ducky" and "ernie"        | 0.07 | 0.08 | 0.07 | 0.07 | 0.07 |
-| "fast" and ( "furious" or "furriest")   | 1.28 | 0.78 | 0.77 | 0.79 | 0.78 |
-| "slashdot" and "grok"                   | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 |
-| "three" and "little" and "pigs"         | 0.55 | 0.32 | 0.32 | 0.31 | 0.27 |
-
-For comparison, these are the cold start lookup times (restart Accumulo, and
-drop the operating system disk cache):
-
-{: .wiki }
-| Query                                   | Sample |
-|-----------------------------------------|--------|
-| "old" and "man" and "sea"               | 13.92  |
-| "paris" and "in" and "the" and "spring" | 8.46   |
-| "rubber" and "ducky" and "ernie"        | 2.96   |
-| "fast" and ( "furious" or "furriest")   | 6.77   |
-| "slashdot" and "grok"                   | 4.06   |
-| "three" and "little" and "pigs"         | 8.13   |
-
-### Random Query Load
-
-Random queries were generated using common english words. A uniform random
-sample of 3 to 5 words taken from the 10000 most common words in the Project
-Gutenberg's online text collection were joined with "and". Words containing
-anything other than letters (such as contractions) were not used. A client was
-started simultaneously on each of the 10 servers and each ran 100 random
-queries (1000 queries total).
-
-{: .wiki }
-| Time  | Count   |
-|-------|---------|
-| 41.97 | 440,743 |
-| 41.61 | 320,522 |
-| 42.11 | 347,969 |
-| 38.32 | 275,655 |
-
-### Query Load During Ingest
-
-The English wikipedia data was re-ingested on top of the existing, compacted
-data. The following query samples were taken in 5 minute intervals while
-ingesting 132 articles/second:
-
-{: .wiki }
-| Query                                   | Sample 1 (seconds)  | Sample 2 (seconds) | Sample 3 (seconds) | Sample 4 (seconds) | Sample 5 (seconds) |
-|-----------------------------------------|------|------|-------|------|-------|
-| "old" and "man" and "sea"               | 4.91 | 3.92 | 11.58 | 9.86 | 10.21 |
-| "paris" and "in" and "the" and "spring" | 5.03 | 3.37 | 12.22 | 3.29 | 9.46  |
-| "rubber" and "ducky" and "ernie"        | 4.21 | 2.04 | 8.57  | 1.54 | 1.68  |
-| "fast" and ( "furious" or "furriest")   | 5.84 | 2.83 | 2.56  | 3.12 | 3.09  |
-| "slashdot" and "grok"                   | 5.68 | 2.62 | 2.2   | 2.78 | 2.8   |
-| "three" and "little" and "pigs"         | 7.82 | 3.42 | 2.79  | 3.29 | 3.3   |

http://git-wip-us.apache.org/repos/asf/accumulo-website/blob/62b91e01/fonts/glyphicons-halflings-regular.eot
----------------------------------------------------------------------
diff --git a/fonts/glyphicons-halflings-regular.eot b/fonts/glyphicons-halflings-regular.eot
deleted file mode 100644
index b93a495..0000000
Binary files a/fonts/glyphicons-halflings-regular.eot and /dev/null differ
