I set up two virtual data centers, one for analytics and one for the REST
service. The analytics data center sits on top of a Hadoop cluster. I want
to bulk load my ETL results into the analytics data center so that the REST
service doesn't take the heavy load. I'm using CQLTableInputFormat in my
Spark application ...
CQLSSTableWriter only accepts an INSERT or UPDATE statement. I'm wondering
whether it could be made to accept DELETE statements as well.
I need to update my Cassandra table with a lot of data every day.
* I may need to delete a row (given the partition key)
* I may need to delete some columns. For example, there are 2 ...
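For reference, here is a minimal sketch of the bulk-load path described above using CQLSSTableWriter (the analytics.etl_results table, its columns, and the output directory are made-up placeholders). The writer only takes an INSERT or UPDATE statement, so the row and column deletes listed above still need to go through a normal client:

    import org.apache.cassandra.io.sstable.CQLSSTableWriter;

    public class EtlBulkWriter {
        public static void main(String[] args) throws Exception {
            // Placeholder schema; substitute the real table definition.
            String schema = "CREATE TABLE analytics.etl_results ("
                          + " id text PRIMARY KEY,"
                          + " value bigint)";
            String insert = "INSERT INTO analytics.etl_results (id, value) VALUES (?, ?)";

            // SSTable files are written into this directory; they can then be
            // streamed into the cluster with sstableloader.
            CQLSSTableWriter writer = CQLSSTableWriter.builder()
                    .inDirectory("/tmp/analytics/etl_results")
                    .forTable(schema)
                    .using(insert)
                    .build();

            writer.addRow("row-1", 42L);
            writer.addRow("row-2", 7L);
            writer.close();
        }
    }

Once the writer is closed, the resulting directory can be streamed into the analytics data center with something like: sstableloader -d <analytics node> /tmp/analytics/etl_results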
See https://bintray.com/docs/api/#_debian_upload for how to upload/sign
artifacts with the Bintray API. Your release managers will need to get
accounts on Bintray and then open INFRA tickets to get added to the
Cassandra team in the Apache org there, at which point you'll have full
admin rights over ...
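In case your release managers want to script this, the upload is roughly a single authenticated PUT per .deb, assuming the content-upload endpoint and the X-Bintray-Debian-* headers from the docs linked above. Everything below (the subject/repo/package/version/path, the USERNAME:API_KEY credentials, and the distribution/component/architecture values) is a placeholder, so check those docs for the exact parameters:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Base64;

    public class BintrayDebUpload {
        public static void main(String[] args) throws Exception {
            // Placeholder coordinates: subject/repo/package/version/target path.
            String target = "https://api.bintray.com/content/apache/cassandra/"
                          + "cassandra/2.1.2/pool/main/c/cassandra/cassandra_2.1.2_all.deb";

            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            conn.setRequestMethod("PUT");
            conn.setDoOutput(true);

            // Basic auth with the release manager's Bintray username and API key.
            String auth = Base64.getEncoder()
                    .encodeToString("USERNAME:API_KEY".getBytes("UTF-8"));
            conn.setRequestProperty("Authorization", "Basic " + auth);

            // Debian metadata so Bintray can index the package into the repo.
            conn.setRequestProperty("X-Bintray-Debian-Distribution", "21x");
            conn.setRequestProperty("X-Bintray-Debian-Component", "main");
            conn.setRequestProperty("X-Bintray-Debian-Architecture", "all");

            try (OutputStream out = conn.getOutputStream()) {
                out.write(Files.readAllBytes(Paths.get("cassandra_2.1.2_all.deb")));
            }
            System.out.println("Upload response: HTTP " + conn.getResponseCode());
        }
    }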
We have a distro per minor version (we currently only use the last 3 major
releases: 21x, 20x and 12x). You can see them here:
https://dist.apache.org/repos/dist/release/cassandra/debian/dists/
The release managers have a local repo with our reprepro settings. We
deploy the artifacts ...
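For anyone not familiar with reprepro, the per-series layout above corresponds roughly to one stanza per codename in conf/distributions, along these lines (field values and the signing key here are illustrative, not our actual settings):

    Origin: Apache Cassandra
    Label: Cassandra
    Codename: 21x
    Architectures: i386 amd64 source
    Components: main
    Description: Apache Cassandra 2.1.x packages
    SignWith: <release key id>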
Great! So to start, can you give us a sense of what your needs for the repo
are? What are you putting in it? Are you splitting it up by distro at all
or just putting new versions of Cassandra in each time?
A.
On Wed, Jan 7, 2015 at 7:15 AM, Jake Luciani wrote:
> Hi Andrew,
>
> I'm happy to help
Hi Andrew,
I'm happy to help.
-Jake
On Tue, Jan 6, 2015 at 2:19 PM, Andrew Bayer wrote:
> Hi Cassandra team -
>
> So as you're probably aware, the Cassandra Debian packages currently on
> dist need to be moved off there. The Infra team has been working on a
> solution for that - we've got an o...