On Thu, 11 Sep 2008, Coey Minear wrote:
Actually, based on my reading of the DBI reference, you should be able
to do the following to create a table (although possibly not temporary):
dbWriteTable(connection, "r_user_ids", r)
Then you can use the following to drop the table:
dbRemoveTable(connection, "r_user_ids")
Just a small correction:
start with
s <- paste(r$userid,collapse=",")
and not
s <- paste(r$userid,sep=",")
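The reason: with a single vector, `sep` has no effect — `paste` returns one string per element — while `collapse` joins all elements into one string. A quick illustration (the id values are made up):

```r
ids <- c(1001, 1002, 1003)

# sep joins corresponding elements of *multiple* vectors; with a single
# vector it leaves the elements separate:
paste(ids, sep = ",")       # c("1001", "1002", "1003")

# collapse joins all elements into one string:
paste(ids, collapse = ",")  # "1001,1002,1003"
```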
--- On Fri, 12/9/08, Moshe Olshansky <[EMAIL PROTECTED]> wrote:
> From: Moshe Olshansky <[EMAIL PROTECTED]>
> Subject: Re: [R] database table merging tips with R
Regards,
Moshe.
--- On Fri, 12/9/08, Avram Aelony <[EMAIL PROTECTED]> wrote:
> From: Avram Aelony <[EMAIL PROTECTED]>
> Subject: [R] database table merging tips with R
> To: [EMAIL PROTECTED]
> Received: Friday, 12 September, 2008, 4:33 AM
> Dear R list,
>
> What is the best way to efficiently marry an R dataset with a very large
> (Oracle) database table?
At some point I'd like to try to compile the DBI-based ROracle package as well.
For now though, I'll stick with RODBC as it seems to do what I need. I
believe RODBC is not based on DBI, but that shouldn't preclude using the
pre-built subquery option, which is a great idea and should work regardless.
Avram Aelony writes:
>
> I have not devoted time to setting up ROracle since binaries are
> not available and it seems to require some effort to compile (see
> http://cran.r-project.org/web/packages/ROracle/index.html). On the
> other hand, RODBC worked more or less magically once I set up the data sources.
I have not devoted time to setting up ROracle since binaries are not available
and it seems to require some effort to compile (see
http://cran.r-project.org/web/packages/ROracle/index.html). On the other hand,
RODBC worked more or less magically once I set up the data sources.
What is your suggestion?
While the subquery with a temporary table is probably the better
option, you could just manually generate the subquery and pass it in
with the query. As an example, if you have user_ids 1000-1005,
instead of having "... where user_id in (select user_id from
r_user_id)", you would have "... where user_id in (1000,1001,1002,1003,1004,1005)".
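A minimal sketch of generating that literal IN list in R (the table and column names here are hypothetical):

```r
userids <- 1000:1005

# Collapse the ids into one comma-separated string ...
in_list <- paste(userids, collapse = ",")

# ... and splice it into the query text:
query <- paste0("select * from big_table where user_id in (", in_list, ")")
# "select * from big_table where user_id in (1000,1001,1002,1003,1004,1005)"
```

One caveat: Oracle limits a literal IN list to 1000 elements, so for 2000 ids the list would have to be split into chunks joined with OR — another point in favor of the temporary-table approach.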
Aaron Mackey writes:
> I guess I'd do it something like this:
>
> dbGetQuery(con, "CREATE TEMPORARY TABLE foo ( etc etc)")
> sapply(userids, function (x) { dbGetQuery(con, paste("INSERT INTO foo
> (userid) VALUES (", x, ")")) })
>
> then later:
>
> dbGetQuery(con, "DROP TABLE foo");
>
I guess I'd do it something like this:
dbGetQuery(con, "CREATE TEMPORARY TABLE foo ( etc etc )")
sapply(userids, function(x) {
  dbGetQuery(con, paste("INSERT INTO foo (userid) VALUES (", x, ")"))
})
then later:
dbGetQuery(con, "DROP TABLE foo");
-Aaron
On Thu, Sep 11, 2008 at 3:21 PM, Avram Aelony <[EMAIL PROTECTED]> wrote:
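The insert loop can also be written so the SQL statements are built first and sent afterwards, which makes them easy to inspect before touching the database; a sketch assuming integer ids and a DBI connection `con` opened elsewhere:

```r
userids <- 1000:1005

# Build one INSERT statement per id (sprintf vectorizes over userids):
stmts <- sprintf("INSERT INTO foo (userid) VALUES (%d)", userids)
stmts[1]  # "INSERT INTO foo (userid) VALUES (1000)"

# Then send them through the open connection:
# for (s in stmts) dbGetQuery(con, s)
```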
Perhaps I will need to create a temp table, but I am asking if there is a way
to avoid it. It would be great if there were a way to tie the R data frame
temporarily to the query in a transparent fashion. If not, I will see if I can
create/drop the temp table directly from sqlQuery.
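For the RODBC route, the create/drop round trip would look roughly like this. This is a sketch only: the DSN, table, and column definition are assumptions, and the `sqlQuery`/`sqlSave` calls are commented out because they need a live channel:

```r
# Assumes an RODBC channel opened elsewhere, e.g.:
# library(RODBC); ch <- odbcConnect("my_oracle_dsn")

# Oracle's flavor of a temporary table: the definition persists, but the
# rows are private to the session.
create_sql <- paste(
  "CREATE GLOBAL TEMPORARY TABLE r_user_ids (userid NUMBER)",
  "ON COMMIT PRESERVE ROWS")
drop_sql <- "DROP TABLE r_user_ids"

# sqlQuery(ch, create_sql)
# sqlSave(ch, r, tablename = "r_user_ids", append = TRUE, rownames = FALSE)
# ... join against r_user_ids in the main query ...
# sqlQuery(ch, drop_sql)
```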
-Avram
Sorry, I see now you want to avoid this, but you did ask what was the
"best way to efficiently ...", and the temp. table solution certainly
matches your description. What's wrong with using a temporary table?
-Aaron
On Thu, Sep 11, 2008 at 3:05 PM, Aaron Mackey <[EMAIL PROTECTED]> wrote:
> I wou
I would load your set of userids into a temporary table in Oracle,
then join that table with the rest of your SQL query to get only the
matching rows out.
-Aaron
On Thu, Sep 11, 2008 at 2:33 PM, Avram Aelony <[EMAIL PROTECTED]> wrote:
>
> Dear R list,
>
> What is the best way to efficiently marry an R dataset with a very large
> (Oracle) database table?
Dear R list,
What is the best way to efficiently marry an R dataset with a very large
(Oracle) database table?
The goal is to only return Oracle table rows that match IDs present in the R
dataset.
I have an R data frame with 2000 user IDs analogous to: r =
data.frame(userid=round(runif(2000, 1, 100000)))