Dear Thomas,
Thank you very much for the answer!
Yet why does this happen only for some models, not all models? That
is, why can it drop some variables for other models but not for this
one?
Thanks!!
Best regards,
Maggie
On Wed, Mar 18, 2009 at 3:38 PM, Thomas Lumley wrote:
Hi all,
I am struggling with date formats and calculating different operations, like
the difference between 2 dates and
getting the minimum of a vector of dates.
That is, I am working with dates in the format "6/22/1992 12:00:00 AM"
and the vector is
[1] 6/4/1992 12:00:00 AM 2/13/1992 12:00:00 AM6
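A minimal sketch of the kind of thing that usually works here, assuming the strings really follow the month/day/year, 12-hour-clock pattern shown above (the format string and the chosen units are assumptions):
dts <- c("6/4/1992 12:00:00 AM", "2/13/1992 12:00:00 AM")
d <- as.POSIXct(dts, format = "%m/%d/%Y %I:%M:%S %p")  # %p needs a locale that knows AM/PM
difftime(d[1], d[2], units = "days")   # difference between two dates
min(d)                                 # earliest date in the vector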
G'day Kevin,
On Wed, 18 Mar 2009 21:46:51 -0700
wrote:
> I was trying to find source for optimize and I ran across
>
> function (f, interval, ..., lower = min(interval), upper =
> max(interval), maximum = FALSE, tol = .Machine$double.eps^0.25)
> {
> if (maximum) {
> val <- .Interna
I was trying to find source for optimize and I ran across
function (f, interval, ..., lower = min(interval), upper = max(interval),
maximum = FALSE, tol = .Machine$double.eps^0.25)
{
if (maximum) {
val <- .Internal(fmin(function(arg) -f(arg, ...), lower,
upper, tol))
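For what it's worth, a small hedged sketch of how one can chase this further from within R (the comment about the C sources describes the layout of the R source distribution):
body(optimize)           # the R-level code, ending in the .Internal(fmin(...)) call
getAnywhere("optimize")  # the same definition plus the namespace it comes from
# The real work behind .Internal(fmin(...)) is done in C; the mapping from
# .Internal names to C functions lives in src/main/names.c of the R sources.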
Dear all,
I have the following code that tries to plot a
simple sine curve into a 2x2 grid on 1 page.
But this code of mine creates the 4 plots on 1 page
each. What's wrong with my approach?
__BEGIN__
library(lattice)
library(grid)
test.plot <- function(x,y) {
pushViewport(viewport(layout.pos.col=x, la
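One hedged alternative, sketched below, is to let print.trellis() place the panels itself via its split/more arguments (the sine data here is made up); if you prefer to keep the pushViewport() route, print(p, newpage = FALSE) stops each print from opening a new page:
library(lattice)
x <- seq(0, 2 * pi, length.out = 100)
p <- xyplot(sin(x) ~ x, type = "l")
print(p, split = c(1, 1, 2, 2), more = TRUE)   # column 1, row 1 of a 2x2 layout
print(p, split = c(2, 1, 2, 2), more = TRUE)
print(p, split = c(1, 2, 2, 2), more = TRUE)
print(p, split = c(2, 2, 2, 2), more = FALSE)  # last one finishes the page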
You were doing it for a small size, where the overhead masked the gains.
Here is a larger case where you can see the differences:
> #matrix
> beta.mat <- matrix(0,nr=500, nc=200)
>
> #list
> gamma.mat <- list(a= matrix(1L, 500, 100), b=matrix(1.0, 500, 100))
>
> object.size(beta.mat)
[1] 800112
Hello,
My program calculates several variables at each iteration and some of them are
integers and the rest are numeric. When I save them into a matrix, all of them
are of numeric type, of course.
I'm trying to find a way to save time/memory of my program and I was thinking
that it might help t
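A small illustration of the point (sizes are approximate and platform dependent):
m.int <- matrix(1L,  nrow = 500, ncol = 200)   # integer storage: 4 bytes per cell
m.num <- matrix(1.0, nrow = 500, ncol = 200)   # double storage:  8 bytes per cell
object.size(m.int)
object.size(m.num)
# A matrix can hold only one type, so mixing integer and numeric columns forces
# everything to double; a data frame (or list) keeps the storage modes separate.
d <- data.frame(i = 1:500, x = rnorm(500))
sapply(d, storage.mode)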
Hi,
I'm involved in a bioinformatics project at my university, and we're writing a
paper comparing some methods for classifying nc-RNA. I've been put in
charge of plotting the ROC curve graphs. But I'm new to working with R and
I'm having some difficulty with the prediction-class.
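A hedged sketch of the usual ROCR workflow, in case that is the prediction class in question; the scores and labels below are made up:
library(ROCR)
scores <- c(0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2)  # classifier outputs
labels <- c(1, 1, 0, 1, 0, 1, 0, 0)                   # true classes
pred <- prediction(scores, labels)
perf <- performance(pred, "tpr", "fpr")
plot(perf)                                # the ROC curve
performance(pred, "auc")@y.values[[1]]    # area under the curve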
On 18 March 2009 at 17:44, Petar Milin wrote:
| I recently switched to the Debian testing OS, and the explanation at
| http://cran.at.r-project.org/bin/linux/debian/
| is a little bit fuzzy. Can anyone give a step-by-step how-to for keeping
| R updated on Debian testing and/or unstable?
Please consider subscr
Hi All,
I've found a few references in the mailing list and documentation to
constrained least squares fitting, but little on restrained least squares.
To clarify what I mean, a constraint might limit a parameter to a particular
value (e.g. x=5.0, or exactly within the bounds 4.9 - 5.1), whereas a
It might be. I use Tinn-R, but always start my RGUI session
separately and not as part of Tinn-R.
On Wed, Mar 18, 2009 at 9:14 PM, Farrel Buchinsky wrote:
> Indeed. I can also get it to work if I type directly in Rterm. But not when
> running from Tinn-R. If this is a Tinn-R problem (relating to
thoeb gmail.com> writes:
> Is it possible to create a master R script containing
> commands to run the sub-scripts one by one?
?source
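A minimal sketch of such a master script (the file names are illustrative):
scripts <- c("load_data.R", "clean_data.R", "analysis.R")
for (s in scripts) source(s, echo = TRUE)   # run the sub-scripts one by one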
You can do something like this using connections: read in a set of
lines and save the results in bigmemory, or in this case a 'save'
image:
zz <- file("ex.data", "w") # open an output file
for (i in 1:1)cat( "1\t2\t3\t4\t5\t6\t7\t8\t9\t10\t\t555\t\t",
file = zz, sep ="\n")
close(zz)
# re
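A hedged sketch of the chunk-by-chunk reading part (the file name and chunk size are illustrative):
con <- file("ex.data", "r")
repeat {
    chunk <- readLines(con, n = 100000)   # next block of lines
    if (length(chunk) == 0) break
    # ... convert the chunk and store it, e.g. in a bigmemory matrix
}
close(con)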
I just compile mine from scratch (Debian ppc, iBook G4). I would be
interested in how to do this too, but the compiling is not all that
difficult. If you need help send me an email.
Stephen Sefick
On Wed, Mar 18, 2009 at 12:44 PM, Petar Milin wrote:
> Hello!
> I recently switched to Debian testin
Indeed. I can also get it to work if I type directly in Rterm. But not when
running from Tinn-R. If this is a Tinn-R problem (relating to how
command lines get from Tinn-R to Rterm) then perhaps I should post it to
the Tinn-R project.
Farrel Buchinsky
Sent from: Pittsburgh Pennsylvania Un
Maggie Wang wrote:
Dear R-users,
I use glm() to do logistic regression and use stepAIC() to do stepwise model
selection.
The AIC value usually comes out at about 100; a good fit is as low as around
70. But for some models the AIC goes to extreme values like 1000. When I
check the p-values, all t
(1) In CRAN - Contributed Packages
***the order is AaBb***
It has a useful list of alphabetical links at the top
(2) In the package index for
R version 2.8.1 (2008-12-22)
i386-apple-darwin8.11.1
locale:
en_US.UTF-8/en_US.UTF-8/C/C/en_US.UTF-8/en_US.UTF-8
***the order is ABab***
It does not
I would like to experiment with testing the fit of a loess model against
the fit from an ordinary linear regression. The 1988 JASA paper by
Cleveland and Devlin *appears* to indicate that this can be done, at
least ``approximately''. They, as I read it, advocate the use of an
ANOVA type test wi
Folks,
The code below reliably crashes an R 2.8.1 session on XP when connected to
an SQL Server 2005 database. The problem arises when appending data using
sqlSave() with rownames=FALSE to a table that had been previously created
with rownames=TRUE. Certainly it's daft to do this as a regular
I wonder about the following examples, which show an inconsistency in how type
conversions are done in R:
x = TRUE
x[2] = as.raw(1)
# Error in x[2] = as.raw(1) :
# incompatible types (from raw to logical) in subassignment type fix
It seems that there is an attempt to coerce the raw value
If you're interested in comparing empirical to simulation
distributions, I see two alternatives to your density() approach
(which will be sensitive to your choice of bandwidth). Both of the
following have been used in my field to look at the fit of empirical
response time data to models of human in
Hello all.
I wish to read a large data set into R. My current issue is getting the
data into a form that R can access. Using read.table won't work
since the data is over 1GB in size (and I am using Windows XP), so my plan
was to read the file chunk by chunk and each time move it into b
Hello again.
I'm trying to use package.skeleton to build my package. However, my
package will contain a Fortran subroutine.
Can you use package.skeleton with that subroutine, or do you
need to add it manually?
Thanks again,
Sincerely,
Edna Bell
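A hedged sketch of how this is usually combined (the function, subroutine and package names below are made up): package.skeleton() lays out the R side, and the Fortran file is then copied into the package's src/ directory by hand.
myfun <- function(x) .Fortran("mysub", x = as.double(x), n = as.integer(length(x)))
package.skeleton(name = "mypkg", list = "myfun")
# afterwards: put mysub.f into mypkg/src/ and edit the generated man/ pages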
You could do it indirectly by saving your command history (a good idea
in any case :-) ), doing {something} to dump it to a file, and
sorting/seeking to see in what order you created items.
Or, if you're really creative ^_^, write some sort of script/function
which automatically creates a me
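A rough sketch of the history route (interactive sessions only; the file name is arbitrary):
savehistory("session.Rhistory")
cmds <- readLines("session.Rhistory")
grep("<-", cmds, value = TRUE)   # assignments, in the order they were typed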
Hi all,
I am having a problem with the function "p.adjust" in stats. I have looked at
the manuals and searched the R site, but didn't get anything that seems
directly relevant. Can anybody throw any light on it or confirm my suspicion
that this might be a bug?
I am trying to use the p.adjust()
Hi again.
I found out how to download the manuals from the website.
Sorry
Bill, Jim and Martin,
Great! The code is much faster and even looks more R-ish. I'm very new to R
and have some difficulty getting rid of my procedural programming habits.
Many thanks to all for the great help.
Olivier.
On Wed, Mar 18, 2009 at 9:38 PM, William Dunlap wrote:
> Olivier,
> You ca
Dear R Gurus:
I have several package building questions, please:
a. Where do you find the list of keywords needed for the help files, please?
b. I'm using a SuSe Linux system at the moment. How do I download
the help manuals, please?
c. What is the command (not in R) to compile Fortran code t
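For (a), the recognised \keyword entries ship with R itself; a hedged way to look at them from inside R is sketched below. For (c), the usual command-line tool is R CMD SHLIB, run outside R.
keyword_file <- file.path(R.home("doc"), "KEYWORDS")
file.exists(keyword_file)
writeLines(readLines(keyword_file))   # the list of allowed \keyword values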
Can you show the list a more specific example of what you are trying to do?
Most of the database packages support write-table commands. So, if you
can represent the data you are trying to write in a dataframe, then
you can probably send it to the database with R.
-Whit
On Wed, Mar 18, 2009 at
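A hedged sketch of that route with DBI and RSQLite (the table and data frame are made up):
library(DBI)
library(RSQLite)
con <- dbConnect(SQLite(), ":memory:")
df  <- data.frame(key = c("smith2008", "jones2009"), year = c(2008L, 2009L))
dbWriteTable(con, "bibentries", df)
dbGetQuery(con, "SELECT * FROM bibentries WHERE year >= 2009")
dbDisconnect(con)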
Thanks very much.
-Original Message-
From: David M Smith [mailto:da...@revolution-computing.com]
Sent: Wednesday, March 18, 2009 12:55 PM
To: Yu, Changhong
Cc: r-help@r-project.org
Subject: Re: [R] numeric equality
On Wed, Mar 18, 2009 at 8:58 AM, Yu, Changhong wrote:
> Dear all,
> I am
Thank you, Dr. Lumley. I have implemented the following code for
the pareto distribution (see below). However, the estimates obtained
from survreg are very small & inaccurate. What I need help with is the
function for the deviance (the code below is wrong). I just don't
understand how to ob
Some of the simpler commands could surely be done using SQL-only queries, but
that's too simple for my project. I need to create some more complex commands
that would only be possible using R. Indeed, if anyone has any suggestions
for this, I'd also appreciate that. Every little bit of information I
Maithili Shiva wrote:
Dear R helpers
I have an R file which estimates the parameters of the 3-parameter
Weibull -
(A) - continuous shape parameter (alpha)
- continuous scale parameter (beta)
- continuous location parameter (gamma)
(B) Also, I have an R file which calculates the
Maithili Shiva wrote:
Dear R helpers
How can I estimate the parameters of the 3-parameter Gamma distribution? I tried the LMOM
package, but it deals with the two-parameter Gamma.
Function pelpe3 in package lmom provides estimation for the Pearson type
III distribution, which is an extended form of the 3-p
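A hedged sketch with the lmom package (the sample below is simulated):
library(lmom)
x   <- rgamma(200, shape = 2, scale = 3)
lmr <- samlmu(x)    # sample L-moments
pelpe3(lmr)         # Pearson type III parameters: mu, sigma, gamma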
Hi,
I use nls to fit Gaussian curves to datasets that are expected to be
Gaussian-shaped:
gauss.fit = nls(y ~ amp*exp(-0.5*(x-x0)^2/theVariance^2) + theNoise,
data = smooth, start = gauss.fit.start)
Some of these datasets are indeed shaped like Gaussians, while others
are not. I would like
On Wed, Mar 18, 2009 at 7:43 PM, Berend Hasselman wrote:
>>> system.time(ans.nl <- nleqslv(x=p0, fn=broydt))[1]
>>
>> user.self
>> 8.17
>
> On my Imac 2.16Ghz and R 2.8.1 and Mac OS X 10.5.6
> this took approximately 5 seconds.
>
> This experiment is interesting.
> I set the Jacobian for a star
On 18-03-2009, at 18:36, Ravi Varadhan wrote:
system.time(ans.nl <- nleqslv(x=p0, fn=broydt))[1]
user.self
8.17
On my Imac 2.16Ghz and R 2.8.1 and Mac OS X 10.5.6
this took approximately 5 seconds.
This experiment is interesting.
I set the Jacobian for a starting point with all x-values
Hi Agus,
try these two:
d<-matrix(rpois(45,3),5,9)
barplot(d,beside=T,col=rainbow(5),names=c("CRTL","LSB","ONEMKR",
"TWOMKR","BLUP","BLUPQ","BLUP1M","BLUP2M","GAS"),las=2)
barplot(d,beside=T,col=rainbow(5),names=c("CRTL","LSB","ONEMKR",
"TWOMKR","BLUP","BLUPQ","BLUP1M","BLUP2M","GAS"),cex.nam
If you make your calls to density with common length and interval
parameters you should be able to get better "registration":
?density
# this example sums the squared differences
x <- rnorm(200,1,1)
x2 <- rnorm(200,1,1)
d1 <- density(x, n=512, from=-1, to= 4)
d2 <- density(x2, n=512, from=-
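The example above is cut off by the archive; written out in full it would look roughly like this (the grid limits -1 and 4 are arbitrary):
x  <- rnorm(200, 1, 1)
x2 <- rnorm(200, 1, 1)
d1 <- density(x,  n = 512, from = -1, to = 4)
d2 <- density(x2, n = 512, from = -1, to = 4)
sum((d1$y - d2$y)^2)   # squared differences on the common grid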
Dear all,
I want to put 9 barplots side by side. My code below prints only 5 of the
9 names I gave.
Problem: how do I print all 9 names? I used cex=0.8 but it did not work;
it gave me an error message.
d<-matrix(rpois(45,3),5,9)
barplot(d,beside=T,col=rainbow(5),names=c("CRTL","LSB","ONEMKR",
DonkeyRhubarb gmail.com> writes:
>
> I'm quite new to R and have limited experience. What I'm trying to do is very
> important, as it is part of my final-year project; or rather, the central
> idea behind it.
>
> I will be creating BibTex files to enter into a mySQL database. I then need
> to per
Hi,
This is my first time posting to the mailing list, so if I'm doing something
wrong, just let me know. I've taken ~1000 samples from 8 biological
replicates, and I want to somehow combine the density functions of the
replicates. Currently, I can plot the density function for each biological
re
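One hedged possibility: evaluate each replicate's density on a common grid and average the curves (the list of replicates below is simulated):
reps <- replicate(8, rnorm(1000), simplify = FALSE)
rng  <- range(unlist(reps))
dens <- lapply(reps, density, n = 512, from = rng[1], to = rng[2])
avg  <- rowMeans(sapply(dens, `[[`, "y"))
plot(dens[[1]]$x, avg, type = "l", xlab = "value", ylab = "average density")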
Hi Paul and Berend,
Here is an example for Broyden's tridiagonal system:
broydt <- function(x, h=2) {
n <- length(x)
f <- rep(NA, n)
f[1] <- ((3 - h*x[1]) * x[1]) - 2*x[2] + 1
tnm1 <- 2:(n-1)
f[tnm1] <- ((3 - h*x[tnm1]) * x[tnm1]) - x[tnm1-1] - 2*x[tnm1+1] + 1
f[n] <- ((3 - h*x[n]) * x[n]) - x[n-
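The function above is cut off by the archive; a hedged reconstruction of its tail, following the usual statement of Broyden's tridiagonal system, plus an illustrative call (the starting point is a guess):
  f[n] <- ((3 - h*x[n]) * x[n]) - x[n-1] + 1
  f
}
p0 <- rep(-1, 500)
library(nleqslv)
nleqslv(x = p0, fn = broydt)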
Try this way. Took less than 1 second for 50,000
> system.time({
+ x <- sample(5) # test data
+ x[sample(5,1)] <- 'asdfasdf' # characters strings
+ which.num <- grep("^[ 0-9]+$", x) # find numbers
+ # convert to leading 0
+ x[which.num] <- sprintf("%018.0f", as.
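The snippet is truncated above; a hedged version of the whole idea (zero-pad the purely numeric entries to 18 characters, upper-case the rest; the sample part numbers are made up):
x <- c("12345", "ab12cd", "987")
which.num <- grep("^[ 0-9]+$", x)                              # purely numeric entries
x[which.num]  <- sprintf("%018.0f", as.numeric(x[which.num]))  # 18-character zero padding
x[-which.num] <- toupper(x[-which.num])                        # uppercase the rest
x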
Dear Gabor,
Thank you very much for this very helpful suggestion. Indeed, I was
making it too complicated by splitting the string into a vector.
Thank you!
/Fredrik
On Wed, Mar 18, 2009 at 3:37 PM, Gabor Grothendieck
wrote:
> Try this:
>
>> L <- c("!H*L", "L%", "%L", "H*L", "L%", "%L", "H*L", "
On Wed, Mar 18, 2009 at 8:58 AM, Yu, Changhong wrote:
> Dear all,
> I am totally confused by the following R output, but don't have a clue
> why.
>
>> a <- 1 - 0.2
>
>> a == 0.8
>
> [1] TRUE
>
>> a <- 1 - 0.8
>
>> a == 0.2
>
> [1] FALSE
This is expected behaviour. The bottom line is that you
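A small sketch of the standard workaround, comparing with a tolerance instead of ==:
a <- 1 - 0.8
a == 0.2                    # FALSE: binary floating point cannot store 0.2 exactly
isTRUE(all.equal(a, 0.2))   # TRUE: equal up to a small numerical tolerance
abs(a - 0.2) < 1e-8         # the same idea spelled out by hand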
This is a FAQ:
http://cran.r-project.org/doc/FAQ/R-FAQ.html#Why-doesn_0027t-R-think-these-numbers-are-equal_003f
and also covered nicely in Burns' R Inferno:
www.burns-stat.com/pages/Tutor/R_inferno.pdf
Sarah
On Wed, Mar 18, 2009 at 11:58 AM, Yu, Changhong wrote:
> Dear all,
>
>
>
> I am totally
Paul Smith wrote:
>
> On Tue, Mar 17, 2009 at 7:55 PM, Berend Hasselman wrote:
>> You can also try my package "nleqslv" for solving systems of non linear
>> equations (using Broyden or Newton with a selection of global
>> strategies).
>>
>> library(nleqslv)
>>
>> xinit <- rep(1,3)
Hello!
I recently switched to the Debian testing OS, and the explanation at
http://cran.at.r-project.org/bin/linux/debian/
is a little bit fuzzy. Can anyone give a step-by-step how-to for keeping
R updated on Debian testing and/or unstable?
Thanks in advance!
PM
Dear all,
I am totally confused by the following R output, but don't have a clue
why.
> a <- 1 - 0.2
> a == 0.8
[1] TRUE
> a <- 1 - 0.8
> a == 0.2
[1] FALSE
> a <- 1 - 0.5
> a == 0.5
[1] TRUE
> a <- 1 - 0.6
> a == 0.4
[1] TRUE
> a <- 1 - 0.9
> a == 0.1
[1] FALSE
My R
On Mon, Mar 16, 2009 at 3:33 PM, Cristián wrote:
> Hi,
>
> I have seen a lot of problems from people trying to compile R with
> MKL.
It's not clear what platform you're on, but if you're just looking for
a precompiled binary of R with the MKL libraries, you might want to
try REvolution R (our dis
Check out this reference:
http://tolstoy.newcastle.edu.au/R/e2/help/07/02/9709.html
On Wed, Mar 18, 2009 at 11:16 AM, Dongyan Song wrote:
>
> Hi Jim,
>
> Thank you very much! I will try to sample them then.
>
> Best,
> Dongyan
>
>
> jholtman wrote:
>>
>> The amount of data that you want to rea
That's a much cleaner solution! It would be nice if biglm took its
defaults from options(digits), but of course we can also just use
print() as you pointed out.
Thanks again for your replies and for making this library available to
the community.
Francisco
--
Francisco J. Zagmutt
Vose Co
Perhaps:
gc()
On Wed, Mar 18, 2009 at 8:26 AM, Abelian wrote:
> Dear all
> While the program is running, we can see that more and more memory is
> being requested.
> Therefore we can run into serious problems, such as running out of
> memory.
> However, even if I rm() some variable
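A tiny sketch of the suggestion (the object name is made up):
big <- matrix(0, 2000, 2000)
rm(big)
gc()   # run the garbage collector and report current memory use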
Hi all,
I'm using R to find duplicates in a set of 6 files containing Part Number
information. Before applying the intersect method to identify the duplicates
I need to normalize the P/Ns: converting the P/N to uppercase if it is
alphanumerical and applying an 18-character zero padding if it is numerical.
Wh
If what you have is a character string, then convert it to POSIXlt and
use difftime:
> x1
[1] "Wed Mar 18 12:18:11 2009"
> x2
[1] "Wed Mar 18 12:18:18 2009"
> x1.c <- strptime(x1, "%a %b %d %H:%M:%S %Y")
> x2.c <- strptime(x2, "%a %b %d %H:%M:%S %Y")
> difftime(x2.c, x1.c, units='secs')
Time diffe
Could it be due to the sample size? I seem to get this error when I try
to model smaller sample sizes.
If it is, is there a way to deal with that?
*Andreia*
-- Forwarded message --
From: Andreia Mendonça
Date: 2009/3/17
Subject: [R] initial value in 'vmmin' is not finite
To
Dear R helpers
I have an R file which estimates the parameters of the 3-parameter
Weibull -
(A) - continuous shape parameter (alpha)
- continuous scale parameter (beta)
- continuous location parameter (gamma)
(B) Also, I have an R file which calculates the parameters of Generalized Pareto
of Generaliz
Dear R helpers
How can I estimate the parameters of the 3-parameter Gamma distribution? I tried the LMOM
package, but it deals with the two-parameter Gamma.
Please guide
Regards
Maithili
On 3/18/2009 11:16 AM, David Haykazyan wrote:
Hi,
Our company is interested in using R in our project. We want to have an
optional module in our product which calls R functions (using its API).
However we do not distribute R and the users who wish to use that module
have to install and confi
Dear all,
I have two variables whose contents are the output of the date() function, like:
d1<-date()
d2<-date()
Now I need to compute the difference between d2 and d1 (in minutes).
Any help is welcome.
Best wishes
miltinho
brazil
Hi Maithili,
Some discussion was generated around this issue a few years ago. I have no
experience doing this, so my help might not be much help. Searching the
files I came up with this R post:
http://markmail.org/search/?q=fit+frechet+distribution
Also, a non-related website presents an imple
Try this:
> L <- c("!H*L", "L%", "%L", "H*L", "L%", "%L", "H*L", "H*L", "L%",
+ "%L", "H*L", "!H*L", "L%", "%L", "%L", "%L", "%L", "H*L", "!H*L",
+ "L%", "H*", "%L", "H*L", "L_%", "%L", "%H", "%H", "!H*L", "%H",
+ "H*", "%H", "%H", "H%", "H*", "!H*L", "H*L", "!H*L", "H*", "H*L",
+ "L_%", "%L", "%L
Well, it would have helped if you'd given us an example of what
exactly you did, but here's a working solution.
> Orthodont <- data.frame(age = 1:6, sex = factor(c("m", "f", "m", "f", "m",
> "f")))
> str(Orthodont)
'data.frame': 6 obs. of 2 variables:
$ age: int 1 2 3 4 5 6
$ sex: Factor w/
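The rest of the working solution presumably goes along these lines (a hedged sketch; for the real nlme data set the same idea applies after library(nlme); data(Orthodont)):
Orthodont$age <- factor(Orthodont$age)
str(Orthodont)   # age is now a factor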
Dear Jorge,
Thanks a lot for the great help. That will solve my problem of random
number generation etc. However, my main problem is HOW TO ESTIMATE THE
PARAMETERS of the FRECHET distribution from a given sample of data.
That will be a great help for me.
Thanks a lot again
Regards
Maithili
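A hedged sketch of one way to do this: maximum likelihood via optim() with the Frechet density from the evd package (the start values and the simulated sample are illustrative):
library(evd)
x <- rfrechet(500, loc = 2, scale = 3, shape = 4)   # pretend this is the data
negll <- function(p) {
    if (p[2] <= 0 || p[3] <= 0 || p[1] >= min(x)) return(Inf)   # keep parameters valid
    -sum(log(dfrechet(x, loc = p[1], scale = p[2], shape = p[3])))
}
fit <- optim(c(loc = min(x) - 1, scale = 1, shape = 1), negll)
fit$par                                                                  # estimated location, scale, shape
rfrechet(10, loc = fit$par[1], scale = fit$par[2], shape = fit$par[3])   # random draws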
Hi all,
I'm quite new to R and have limited experience. What I'm trying to do is very
important, as it is part of my final-year project; or rather, the central
idea behind it.
I will be creating BibTex files to enter into a mySQL database. I then need
to perform operations on this DB like 'return
Dear Maithili,
Take a look at this:
http://bm2.genes.nig.ac.jp/RGM2/R_current/library/evd/man/frechet.html
HTH,
Jorge
On Wed, Mar 18, 2009 at 11:14 AM, Maithili Shiva
wrote:
>
> Dear R Helpers
>
> Which package is available for estimating the parameters of the three-parameter
> FRECHET distributi
Hi all
I'm in desperate need of help. I'm working with a grouped data object, called
Orthodont, in the nlme package in R, and am trying to fit various models
(learning methods for my thesis), but one of the variables in the object
(age) is numeric and I need it to be a factor. I've tried:
as.
Hi Jim,
Thank you very much! I will try to sample them then.
Best,
Dongyan
jholtman wrote:
>
> The amount of data that you want to read in (136M numbers) will
> require about 1GB of memory (8 bytes per number for floating point -
> truncation does not reduce this number of bytes). So if you
Dear R-helpers,
I am writing to the list in order to inquire whether there exists any
R package or program that will help me "describe" clusters.
The situation is as follows:
1) I create some clusters (say, with any clustering method in R).
2) I want to "describe" and assign some kind of "label
Hi,
Our company is interested in using R in our project. We want to have an
optional module in our product which calls R functions (using its API).
However we do not distribute R and the users who wish to use that module
have to install and configure R themselves. The module that contains calls
As far as I know setwd() does not influence where temporay files (TMPDIR,
TMP, TEMP) are located - at least not for me on windows vista.
As mentioned before setting environment variable should work.
Also I carry R on a USB, so I don't want temporary files being written to
the PC / laptop where I c
Dear R Helpers
Which package is available for estimating the parameters of the three-parameter
Frechet distribution? Also, how do I generate random numbers for the Frechet
using these three estimated parameters?
Thanking in advance
Maithili
Dear all
While the program is running, we can see that more and more memory is
being requested.
Therefore we can run into serious problems, such as running out of
memory.
However, even if I rm() some variables that I will not use anymore,
there is still not enough memory.
By the way,
Dear list,
This seems like a very simple problem, but I am failing to lose a
dimension (I think).
I have list, like this:
...
[2072] "!H*L" "L%" "%L" "H*L" "L%" "%L" "H*L" "H*L" "L%"
"%L" "H*L" "!H*L" "L%" "%L" "%L" "%L" "%L" "H*L" "!H*L"
[2091] "L%" "H*" "%L" "H*L
Dear John,
Thanks for the reply. I will look for another way to compute adjusted
effect.
Frederic.
John Fox a écrit :
Dear Frederic,
The effect function won't handle a model with a nested effect.
Sorry,
John
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help
Hello, I am working on a project consisting of several R scripts written in
the Tinn editor. Is it possible to create a master R script containing
commands to run the sub-scripts one by one?
-
Tamara Hoebinger
University of Vienna
The amount of data that you want to read in (136M numbers) will
require about 1GB of memory (8 bytes per number for floating point -
truncation does not reduce this number of bytes). So if you want to
read it all in, then find a 64-bit version of R and probably at least
4GB of memory for your proc
Hello,
Is there a way to cause Sweave or .Rnw files to create boxes around the
code segments in a sweave produced latex document? I have tried
variations of the \fbox and \makebox commands in the .Rnw file without
luck.
Thanks,
Matt
Dear R users,
I am trying to minimize the distance between my data points and a theoretical
gamma distribution over the shape and scale parameters. The function "mde" from the
actuar package does this for an empirical distribution function and a theoretical
gamma distribution. However, I would like to minimize the
Dear Frederic,
The effect function won't handle a model with a nested effect.
Sorry,
John
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
On
> Behalf Of Frédéric Hérault
> Sent: March-18-09 9:57 AM
> To: r-help@r-project.org
> Subject: [R]
Joris,
Ridge regression is a type of regularized estimation approach. The objective
function for least squares, (Y - Xb)^t (Y - Xb), is modified by adding a
quadratic penalty, k b^t b. Because of this, the log-likelihood value (sum of
squared residuals), for a fixed k, does not have much meanin
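A small sketch of the estimate itself: for a fixed k the ridge coefficients solve (X'X + kI) b = X'y (the data and k below are made up; in practice MASS::lm.ridge does this for you):
set.seed(1)
X <- scale(matrix(rnorm(100 * 3), 100, 3))
y <- X %*% c(1, 2, 0) + rnorm(100)
k <- 0.5
b.ridge <- solve(crossprod(X) + k * diag(ncol(X)), crossprod(X, y))
b.ols   <- solve(crossprod(X), crossprod(X, y))
cbind(ridge = as.vector(b.ridge), ols = as.vector(b.ols))   # ridge estimates are shrunk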
On Tue, Mar 17, 2009 at 7:55 PM, Berend Hasselman wrote:
> You can also try my package "nleqslv" for solving systems of non linear
> equations (using Broyden or Newton with a selection of global strategies).
>
> library(nleqslv)
>
> xinit <- rep(1,3) # or rep(0,3) for a singular star
Dear R helpers
I have r file which estimate the parameters of 3 parameter Weibull -
(A) - continuous shape parameter (alpha)
- continuous scale parameter (beta)
- continuous location parameter (gamma)
(B) Also, I have a r file which calculates the parameters of Generalized Pareto
Torsten and whoever else knows about multcomp:
I am using confint in multcomp to prepare a largish report with Sweave. I noted
that confint uses up a lot of time, about 10 seconds per run, which adds up when
several confints are calculated in draft mode.
I am using cached dummy results to get aro
I am out of the office from Mon 03/16/2009 until Fri 03/20/2009.
Note: This is an automated response to your message "R-help Digest, Vol
73, Issue 18" sent on 18.3.09 7:00:06.
This is the only notification you will receive while this person is away.
Just a side remark: "any" clustering method in R is pretty broad. Actually,
k-means clustering is a completely different animal from distance-based
neighbour-joining methods. You should be a bit more specific about your
research
I wouldn't worry too much about the scientific practice, as cluster
http://addictedtor.free.fr/graphiques/RGraphGallery.php?graph=130
http://bm2.genes.nig.ac.jp/RGM2/index.php?ctv=Cluster
#returns 14 pages of examples
http://bm2.genes.nig.ac.jp/RGM2/index.php?query=varclus&ctv=Cluster
David Winsemius
On Mar 18, 2009, at 9:43 AM, Carlos J. Gil Bellosta w
Hi,
Thank you for your concern!
The file has 136,047,472 lines, with one value in each line, and is 1.7G in
size. I run in a Linux (OpenSuse OS) with 4G memory in total. The error
message is Error: cannot allocate vector of size 2.0 Gb. And the worst thing
is even if I read all the data into R
Dear R helpers,
I have the following model
model_1<-glm(y~A+B+C+E+A:D,contrasts=list(D=contrasts_D),data=mydata,na.action=na.omit)
with: options(contrasts=c("contr.sum", "contr.poly"))
A, B and E are 2-level factors,
C is a covariate,
D is a 20-level factor with 10 in relation with the first leve
Hi,
Be sure to take a look at the levelplot() function from the lattice
package. From the documentation of levelplot:
library(lattice)
x <- seq(pi/4, 5 * pi, length.out = 100)
y <- seq(pi/4, 5 * pi, length.out = 100)
r <- as.vector(sqrt(outer(x^2, y^2, "+")))
grid <- expand.grid(x=x, y=y)
grid
Hi Cristián.
>> However, it is a little different (the -lgomp and configure line).
>>
>> MKL=" -L${MKL_LIB_PATH} \
>>      -Wl,--start-group \
>>      ${MKL_LIB_PATH}/libmkl_gf_lp64.a \
>>      ${MKL_L
Josh wrote:
Hi,
I have three related variables (vectors) and would like to see their
distribution on a 2D plot of the first two variables, with colors proportional
to the values of the third variable. I could have done so by passing the 3rd
variable to the color palette, but this would disrupt the relati
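A hedged sketch of the colour-ramp route (the variables x, y, z are made up):
x <- rnorm(100); y <- rnorm(100); z <- x + y + rnorm(100, sd = 0.3)
pal  <- colorRampPalette(c("blue", "red"))(100)
cols <- pal[cut(z, breaks = 100, labels = FALSE)]   # map z onto the ramp
plot(x, y, col = cols, pch = 16)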
--- begin included message -
I would like to design a study (a two-group comparison) based on a reduction in
events (say hospital admissions). In a previous study a hospital admission rate of
140 admissions per 72 patients (over a 4-month period) was observed. That
is, the rate is about 1.9. In o
Dear all
I would appreciate it if someone could help me to install SciViews.
1) I am running Windows XP Service Pack 3 with 1Gb of memory and 120GB
HDrive.
2) R is installed and runs fine
RCommander is installed and seems to run reasonably well. Some issues where
it 'bombs' out for no reason.
Peter Dalgaard wrote:
> KunW wrote:
>
>> Dear people,
>>
>> I was using the plot function, and kept getting messages such as "Error in
>> axis(1, 1:4, LETTERS[1:4]) : too few arguments".
>> I used the examples given in the 'axis' help, and got the same error message.
>> Can any of you please tell me
Gundala Viswanath gmail.com> writes:
> ...In which I want to plot the cumulative value of dat1 with respect
> to the x-axis, and also plot it together with dat2.
>
>
> #x-axis dat1 dat2
> -10 0.0140149 0.0140146
> -9 0.00890835 0.00891768
> -8 0.00672276 0.00672
Dear all
I would appreciate it if someone could help me to install SciViews.
1) I am running Windows XP Service Pack 3 with 1Gb of memory and 120GB HDrive.
2) R is installed and runs fine
RCommander is installed and seems to run reasonably well. Some issues where it
'bombs' out for no reason.
3
Dear all,
I have data that looks like this,
in which I want to plot the cumulative value of dat1 with respect
to the x-axis, and also plot it together with dat2.
What's the common way to do this in R?
I looked at ECDF from Hmisc; it doesn't seem to do what I want.
In particular it doesn't allow us to give x
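A minimal sketch with base graphics (the numbers are made up; cumsum() does the accumulating):
x    <- -10:-7
dat1 <- c(0.014, 0.009, 0.007, 0.004)
dat2 <- c(0.014, 0.009, 0.007, 0.004)
plot(x, cumsum(dat1), type = "b", ylim = range(c(cumsum(dat1), dat2)),
     xlab = "x", ylab = "value")
lines(x, dat2, type = "b", col = "red")   # dat2 overlaid as-is
legend("topleft", c("cumulative dat1", "dat2"), col = c("black", "red"), lty = 1)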