[Rd] still confused about vsnprintf, log1p, expm1

2005-07-06 Thread Charles Geyer
I posted a question quite a while back about vsnprintf and Brian Ripley
replied that R always makes sure it's available, so I needn't worry.
A look at the source code shows the same for log1p and expm1.  But I am
still confused.  Right now my code compiles and runs fine when using
the approved CFLAGS (no special gcc stuff), but when compiled
with -ansi -pedantic it can't find these functions (because R itself
was compiled with the default CFLAGS).

That's no problem, except the last time I submitted a package
it was kicked back because it didn't compile with -ansi -pedantic (which
Writing R Extensions says it should).  So what do I do?  Or will there
be no problem because KH tests with a version of R compiled
with -ansi -pedantic, so that R, not the C libraries, is providing these functions?

Pardon me if this is solved in 2.1.1 (I haven't looked there yet).
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] valgrind complains about regex.c (PR#8043)

2005-08-02 Thread Charles Geyer
> >>==22324==  Address 0x1C1E23A8 is 0 bytes after a block of size 32 alloc'd
> >>==22324==    at 0x1B90650D: calloc (in /usr/lib/valgrind/vgpreload_memcheck.so)
> >>==22324==    by 0x81247B5: parse_expression (regex.c:5406)
> >>==22324==    by 0x8125868: parse_branch (regex.c:4475)
> >>==22324==    by 0x8125913: parse_reg_exp (regex.c:4420)
> >>==22324==
> >>==22324== ERROR SUMMARY: 2 errors from 2 contexts (suppressed: 39 from 2)
> >>==22324== malloc/free: in use at exit: 12691882 bytes in 6426 blocks.
> >>==22324== malloc/free: 32534 allocs, 26108 frees, 33105500 bytes allocated.
> >>==22324== For a detailed leak analysis, rerun with: --leak-check=yes
> >>==22324== For counts of detected errors, rerun with: -v
> >>linux$ exit
> >>
> >>Script done on Mon 01 Aug 2005 02:10:42 PM PDT
> >>-- end of script --
> >>-- 
> >>Charles Geyer
> >>Professor, School of Statistics
> >>University of Minnesota
> >>[EMAIL PROTECTED]
> >>
> >>__
> >>R-devel@r-project.org mailing list
> >>https://stat.ethz.ch/mailman/listinfo/r-devel
> >>
> >>
> >
> >

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] valgrind complains about regex.c (PR#8043)

2005-08-02 Thread Charles Geyer
On Tue, Aug 02, 2005 at 07:13:54AM +0100, Prof Brian Ripley wrote:
> Did you compile R without optimization?  Such reads are often the result 
> of read-aheads produced by the optimizer in an attempt to keep 
> pipelines full (and are harmless).

linux$ R CMD config CFLAGS
-g -O2

so no.  I just took the defaults.  I'll try that too along with R-patched.

> 
> On an un-optimized build of R I am unable to reproduce this.  (I also was 
> unable to reproduce it on my optimized build of 2.1.1 using gcc 3.4.4, 
> although I did get a report on other read-aheads.)
> 
> On Mon, 1 Aug 2005 [EMAIL PROTECTED] wrote:
> 
> >I think I am using objects according to the man page.
> >This seems to be a valid regular expression.  But whether
> >I know what I'm doing or no, it still shouldn't be doing
> >what valgrind seems to be saying it's doing.  (IMHO)
> 
> I think you need to take that up with compiler designers: it is common 
> practice.
> 
> >-- start of script --
> >Script started on Mon 01 Aug 2005 02:09:00 PM PDT
> >linux$ printenv VALGRIND_OPTS
> >--tool=memcheck
> >linux$ cat bar.R
> >
> >foo <- 1
> >bar <- 2:3
> >baz <- 4:6
> >qux <- matrix(7:10, 2)
> >
> >ls()
> >rm(list = objects(pattern = "^[a-pr-z]"))
> >ls()
> >
> >linux$ R --version
> >R 2.1.1 (2005-06-20).
> >Copyright (C) 2005 R Development Core Team
> >
> >R is free software and comes with ABSOLUTELY NO WARRANTY.
> >You are welcome to redistribute it under the terms of the GNU
> >General Public License.  For more information about these matters,
> >see http://www.gnu.org/copyleft/gpl.html.
> >linux$ gcc --version
> >gcc (GCC) 3.3.5 20050117 (prerelease) (SUSE Linux)
> >Copyright (C) 2003 Free Software Foundation, Inc.
> >This is free software; see the source for copying conditions.  There is NO
> >warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
> >
> >linux$ cat /etc/SuSE-release
> >SuSE Linux 9.3 (i586)
> >VERSION = 9.3
> >linux$ R --vanilla --debugger=valgrind < bar.R >| bar.Rout
> >==22324== Memcheck, a memory error detector for x86-linux.
> >==22324== Copyright (C) 2002-2004, and GNU GPL'd, by Julian Seward et al.
> >==22324== Using valgrind-2.2.0, a program supervision framework for x86-linux.
> >==22324== Copyright (C) 2000-2004, and GNU GPL'd, by Julian Seward et al.
> >==22324== For more details, rerun with: -v
> >==22324==
> >==22324== Invalid read of size 4
> >==22324==    at 0x81255AD: parse_expression (regex.c:5045)
> >==22324==    by 0x8125868: parse_branch (regex.c:4475)
> >==22324==    by 0x8125913: parse_reg_exp (regex.c:4420)
> >==22324==    by 0x81261B3: Rf_regcomp (regex.c:4384)
> >==22324==  Address 0x1C1E23A8 is 0 bytes after a block of size 32 alloc'd
> >==22324==    at 0x1B90650D: calloc (in /usr/lib/valgrind/vgpreload_memcheck.so)
> >==22324==    by 0x81247B5: parse_expression (regex.c:5406)
> >==22324==    by 0x8125868: parse_branch (regex.c:4475)
> >==22324==    by 0x8125913: parse_reg_exp (regex.c:4420)
> >==22324==
> >==22324== Invalid write of size 4
> >==22324==    at 0x81255B2: parse_expression (regex.c:5045)
> >==22324==    by 0x8125868: parse_branch (regex.c:4475)
> >==22324==    by 0x8125913: parse_reg_exp (regex.c:4420)
> >==22324==    by 0x81261B3: Rf_regcomp (regex.c:4384)
> >==22324==  Address 0x1C1E23A8 is 0 bytes after a block of size 32 alloc'd
> >==22324==    at 0x1B90650D: calloc (in /usr/lib/valgrind/vgpreload_memcheck.so)
> >==22324==    by 0x81247B5: parse_expression (regex.c:5406)
> >==22324==    by 0x8125868: parse_branch (regex.c:4475)
> >==22324==    by 0x8125913: parse_reg_exp (regex.c:4420)
> >==22324==
> >==22324== ERROR SUMMARY: 2 errors from 2 contexts (suppressed: 39 from 2)
> >==22324== malloc/free: in use at exit: 12691882 bytes in 6426 blocks.
> >==22324== malloc/free: 32534 allocs, 26108 frees, 33105500 bytes allocated.
> >==22324== For a detailed leak analysis, rerun with: --leak-check=yes

Re: [Rd] valgrind complains about regex.c (PR#8043)

2005-08-03 Thread Charles Geyer
On Tue, Aug 02, 2005 at 07:50:54AM -0400, Duncan Murdoch wrote:
> Prof Brian Ripley wrote:
> >Did you compile R without optimization?  Such reads are often the result 
> >of read-aheads produced by the optimizer in an attempt to keep 
> >pipelines full (and are harmless).
> 
> There were both a read and a write.  I can see the read being harmless, 
> but is the write harmless?  I suspect this may be the bug I fixed on 
> July 16, since it had to do with character classes including ranges 
> (like Charlie's "[a-pr-z]").
> 
> Charlie, have you tried a recent version of R-patched?

Now I have.  The computer is my laptop and not connected to the net
so I can't upload details, but R-1.2.1-patched as of yesterday does
NOT exhibit the same bug.  So I guess you are right.  Sorry for wasting
your time.  I'm still a newbie at this R stuff.  Next time I'll check
against R-*-patched before submitting a bug report.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] valgrind complains about regex.c (PR#8043)

2005-08-04 Thread Charles Geyer
On Wed, Aug 03, 2005 at 10:25:02PM +0200, Peter Dalgaard wrote:
> [EMAIL PROTECTED] writes:
> 
> > On Tue, Aug 02, 2005 at 07:50:54AM -0400, Duncan Murdoch wrote:
> 
> > > Charlie, have you tried a recent version of R-patched?
> > 
> > Now I have.  The computer is my laptop and not connected to the net
> > so I can't upload details, but R-1.2.1-patched as of yesterday does
>  ###
> 
> This email has been a LONG time underway? ;-)

No.  I have "numerical dyslexia".  I just typed it rather than cut and
paste (I was just using someone else's account to send mail when I couldn't
connect my laptop).  Makes it hell teaching intro stats.  I have to have
all numerical examples worked in advance.  I tell people I went into math
so I could work with letters instead of numbers.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] bizarre signif stars in Sweave latex

2005-08-21 Thread Charles Geyer
OK.  I give up.  I'll ask a stupid question.
How do I get the signif stars line printed by summaries
to not look extremely bizarre in the latex produced by Sweave?
For example, see p. 7 of

http://www.stat.umn.edu/geyer/aster/library/aster/doc/tutor.pdf

I can see what the problem is.  R emits non-ascii characters (as it
is supposed to do), Sweave puts them in the tex file, and latex can't
handle them.  But I don't see the solution.

Hmmm.  Well I just discovered a kludge

<<>>=
Sys.setlocale(category = "LC_ALL", locale = "C")
@

at the beginning of the Rnw file.  But is that TRT (the Right Thing)?

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] bizarre signif stars in Sweave latex

2005-08-22 Thread Charles Geyer
On Mon, Aug 22, 2005 at 07:42:21AM +0100, Prof Brian Ripley wrote:
> What locale is this?
> 
> My guess is that this is a UTF-8 locale.

Yes.

> Sys.getlocale()
[1] 
"LC_CTYPE=en_US.UTF-8;LC_NUMERIC=C;LC_TIME=en_US.UTF-8;LC_COLLATE=en_US.UTF-8;LC_MONETARY=en_US.UTF-8;LC_MESSAGES=en_US.UTF-8;LC_PAPER=C;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=C;LC_IDENTIFICATION=C"

> If so, you need to tell latex 
> the input is in UTF-8, which you can do in the current LaTeX release
> (you need 2003/12/01).  As I recall you do this by
> 
> \usepackage[utf8]{inputenc}

Right.  This works.

Silly me.  I feel like the drunk looking for his keys under the lamppost.
Here I was looking for a solution in the R docs, when I should have been
looking in the LaTeX Companion.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] lattice and for loop

2005-09-02 Thread Charles Geyer
- Forwarded message from Sandy Weisberg <[EMAIL PROTECTED]> -

OK, here is my R bug:

library(lattice)
x <- rnorm(20)
y <- rnorm(20)
z <- rep(c(1, 2), 10)
xyplot(y~x|z)
# the above works fine.  Now try this:

for (j in 1:1) {xyplot(y~x|z)}

# no graph is produced.

-- 
Sanford Weisberg
University of Minnesota, School of Statistics
312 Ford Hall, Minneapolis, MN  55455
612-625-8355, FAX 612-624-8868
St. Paul office:  146 Classroom-Office Building, 612-625-8777
[EMAIL PROTECTED]

- End forwarded message -

Sandy originally found this in

R 2.1.1 (for Windows).

I have tried this in

R 2.1.1 Patched (2005-08-04).
R 2.1.1 Patched (2005-09-02).
R 2.2.0 Under development (unstable) (2005-09-01).

all on

Linux 2.6.11.4-21.7-smp i686 athlon i386 GNU/Linux
Suse 9.3
gcc (GCC) 3.3.5 20050117 (prerelease) (SUSE Linux)

So our question is: is this a bug or a feature?
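
For what it's worth, here is a sketch of the usual explanation and workaround
(assuming this is the auto-printing behavior described in the R FAQ rather
than a genuine bug): xyplot() returns a "trellis" object, and the plot is
drawn only when that object is printed.  Auto-printing happens at top level
but not inside a for loop, so an explicit print() is needed there:

    library(lattice)
    x <- rnorm(20)
    y <- rnorm(20)
    z <- rep(c(1, 2), 10)
    for (j in 1:1) {
        print(xyplot(y ~ x | z))   # explicit print() draws the plot inside the loop
    }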

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] looks in liblapack.a not liblapack.so

2005-09-17 Thread Charles Geyer
I can't compile R-alpha on AMD 64.  Rather than include a 1400 line script
I have put it on the web

http://www.stat.umn.edu/~charlie/typescript.txt

way down near the bottom it fails building lapack.so

gcc -shared -L/usr/local/lib64 -o lapack.so Lapack.lo -llapack -lblas -lg2c -lm -lgcc_s

/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../x86_64-suse-linux/bin/ld:
 
/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../lib64/liblapack.a(dgecon.i):
 relocation R_X86_64_32 against `a local symbol' can not be used when making a 
shared object; recompile with -fPIC
/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../lib64/liblapack.a: 
could not read symbols: Bad value

The 'recompile with -fPIC' is bullsh*t.  The problem is that it is looking
in /usr/lib64/liblapack.a rather than /usr/lib64/liblapack.so.3, both of which
exist.  Some searching for this error message on Google shows a lot of
questions about this problem but no solution that I found other than

rm /usr/lib64/liblapack.a

which I don't consider a solution.  It will link with the .so as the bottom
of the script shows

snowbank$ cd src/modules/lapack
snowbank$ gcc -shared -o lapack.so Lapack.lo -llapack -lblas -lg2c -lm 
-lgcc_s

/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../x86_64-suse-linux/bin/ld:
 
/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../lib64/liblapack.a(dgecon.i):
 relocation R_X86_64_32 against `a local symbol' can not be used when making a 
shared object; recompile with -fPIC
/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../lib64/liblapack.a: 
could not read symbols: Bad value
collect2: ld returned 1 exit status
snowbank$ gcc -shared -o lapack.so Lapack.lo /usr/lib64/liblapack.so.3 
-lblas -l g2c -lm -lgcc_s

No problems with the second link.

So what do I do?  liblapack.so is there.  I've linked other (non-R) programs
to it.  So it SHOULD work with R.

Either I can't read (possible) or the solution to this isn't in the gcc info
pages.

System (more info in typescript).

   AMD 64
   SuSE linux 9.3
   GCC 3.3.5

I also observed the same problem with R-2.1.1 but didn't get around to
debugging it until today.

It occurred to me that the culprit might be /usr/local/lib/liblapack.so.3,
which is 32 bit (right now we are running only one R on both 32 and 64 bit,
and that's where the 32-bit R finds its shared libraries), but I don't
think that's the problem.  Well, maybe it is.  How do I tell configure
NOT to add /usr/local?
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] looks in liblapack.a not liblapack.so

2005-09-20 Thread Charles Geyer
On Mon, Sep 19, 2005 at 10:44:00AM +0200, Martyn Plummer wrote:
> On Sat, 2005-09-17 at 17:19 -0500, Charles Geyer wrote:
> > I can't compile R-alpha on AMD 64 ...
> 
> You would need to modify the LDFLAGS and CPPFLAGS environment variables,
> as these default to -L/usr/local/lib and -I/usr/local/include
> respectively.  See Appendix B.3.3 of the R Installation and
> Administration manual, which gives a warning about 64-bit systems.

That does not help.  The problem has (apparently) nothing to do
with /usr/local (and the 32 bit compatibility libraries we have there).

> You can also use the --with-readline configure flag to specify the exact
> location of the readline library you wish to use.

That's it.  I need

./configure --prefix=/APPS/Foo/Alpha64 --with-lapack=/usr/lib64/liblapack.so.3

> I hope this helps.

Yes it does.  Everything seems to work except the rpvm and rcdd contributed
packages did not install.  Looking at the problem with rcdd, I see what the
main problem was all along.  On 32 bit you can extract a .o out of a .a to
put in a .so.  On 64 bit, you can't.  It's pickier apparently.  The makefile
for cddlib doesn't make shared libraries, so I'm out of luck for rcdd on AMD64
until I get that fixed.

Now this problem makes a lot more sense.

Sorry to be so stupid.  I knew you could do --with-lapack=something but forgot
(meaning I have a vague recollection of reading about this once, now that I'm
reminded of it).

Anyway we now have R-2.2.0 alpha on AMD64 on SuSE 9.3 with

> dim(installed.packages())
[1] 80 10

Thanks for the help.

I still don't understand why gcc -shared even bothers to look in *.a
(on AMD64) when it won't do the slightest bit of good.  Maybe I'm still
ignorant of some important technical issue (maybe? more like with very
high probability!)

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] looks in liblapack.a not liblapack.so

2005-09-22 Thread Charles Geyer
On Tue, Sep 20, 2005 at 09:43:51PM -0500, Luke Tierney wrote:
> On Tue, 20 Sep 2005, Charles Geyer wrote:
> >
> >I still don't understand why gcc -shared even bothers to look in *.a
> >(on AMD64) when it won't do the slightest bit of good.  Maybe I'm still
> >ignorant of some important technical issue (maybe? more like with very
> >high probability!)
> >
> 
> The issue is not the library but whether the code is compiled as
> position-independent code (PIC) or not.  Many .a libraries are built
> as PIC and they can be used to create shared objects, you just get
> copies of the modules you use linked in.  PIC code can be slower,
> which is why some prefer to build .a libraries as non-PIC.

Oh.  Thanks.  That makes it all clear.  If I compile cddlib with

export CXX=gcc
export CFLAGS="-O -fPIC"
./configure --prefix=/APPS/64/

then I can build rcdd and it works!  (And, you're right, the fact that
cddlib builds libcddgmp.a instead of libcddgmp.so is irrelevant, it's
the -fPIC that matters.)

> I'm not sure why one rarely runs into non-PIC issues on i386--it may
> be that gcc at least is always producing PIC code there.  It does come
> up on other architectures though, in particular on x86_64.  It seems
> that most Linux distros that provide pvm only provide .a libraries,
> but some build these with PIC some don't.  Red Hat Enterprise WS4
> seems to be non-PIC, FC3 and FC4 seem to be PIC.  If your distro is
> non-PIC you will need to build your own PIC version of pvm and tell
> rpvm where to find it.

snowbank$ locate pvm | grep -E '\.so|\.a$'
/usr/lib/pvm3/lib/LINUX64/libfpvm3.a
/usr/lib/pvm3/lib/LINUX64/libgpvm3.a
/usr/lib/pvm3/lib/LINUX64/libpvm3.a
/usr/lib/pvm3/lib/LINUX64/libpvmtrc.a
/usr/lib64/libpvm3.so
/usr/lib64/libpvm3.so.3
/usr/lib64/libpvm3.so.3.4

I think I've got the libraries, so

  > install.packages("rpvm", repos = "http://www.biometrics.mtu.edu/CRAN/")
  trying URL 'http://www.biometrics.mtu.edu/CRAN/src/contrib/rpvm_0.6-5.tar.gz'

  [lots of blather deleted]

  gcc -shared -L/usr/local/lib64 -o rpvm.so rpvm_core.o rpvm_ser.o utils.o 
-L/usr/lib/pvm3/lib/LINUX64 -lpvm3 -lgpvm3 -lreadline -lncurses
  
/usr/lib64/gcc-lib/x86_64-suse-linux/3.3.5/../../../../x86_64-suse-linux/bin/ld:
 /usr/lib/pvm3/lib/LINUX64/libpvm3.a(lpvmgen.o): relocation R_X86_64_32 against 
`a local symbol' can not be used when making a shared object; recompile with 
-fPIC

same problem, I have a .so (presumably PIC), but it's picked another library.
Reading the help for install.packages, I don't find anything about how to
make it link against /usr/lib64/libpvm3.so instead
of /usr/lib/pvm3/lib/LINUX64/libpvm3.a so I guess that means do it by hand.

I'm a little puzzled by that too.  Apparently the configure in rpvm wants
to use PVM_ROOT which for this (SuSE 9.3 AMD64) box is /usr/lib/pvm3 (which
is the default) to find the libraries it wants to link to, but that won't
work.  The appropriate library is /usr/lib64/libpvm3.so -- maybe.
I just noticed the -lpvm3 -lgpvm3 in the link that failed.  I'm not
sure /usr/lib64/libpvm3.so contains everything rpvm needs.

This just isn't going to work with the SuSE provided pvm stuff right?

I untarred the rpvm package and did R CMD check on it and it really
doesn't give any way to link to a library in an odd place -- at least
not that I can see.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Bug in wilcox.test

2006-01-29 Thread Charles Geyer
There is a fairly new bug in wilcox.test in R-2.2.1 (stable).
It wasn't there when I last taught nonparametrics in fall 2003.

Line 86 of wilcox.test.R

achieved.alpha<-2*psignrank(trunc(qu),n)

It should be

achieved.alpha<-2*psignrank(trunc(qu)-1,n)

If you don't see why, decode the cookbook instructions p. 56 in
Hollander and Wolfe (2nd ed.) or see

http://www.stat.umn.edu/geyer/5601/examp/signrank.html#conf

or just do a sanity check: does this do the right thing when the confidence
interval is the range of the data, case qu = 1?  No.
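
For concreteness, here is a sanity check that can be run directly (assuming,
as in Hollander and Wolfe, that the interval between the qu-th smallest and
qu-th largest Walsh averages has error probability 2 * P(W <= qu - 1)):

    n <- 10
    ## qu = 1: the interval is the range of the Walsh averages, so the
    ## achieved alpha should be 2 * P(W <= 0) = 2 / 2^n
    2 * psignrank(0, n)    # 0.001953125, the correct achieved alpha
    2 * psignrank(1, n)    # 0.00390625, what the current line 86 computes (twice too big)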

Of course, this error isn't very visible, because wilcox.test still
prints the ASKED FOR confidence level instead of the ACTUAL ACHIEVED
confidence level (which sucks IMHO, but never mind) except when
it incorrectly thinks that the level cannot be achieved, in which
case it prints the incorrect achieved level.  Just great.

To see the bug do

   X <- read.table(url("http://www.stat.umn.edu/geyer/5601/hwdata/t3-3.txt"),
   header = TRUE)
   attach(X)
   wilcox.test(y, x, paired = TRUE, conf.int = TRUE)

and compare with what you get when you change t3-1.txt to t3-3.txt in
the Rweb form in

http://www.stat.umn.edu/geyer/5601/examp/signrank.html#conf

and submit.

Sorry to sound so grumpy about this, but I hate having my
homework solutions explain that R sucks (in this instance).

Better yet, NEVER use wilcox.test.  Always use wilcox.exact in exactRankTests
or fuzzy.signrank.ci in fuzzyRankTests.

   X <- read.table(url("http://www.stat.umn.edu/geyer/5601/hwdata/t3-3.txt"),
   header = TRUE)
   attach(X)
   library(fuzzyRankTests)
   fuzzy.signrank.ci(y - x)

prints

Wilcoxon signed rank test

data:  y - x
95 percent confidence interval:

Randomized confidence interval is mixture of two intervals

 probability lower end upper end
 0.9   -25   605
 0.1   -15   560

Corresponding fuzzy confidence interval is one on the narrower
interval, 0.9 elsewhere on the wider interval, and zero outside the
wider interval, with values at jumps that are the average of the left
and right limits

Sorry about the advert.  Couldn't resist the opportunity.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] tcl problem with R-3.6.3?

2020-02-29 Thread Charles Geyer
Just built 3.6.3 from source and tcl doesn't work.  Worked fine with the
same laptop in 3.6.2.  Here's the exact error.

blurfle$ R --vanilla

R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
Copyright (C) 2020 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> sessionInfo()
R version 3.6.3 (2020-02-29)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Ubuntu 18.04.4 LTS

Matrix products: default
BLAS:   /home/geyer/local/current/lib/R/lib/libRblas.so
LAPACK: /home/geyer/local/current/lib/R/lib/libRlapack.so

locale:
 [1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=en_US.UTF-8   LC_NAME=C
 [9] LC_ADDRESS=C   LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

loaded via a namespace (and not attached):
[1] compiler_3.6.3
> install.packages("aster")
--- Please select a CRAN mirror for use in this session ---
Error in structure(.External(.C_dotTclObjv, objv), class = "tclObj") :
  [tcl] grab failed: window not viewable.
> q()

What's up with that?

-- 
Charles Geyer
Professor, School of Statistics
Resident Fellow, Minnesota Center for Philosophy of Science
University of Minnesota
char...@stat.umn.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] tcl problem with R-3.6.3?

2020-02-29 Thread Charles Geyer
I knew I could work around it.  But this shouldn't happen.

And yes.  Same problem with your example.

blurfle$ R --vanilla

R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
Copyright (C) 2020 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> ans <- utils::select.list(c("hello", "world", "again"), graphics=TRUE)
Error in structure(.External(.C_dotTclObjv, objv), class = "tclObj") :
  [tcl] grab failed: window not viewable.
> q()

I didn't bother with sessionInfo() this time.  I presume it would be the
same as before.


AFAIK this is a fully up to date Ubuntu 18.04 box.

On Sat, Feb 29, 2020 at 12:13 PM Henrik Bengtsson <
henrik.bengts...@gmail.com> wrote:

> Here's a simpler example that should reproduce that error for you:
>
>   ans <- utils::select.list(c("hello", "world", "again"), graphics=TRUE)
>
> Does it?
>
> FYI, I installed R 3.6.3 from source on CentOS 7 a few hours ago, and
> for me the above works just fine.
>
> For your immediate needs of selecting a CRAN mirror, you can set:
>
> options(menu.graphics = FALSE)
>
> as a workaround to skip Tcl-based menus.
>
> /Henrik
>
> On Sat, Feb 29, 2020 at 10:01 AM Charles Geyer 
> wrote:
> >
> > Just built 3.6.3 from source and tcl doesn't work.  Worked fine with the
> > same laptop in 3.6.2.  Here's the exact error.
> >
> > blurfle$ R --vanilla
> >
> > R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
> > Copyright (C) 2020 The R Foundation for Statistical Computing
> > Platform: x86_64-pc-linux-gnu (64-bit)
> >
> > R is free software and comes with ABSOLUTELY NO WARRANTY.
> > You are welcome to redistribute it under certain conditions.
> > Type 'license()' or 'licence()' for distribution details.
> >
> >   Natural language support but running in an English locale
> >
> > R is a collaborative project with many contributors.
> > Type 'contributors()' for more information and
> > 'citation()' on how to cite R or R packages in publications.
> >
> > Type 'demo()' for some demos, 'help()' for on-line help, or
> > 'help.start()' for an HTML browser interface to help.
> > Type 'q()' to quit R.
> >
> > > sessionInfo()
> > R version 3.6.3 (2020-02-29)
> > Platform: x86_64-pc-linux-gnu (64-bit)
> > Running under: Ubuntu 18.04.4 LTS
> >
> > Matrix products: default
> > BLAS:   /home/geyer/local/current/lib/R/lib/libRblas.so
> > LAPACK: /home/geyer/local/current/lib/R/lib/libRlapack.so
> >
> > locale:
> >  [1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
> >  [3] LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8
> >  [5] LC_MONETARY=en_US.UTF-8LC_MESSAGES=en_US.UTF-8
> >  [7] LC_PAPER=en_US.UTF-8   LC_NAME=C
> >  [9] LC_ADDRESS=C   LC_TELEPHONE=C
> > [11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
> >
> > attached base packages:
> > [1] stats graphics  grDevices utils datasets  methods   base
> >
> > loaded via a namespace (and not attached):
> > [1] compiler_3.6.3
> > > install.packages("aster")
> > --- Please select a CRAN mirror for use in this session ---
> > Error in structure(.External(.C_dotTclObjv, objv), class = "tclObj") :
> >   [tcl] grab failed: window not viewable.
> > > q()
> >
> > What's up with that?
> >
> > --
> > Charles Geyer
> > Professor, School of Statistics
> > Resident Fellow, Minnesota Center for Philosophy of Science
> > University of Minnesota
> > char...@stat.umn.edu
> >
> > [[alternative HTML version deleted]]
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
>


-- 
Charles Geyer
Professor, School of Statistics
Resident Fellow, Minnesota Center for Philosophy of Science
University of Minnesota
char...@stat.umn.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] tcl problem with R-3.6.3?

2020-02-29 Thread Charles Geyer
No. I didn't do any of that and am now at a hockey game.  But since I can't
reproduce the problem after an Ubuntu online update and reboot, I assume
the issue is moot.  But I will check these things in an hour or so.

On Sat, Feb 29, 2020, 3:24 PM Dirk Eddelbuettel  wrote:

>
> Charles,
>
> Did you try a build of the provided alpha, beta and rc releases made
> available to allow you to ensure that the released version would build and
> perform as expected?
>
> FWIW the new 3.6.3 made ~ 12 hours ago are already available for Debian,
> built for the Ubuntu backports at CRAN (thanks to Michael) and also in the
> base Rocker container behaves as expected (and as the one RC build did):
>
> edd@rob:~$ docker run --rm -ti rocker/r-base
>
> R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
> Copyright (C) 2020 The R Foundation for Statistical Computing
> Platform: x86_64-pc-linux-gnu (64-bit)
>
> R is free software and comes with ABSOLUTELY NO WARRANTY.
> You are welcome to redistribute it under certain conditions.
> Type 'license()' or 'licence()' for distribution details.
>
>   Natural language support but running in an English locale
>
> R is a collaborative project with many contributors.
> Type 'contributors()' for more information and
> 'citation()' on how to cite R or R packages in publications.
>
> Type 'demo()' for some demos, 'help()' for on-line help, or
> 'help.start()' for an HTML browser interface to help.
> Type 'q()' to quit R.
>
> > capabilities()
>        jpeg         png        tiff       tcltk         X11        aqua
>        TRUE        TRUE        TRUE        TRUE       FALSE       FALSE
>    http/ftp     sockets      libxml        fifo      cledit       iconv
>        TRUE        TRUE        TRUE        TRUE        TRUE        TRUE
>         NLS     profmem       cairo         ICU long.double     libcurl
>        TRUE        TRUE        TRUE        TRUE        TRUE        TRUE
> >
>
>
> And (to echo Martin Maechler) tcltk comes up as TRUE as it should.
>
> Dirk
>
> --
> http://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] tcl problem with R-3.6.3?

2020-02-29 Thread Charles Geyer
I realized I don't have to do those checks.  It was not working again (same
error message) when I got home, but after a reboot it worked fine.  Of
course it has tcl/tk because when it works, it brings up a gui chooser
thingy that allows me to choose a CRAN mirror.

On Sat, Feb 29, 2020 at 3:33 PM Charles Geyer  wrote:

> No. I didn't do any of that and am now at a hockey game.  But since I
> can't reproduce the problem after an Ubuntu online update and reboot, I
> assume the issue is moot.  But I will check these things in an hour or so.
>
> On Sat, Feb 29, 2020, 3:24 PM Dirk Eddelbuettel  wrote:
>
>>
>> Charles,
>>
>> Did you try a build of the provided alpha, beta and rc releases made
>> available to allow you to ensure that the released version would build and
>> perform as expected?
>>
>> FWIW the new 3.6.3 made ~ 12 hours ago are already available for Debian,
>> built for the Ubuntu backports at CRAN (thanks to Michael) and also in the
>> base Rocker container behaves as expected (and as the one RC build did):
>>
>> edd@rob:~$ docker run --rm -ti rocker/r-base
>>
>> R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
>> Copyright (C) 2020 The R Foundation for Statistical Computing
>> Platform: x86_64-pc-linux-gnu (64-bit)
>>
>> R is free software and comes with ABSOLUTELY NO WARRANTY.
>> You are welcome to redistribute it under certain conditions.
>> Type 'license()' or 'licence()' for distribution details.
>>
>>   Natural language support but running in an English locale
>>
>> R is a collaborative project with many contributors.
>> Type 'contributors()' for more information and
>> 'citation()' on how to cite R or R packages in publications.
>>
>> Type 'demo()' for some demos, 'help()' for on-line help, or
>> 'help.start()' for an HTML browser interface to help.
>> Type 'q()' to quit R.
>>
>> > capabilities()
>>        jpeg         png        tiff       tcltk         X11        aqua
>>        TRUE        TRUE        TRUE        TRUE       FALSE       FALSE
>>    http/ftp     sockets      libxml        fifo      cledit       iconv
>>        TRUE        TRUE        TRUE        TRUE        TRUE        TRUE
>>         NLS     profmem       cairo         ICU long.double     libcurl
>>        TRUE        TRUE        TRUE        TRUE        TRUE        TRUE
>> >
>>
>>
>> And (to echo Martin Maechler) tcltk comes up as TRUE as it should.
>>
>> Dirk
>>
>> --
>> http://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>>
>

-- 
Charles Geyer
Professor, School of Statistics
Resident Fellow, Minnesota Center for Philosophy of Science
University of Minnesota
char...@stat.umn.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] possible error checking bug and documentation bug in ts

2010-10-22 Thread Charles Geyer
The documentation for ts says "Only one of ‘frequency’ or ‘deltat’ should
be provided" but why doesn't it enforce this with

    stopifnot(missing(frequency) || missing(deltat))

Wouldn't that work?  Also the documentation does not say what valid
time series parameters are.  To find that out one must know to RTFS
in src/main/attrib.c and find that

    end - start  = (n - 1) / frequency

or

   end - start  = (n - 1)  * deltat

is required to avoid an error, where
n is either length(data) or nrow(data).
Why can't the help page for ts say that?

I suppose this is picky, but a newbie just wasted a couple of days
with the mysterious error thrown in badtsp in src/main/attrib.c
when he ran afoul of R's argument matching rules and
unintentionally supplied both frequency and deltat (thinking he
was supplying deltat and ts.eps).  It took me more time than it
should have to figure it out too.
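
To make the gotcha concrete, here is a sketch of the kind of call that trips
it (the actual call is hypothetical, reconstructed from the description
above).  The formals of ts are (data, start, end, frequency, deltat, ts.eps,
...), so arguments supplied positionally after end match frequency and
deltat, not deltat and ts.eps:

    x <- rnorm(24)
    ## intended: deltat = 1/12 and ts.eps = 0.01, but supplied positionally
    ## they match frequency and deltat, the mix-up described above:
    ## ts(x, 1, 3, 1/12, 0.01)
    ## naming the arguments avoids it, and the tsp attribute then satisfies
    ## end - start == (n - 1) * deltat:
    xt <- ts(x, start = 1, deltat = 1/12)
    tsp(xt)
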
--
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R CMD check bug or misfeature

2011-01-04 Thread Charles Geyer
Error in as.character(function (description = "", open = "", blocking = TRUE,  
: 
  cannot coerce type 'closure' to vector of type 'character'
Calls:  ... tryCatch -> tryCatchList -> tryCatchOne -> 
Execution halted
* checking tests ...
  Running ‘allfaces.R’
  Comparing ‘allfaces.Rout’ to ‘allfaces.Rout.save’ ... OK
  Running ‘arith.R’
  Comparing ‘arith.Rout’ to ‘arith.Rout.save’ ... OK
  Running ‘bar-gmp.R’
  Comparing ‘bar-gmp.Rout’ to ‘bar-gmp.Rout.save’ ... OK
  Running ‘bar.R’
  Comparing ‘bar.Rout’ to ‘bar.Rout.save’ ... OK
  Running ‘bug.R’
  Comparing ‘bug.Rout’ to ‘bug.Rout.save’ ... OK
  Running ‘bug2.R’
  Comparing ‘bug2.Rout’ to ‘bug2.Rout.save’ ... OK
  Running ‘chull.R’
  Comparing ‘chull.Rout’ to ‘chull.Rout.save’ ... OK
  Running ‘chull2.R’
  Comparing ‘chull2.Rout’ to ‘chull2.Rout.save’ ... OK
  Running ‘convert.R’
  Comparing ‘convert.Rout’ to ‘convert.Rout.save’ ... OK
  Running ‘fred.R’
  Comparing ‘fred.Rout’ to ‘fred.Rout.save’ ... OK
  Running ‘lpcdd.R’
  Comparing ‘lpcdd.Rout’ to ‘lpcdd.Rout.save’ ... OK
  Running ‘oops.R’
  Comparing ‘oops.Rout’ to ‘oops.Rout.save’ ... OK
  Running ‘qmatmult.R’
  Comparing ‘qmatmult.Rout’ to ‘qmatmult.Rout.save’ ... OK
  Running ‘qux-gmp.R’
  Comparing ‘qux-gmp.Rout’ to ‘qux-gmp.Rout.save’ ... OK
  Running ‘qux.R’
  Comparing ‘qux.Rout’ to ‘qux.Rout.save’ ... OK
  Running ‘redund.R’
  Comparing ‘redund.Rout’ to ‘redund.Rout.save’ ... OK
  Running ‘sammy.R’
  Comparing ‘sammy.Rout’ to ‘sammy.Rout.save’ ... OK
  Running ‘subset.R’
  Comparing ‘subset.Rout’ to ‘subset.Rout.save’ ... OK
  Running ‘zero.R’
  Comparing ‘zero.Rout’ to ‘zero.Rout.save’ ... OK
 OK
* checking package vignettes in ‘inst/doc’ ... OK
* checking PDF version of manual ... OK

oak$ # It checks o. k., but what about that NOTE ?
oak$ R --vanilla

R version 2.12.1 (2010-12-16)
Copyright (C) 2010 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: x86_64-unknown-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> library(tools)
> tools:::.check_packages_used_in_tests("rcdd")
Error in as.character(function (description = "", open = "", blocking = TRUE,  
: 
  cannot coerce type 'closure' to vector of type 'character'
> traceback()
8: sprintf(gettext(fmt, domain = domain), ...)
7: gettextf("parse error in file '%s':\n%s", file, 
.massage_file_parse_error_message(conditionMessage(e)))
6: stop(gettextf("parse error in file '%s':\n%s", file, 
.massage_file_parse_error_message(conditionMessage(e))), 
   domain = NA, call. = FALSE)
5: value[[3L]](cond)
4: tryCatchOne(expr, names, parentenv, handlers[[1L]])
3: tryCatchList(expr, classes, parentenv, handlers)
2: tryCatch(parse(file = f, n = -1L), error = function(e) stop(gettextf("parse 
error in file '%s':\n%s", 
   file, .massage_file_parse_error_message(conditionMessage(e))), 
   domain = NA, call. = FALSE))
1: tools:::.check_packages_used_in_tests("rcdd")
> parse("rcdd/tests/bug2.R")
Error in parse("rcdd/tests/bug2.R") : 
  rcdd/tests/bug2.R:5:8: unexpected numeric constant
4:  A <- matrix(scan(), byrow = TRUE, nrow = 9)
5: 0  1.000
  ^
> q()
oak$ exit

Script done on Tue 04 Jan 2011 04:14:53 PM CST
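
For what it's worth, here is a toy reproduction of what seems to be going on
(the file name is made up): a test script that feeds data to scan() on the
lines following the call runs fine when R reads the whole file from stdin,
but those data lines are not R syntax, so parse(), and hence
.check_packages_used_in_tests, chokes on the file:

    writeLines(c(
        "A <- matrix(scan(), byrow = TRUE, nrow = 2)",
        "0 1.0",
        "1 0.5",
        "print(A)"), con = "toy-test.R")
    try(parse("toy-test.R"))   # unexpected numeric constant, as above
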
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] calling native routines in another package (Sec 5.4.2 of Writing R Extensions)

2017-01-13 Thread Charles Geyer
I just (apparently) figured out how to do the stuff described in
Section 5.4.2 of
Writing R Extensions.  I put my test toy packages on github
 for anyone to copy.  If anyone
cares to read the README and the bits of code it links to and tell me
anywhere I am wrong, I would be grateful.

But the main point of this e-mail is a complaint about that section of
Writing R Extensions.  It says (even in R-devel) "A CRAN example of
the use of this mechanism is package lme4, which links to Matrix." but
that does not appear to be true anymore.  I cannot see any inclusion
of headers from Matrix in lme4 nor any call to R_GetCCallable.  I did
find the file inst/include/Matrix_stubs.c in the Matrix package
somewhat helpful (although mystifying at first).

So this can be considered a documentation bug report (if I am
correct).  Do I need to do an official bugzilla one?

Just a further check.  lme4 (1.1-12) does not have Matrix in the
LinkingTo field of its DESCRIPTION file, so headers from Matrix cannot
be used.  And

grep R_Get *.[ch]*

in the src directory of lme4 returns nothing.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] unlicense

2017-01-13 Thread Charles Geyer
I would like the unlicense (http://unlicense.org/) added to R
licenses.  Does anyone else think that worthwhile?

-- 
Charles Geyer
Professor, School of Statistics
Resident Fellow, Minnesota Center for Philosophy of Science
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] unlicense

2017-01-14 Thread Charles Geyer
Actually, CRAN does have an alternative to this.  "License: Unlimited"
can be used in the DESCRIPTION file, but does less than the cited
"unlicense".

On Fri, Jan 13, 2017 at 7:43 PM,   wrote:
> I don't see why Charles' question should be taken as anything other
> than an honest request for information.
>
> As for me, I've never heard of this license, but if CRAN doesn't have
> an option to license software in the public domain, then I would
> support the inclusion of some such option.
>
> FWIW, searching for "public domain software license" on Google turns
> up unlicense.org as the second result.
>
> Frederick
>
> On Fri, Jan 13, 2017 at 07:19:47PM -0500, Duncan Murdoch wrote:
>> On 13/01/2017 3:21 PM, Charles Geyer wrote:
>> > I would like the unlicense (http://unlicense.org/) added to R
>> > licenses.  Does anyone else think that worthwhile?
>> >
>>
>> That's a question for you to answer, not to ask.  Who besides you thinks
>> that it's a good license for open source software?
>>
>> If it is recognized by the OSF or FSF or some other authority as a FOSS
>> license, then CRAN would probably also recognize it.  If not, then CRAN
>> doesn't have the resources to evaluate it and so is unlikely to recognize
>> it.
>>
>> Duncan Murdoch
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] unlicense

2017-01-18 Thread Charles Geyer
In that case, perhaps the question could be changed to: could CC0 be
added to the list of R licenses?  Right now the only CC license that
is in the R licenses is CC-BY-SA-4.0.

On Wed, Jan 18, 2017 at 7:23 AM, Brian G. Peterson  wrote:
>
> On Tue, 2017-01-17 at 22:46 -0500, Kevin Ushey wrote:
>> It appears that Unlicense is considered a free and GPL-compatible
>> license; however, the page does suggest using CC0 instead (which is
>> indeed a license approved / recognized by CRAN). CC0 appears to be
>> the primary license recommended by the FSF for software intended for
>> the public domain.
>
> I'd second the recommendation for CC0.  Lawyers at IP-restrictive firms
> I've worked for in the past have been OK with this license.
>
>  - Brian
>



-- 
Charles Geyer
Professor, School of Statistics
Resident Fellow, Minnesota Center for Philosophy of Science
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] unlicense

2017-01-18 Thread Charles Geyer
I was looking at https://www.r-project.org/Licenses/ which is first
when you google for "R licenses".  Silly me.  Kurt says I should have
been looking at share/licenses/license.db in the R source tree.
Thanks.  I'm satisfied now.

I don't have any CRAN packages with "Unlimited" on them, but I do have
some on github that are just examples for teaching.  I'll change them
to CC0.

On Wed, Jan 18, 2017 at 9:44 AM, Kurt Hornik  wrote:
>>>>>> Charles Geyer writes:
>
>> In that case, perhaps the question could be changed to could CC0 be
>> added to the list of R licences.  Right now the only CC licence that
>> is in the R licenses is CC-BY-SA-4.0.
>
> Hmm, I see
>
> Name: CC0
> FSF: free_and_GPLv3_compatible 
> (https://www.gnu.org/licenses/license-list.html#CC0)
> OSI: NA (https://opensource.org/faq#cc-zero)
> URL: https://creativecommons.org/publicdomain/zero/1.0/legalcode
> FOSS: yes
>
> in the R license db ...
>
> -k
>
>> On Wed, Jan 18, 2017 at 7:23 AM, Brian G. Peterson  
>> wrote:
>>>
>>> On Tue, 2017-01-17 at 22:46 -0500, Kevin Ushey wrote:
>>>> It appears that Unlicense is considered a free and GPL-compatible
>>>> license; however, the page does suggest using CC0 instead (which is
>>>> indeed a license approved / recognized by CRAN). CC0 appears to be
>>>> the primary license recommended by the FSF for software intended for
>>>> the public domain.
>>>
>>> I'd second the recommendation for CC0.  Lawyers at IP-restrictive firms
>>> I've worked for in the past have been OK with this license.
>>>
>>> - Brian
>>>
>
>
>
>> --
>> Charles Geyer
>> Professor, School of Statistics
>> Resident Fellow, Minnesota Center for Philosophy of Science
>> University of Minnesota
>> char...@stat.umn.edu
>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel



-- 
Charles Geyer
Professor, School of Statistics
Resident Fellow, Minnesota Center for Philosophy of Science
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] makefile in inst/doc

2013-06-25 Thread Charles Geyer
Section 1.4 of Writing R Extensions is very clear about what not to do with
makefiles in the vignettes or inst/doc directory.  It provides a BAD
EXAMPLE.
But it doesn't say (to me) what does work.  So what does?

As I read the text, it seems that just removing the clean prerequisite from
the all target in the BAD EXAMPLE would give something that works.  I am
referring to the language "Finally, if there is a Makefile and it has
a ‘clean:’ target, make clean is run."  This doesn't seem to happen when
R CMD build is done, and consequently there are *.aux, *.log, and *.dvi
files in the package build that R CMD check --as-cran then complains about.

The only thing I can figure out that works is remove the makefile.  That
does work, but makes it difficult to make the vignettes.

Is my only option to remove the Makefile?  If so, why discuss makefiles
under vignettes at all?


-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] setting option in function

2012-10-19 Thread Charles Geyer
Is it possible to set an option inside a function (I want to set
na.action = na.fail) and have the previous state restored if there is
an error, so that the function doesn't change the option behind the
user's back?

Sorry if this has been answered before, but this subject is hard to Google.

--
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] setting option in function

2012-10-19 Thread Charles Geyer
That was easy.  Thanks.  That will fix a problem a lot of users are having
with the aster package.  If users have NA's in their data, then they almost
certainly don't know what they are doing (with aster).
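
Concretely, the pattern from the reply quoted below would go into a fitting
function something like this (a sketch only; fit_with_na_fail is a made-up
name, not actual aster code):

    fit_with_na_fail <- function(formula, data) {
        old.options <- options(na.action = na.fail)
        on.exit(options(old.options))   # restored even if model.frame() errors
        model.frame(formula, data = data)
    }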

On Fri, Oct 19, 2012 at 5:10 PM, Thomas Lumley  wrote:

> old_options <- options(na.action=na.fail)
> on.exit(options(old_options))
>
> You can also use this to define a wrapper that executes an expression
> using special options
>
> withOptions<-function(optlist,expr){
> oldopt<-options(optlist)
> on.exit(options(oldopt))
> expr<-substitute(expr)
> eval.parent(expr)
> }
>
>
> -thomas
>
>
>
> On Sat, Oct 20, 2012 at 10:35 AM, Charles Geyer 
> wrote:
> > is it possible to set an option inside a function ((I want to set
> > na.action = na.fail) and have the previous state restored if there is
> > an error so that the function doesn't change the option behind the
> > user's back?
> >
> > Sorry if this has been answered before, but this subject is hard to
> Google.
> >
> > --
> > Charles Geyer
> > Professor, School of Statistics
> > University of Minnesota
> > char...@stat.umn.edu
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
>
>
>
> --
> Thomas Lumley
> Professor of Biostatistics
> University of Auckland
>



-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Regression stars

2013-02-13 Thread Charles Geyer
Please do not change the defaults for the show.signif.stars option or for
the default.stringsAsFactors option.  Backward compatibility is more
important than your convenience.  The same sort of argument could be made
for changing the default of the "[" function from drop = TRUE to drop =
FALSE.  It would lead to fewer gotchas when coding and make R a saner
programming language (less infernoish), but it would annoy and confuse
ordinary users and is not "the R way".

In any case your philosophical arguments about signif stars are bogus.
Non-simultaneous confidence intervals have exactly the same problem as
these "regression stars".  As I once said in a paper, they are something
"users think they can interpret", with the unstated implication that they
really cannot.  Charlie's law of users says ordinary users of statistics
actually ignore confidence levels and treat all confidence intervals as if
they cover (i. e., take the true confidence level to be 100%).  You cannot
fix lack of user understanding of statistics by any such simplistic idea.

Yes, R is a prime example of "worse is better", but it is the way it is.
Don't try to turn it into C++.  Thank you.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Floating point precision / guard digits? (PR#13771)

2009-06-22 Thread Charles Geyer
On Sat, 20 Jun 2009 08:44:21 -0400 Stavros Macrakis 
wrote:

> d) if it is important to your application to perform exact arithmetic
> on rational numbers (and I suspect it is not), you might want to use
> that instead of floating-point. But even if implemented in R, most R
> calculations cannot use it.

Just for the record, the contributed package rcdd links the GMP (GNU
multiprecision bignum) library and provides a crude interface, so it
is "implemented in R", although, as you say, "most R calculations cannot
use it".  The rcdd package can do exact linear programming and other
computational geometry operations.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R CMD check --use-valgrind doesn't run valgrind on tests

2009-08-16 Thread Charles Geyer
R CMD check --use-valgrind  used to run valgrind on the
tests in the tests directory of the package.  But it seems to have stopped.
R-2.9.1 doesn't -- at least on my box -- and neither does R-2.10.0 (devel).
I am not sure when this stopped.  I think 2.8.x did this.  The only old
R I have around is 2.6.0 and it certainly does.

R CMD check --help for 2.9.1 says (among other things)

--use-valgrinduse 'valgrind' when running examples/tests/vignettes

so the documentation seems to say that the old behavior should also be
the current behavior, but it isn't -- at least on my box.

oak$ cat /etc/SuSE-release
openSUSE 11.0 (X86-64)
VERSION = 11.0
oak$ valgrind --version
valgrind-3.3.0
oak$ gcc --version
gcc (SUSE Linux) 4.3.1 20080507 (prerelease) [gcc-4_3-branch revision 135036]

If this is just a stupid user problem and not a bug, how do I get the old
behavior (valgrind is run on tests).  BTW valgrind is run on examples
under 2.9.0, as cat .Rcheck/-Ex.Rout shows.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R CMD check --use-valgrind doesn't run valgrind on tests

2009-08-18 Thread Charles Geyer
On Mon, Aug 17, 2009 at 10:34:55AM +0100, Prof Brian Ripley wrote:
> This was not implemented in R 2.9.x -- the comments in check.in don't 
> agree with the usage, and it seemed unlikely that anyone really wanted 
> this (it can be very slow, so perhaps a test at a time?), so this was 
> intentional (apart from not altering the reported usage).

Well I ran it on all of my packages just before they shipped to CRAN
as well as --use-gct if the package used .Call.  Yes it was very slow
and yes that meant shipping a new version of a package took hours.
But it did catch several bugs before they shipped.  And I thought
the whole point of regression testing was to be automatic.

> I've added it back for 2.9.2.

Thanks.  I think that's a good idea.

> It also showed up that R CMD BATCH -d valgrind did not work (but 
> --debugger=valgrind did): also fixed for 2.9.2.
> 
> On Sun, 16 Aug 2009, Charles Geyer wrote:
> 
> >R CMD check --use-valgrind  used to run valgrind on the
> >tests in the tests directory of the package.  But it seems to have stopped.
> >R-2.9.1 doesn't -- at least on my box -- and neither does R-2.10.0 (devel).
> >I am not sure when this stopped.  I think 2.8.x did this.  The only old
> >R I have around is 2.6.0 and it certainly does.
> >
> >R CMD check --help for 2.9.1 says (among other things)
> >
> >   --use-valgrinduse 'valgrind' when running examples/tests/vignettes
> >
> >so the documentation seems to say that the old behavior should also be
> >the current behavior, but it isn't -- at least on my box.

[rest deleted]

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] question about --with-valgrind-instrumentation=level

2009-09-06 Thread Charles Geyer
Does --with-valgrind-instrumentation=2 slow down R when valgrind or gctorture
are not in use?  I am thinking of compiling the R that the whole department
uses for research and teaching with --with-valgrind-instrumentation=2.  Is
that a good idea or a bad idea?
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] question about ... passed to two different functions

2009-09-06 Thread Charles Geyer
I have hit a problem with the design of the mcmc package I can't
figure out, possibly because I don't really understand the R function
call mechanism.  The function metrop in the mcmc package has a ... argument
that it passes to one or two user-supplied functions, which are other
arguments to metrop.  When the two functions don't have the same arguments,
this doesn't work.  Here's an example.

 library(mcmc)
 library(MASS)

 set.seed(42)

 n <- 100
 rho <- 0.5
 beta0 <- 0.25
 beta1 <- 0.5
 beta2 <- 1
 beta3 <- 1.5

 Sigma <- matrix(rho, 3, 3)
 diag(Sigma) <- 1
 Sigma <- 0.75 * Sigma
 Mu <- rep(0, 3)

 foo <- mvrnorm(n, Mu, Sigma)

 x1 <- foo[ , 1]
 x2 <- foo[ , 2]
 x3 <- foo[ , 3]

 modmat <- cbind(1, foo)

 eta <- beta0 + beta1 * x1 + beta2 * x2 + beta3 * x3
 p <- 1 / (1 + exp(- eta))
 y <- as.numeric(runif(n) < p)

 out <- glm(y ~ x1 + x2 + x3, family = binomial())
 summary(out)

 ### now we want to do a Bayesian analysis of the model, so we write
 ### a function that evaluates the log unnormalized density of the
 ### Markov chain we want to run (log likelihood + log prior)

 ludfun <- function(beta) {
 stopifnot(is.numeric(beta))
 stopifnot(length(beta) == ncol(modmat))
 eta <- as.numeric(modmat %*% beta)
 logp <- ifelse(eta < 0, eta - log1p(exp(eta)), - log1p(exp(- eta)))
 logq <- ifelse(eta < 0, - log1p(exp(eta)), - eta - log1p(exp(- eta)))
 logl <- sum(logp[y == 1]) + sum(logq[y == 0])
 val <- logl - sum(beta^2) / 2
 return(val)
 }

 beta.initial <- as.vector(out$coefficients)

 out <- metrop(ludfun, initial = beta.initial, nbatch = 20,
 blen = 10, nspac = 5, scale = 0.56789)

 ### Works fine.  Here are the Monte Carlo estimates of the posterior
 ### means for each parameter with Monte Carlo standard errors.

 apply(out$batch, 2, mean)
 sqrt(apply(out$batch, 2, function(x) initseq(x)$var.con) / out$nbatch)

 ### Now suppose I want Monte Carlo estimates of some function of
 ### the parameters other than the identity function.  I write a function
 ### outfun that does that.  Also suppose I want some extra arguments
 ### to outfun.  This example is a bit forced, but I hit on natural
 ### examples with a new function (not yet released) that works like
 ### metrop but does simulated tempering.

 outfun <- function(beta, degree) {
     stopifnot(is.numeric(beta))
     stopifnot(length(beta) == ncol(modmat))
     stopifnot(is.numeric(degree))
     stopifnot(length(degree) == 1)
     stopifnot(degree == as.integer(degree))
     stopifnot(length(degree) > 0)
     result <- NULL
     for (i in 1:degree)
         result <- c(result, beta^i)
     return(result)
 }

 out <- metrop(out, outfun = outfun, degree = 2)

 ### Oops!  Try it and you get
 ###
 ### Error in obj(state, ...) : unused argument(s) (degree = 2)

I don't understand what the problem is (mostly because of ignorance), because

 foo <- function(x, ...) x
 foo(x = 2, y = 3)

does work.  The error is happening when ludfun is called, and I assume
the complaint is that it doesn't have an argument "degree", but then
why doesn't the simple example just above fail?  So clearly I don't
understand what's going on.
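
Here is the smallest reproduction I can make, with no mcmc in it at all
(caller, obj, and out are made-up names, and I am only guessing this shows
the same mechanism):

 caller <- function(obj, out, ...) {
     c(obj(1, ...), out(1, ...))
 }
 obj <- function(x, foo) x + foo
 out <- function(x, bar) x + bar
 caller(obj, out, foo = 2, bar = 3)
 ## Error in obj(1, ...) : unused argument (bar = 3)
 ## so obj apparently gets handed bar = 3 even though bar was meant for out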

An obvious solution is to ignore ... and just use global variables, i.e.,
define degree <- 2 in the global environment and make the signature of
outfun function(beta).  That does work.  But I don't want to have to
explain this issue on the help pages if I can actually fix the problem.

I have no idea whether one needs to look at the source code for the
mcmc package to diagnose the issue.  If one does, it's on CRAN.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] question about ... passed to two different functions

2009-09-07 Thread Charles Geyer
Many thanks to those (Martin Morgan, Duncan Murdoch) who tried to straighten
me out on ... arguments.  It didn't work until I accidentally made two examples
I thought were the same but one worked and the other didn't.  Finally I
achieved enlightenment.  The following section has been added to the
help page for the metrop function and will appear on CRAN when I get finished
with the simulated tempering function.

\section{Warning}{
If \code{outfun} is missing or not a function, then the log unnormalized
density can be defined without a \ldots argument and that works fine.
One can define it starting \code{ludfun <- function(state)}, and that works,
or starting \code{ludfun <- function(state, foo, bar)}, where \code{foo} and
\code{bar} are supplied as additional arguments to \code{metrop}, and that
works too.

If \code{outfun} is a function, then both it and the log unnormalized
density function can be defined without \ldots arguments \emph{if they
have exactly the same arguments list} and that works fine.  Otherwise it
doesn't work.  Start the definitions \code{ludfun <- function(state, foo)}
and \code{outfun <- function(state, bar)} and you get an error about
unused arguments.  Instead start the definitions
\code{ludfun <- function(state, foo, \ldots)}
and \code{outfun <- function(state, bar, \ldots)}, supply
\code{foo} and \code{bar} as additional arguments to \code{metrop},
and that works fine.

In short, the log unnormalized density function and \code{outfun} need
to have \ldots in their arguments list to be safe.  Sometimes it works
when \ldots is left out and sometimes it doesn't.

Of course, one can avoid this whole issue by always defining the log
unnormalized density function and \code{outfun} to have only one argument
\code{state} and use global variables (objects in the R global environment) to
specify any other information these functions need to use.  That too
follows the R way.  But some people consider that bad programming practice.
}
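
In terms of the toy caller/obj/out example from my earlier message, the safe
pattern the warning describes looks like this (same made-up names, just a
sketch, not the real metrop interface):

 caller <- function(obj, out, ...) c(obj(1, ...), out(1, ...))
 obj <- function(x, foo, ...) x + foo   # the trailing ... swallows bar = 3
 out <- function(x, bar, ...) x + bar   # the trailing ... swallows foo = 2
 caller(obj, out, foo = 2, bar = 3)     # works, returns c(3, 4)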

I hope that sums it up.  Apologies for submitting a rather stupid question
to the list.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] how to document stuff most users don't want to see

2009-10-05 Thread Charles Geyer
The functions metrop and temper in the mcmc package have a debug = FALSE
argument that when TRUE adds a lot of debugging information to the returned
list.  This is absolutely necessary to test the functions, because one
generally knows nothing about the simulated distribution except what
one learns from MCMC samples.  Hence you must expose all details of the
simulation to have any hope of checking that it is doing what it is supposed
to do.  However, this information is of interest mostly (perhaps solely)
to developers.  So I didn't document it in the Rd files for these functions.

But it has occurred to me that people might be interested in how these functions
are validated, and I would like to document the debug output somewhere, but I
don't want to clutter up the documentation that ordinary users see.  That
suggests a separate help page for debugging.  Looking at "Writing R Extensions"
it doesn't seem like there is a type of Rd file for this purpose.  I suppose
it could be added in (fairly long) sections titled "Debug Output" in metrop.Rd
and temper.Rd or it could be put in a package help page (although that's not
what that kind of page is really for).  Any other possibilities to consider?
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] how to document stuff most users don't want to see

2009-10-08 Thread Charles Geyer
On Mon, Oct 05, 2009 at 02:03:51PM -0400, Duncan Murdoch wrote:
> On 10/5/2009 1:50 PM, Charles Geyer wrote:
> >The functions metrop and temper in the mcmc package have a debug = FALSE
> >argument that when TRUE adds a lot of debugging information to the returned
> >list.  This is absolutely necessary to test the functions, because one
> >generally knows nothing about the simulated distribution except what what
> >one learns from MCMC samples.  Hence you must expose all details of the
> >simulation to have any hope of checking that it is doing what it is 
> >supposed
> >to do.  However, this information is of interested mostly (perhaps solely)
> >to developers.  So I didn't document it in the Rd files for these 
> >functions.
> >
> >But it has ocurred to me that people might be interested in how these 
> >functions
> >are validated, and I would like to document the debug output somewhere, 
> >but I
> >don't want to clutter up the documentation that ordinary users see.  That
> >suggests a separate help page for debugging.  Looking at "Writing R 
> >Extensions"
> >it doesn't seem like there is a type of Rd file for this purpose.  I 
> >suppose
> >it could be added in (fairly long) sections titled "Debug Output" in 
> >metrop.Rd
> >and temper.Rd or it could be put in a package help page (although that's 
> >not
> >what that kind of page is really for).  Any other possibilities to 
> >consider?
> 
> I think writing it up in a vignette would probably be most appropriate. 
>  You can link directly to a vignette from a man page (though not, 
> unfortunately, vice versa).  For example, if you look at
> package?grid, you'll see a list that was generated by this code:
> 
> Further information is available in the following
> \link{vignettes}:
> \tabular{ll}{
> \code{grid} \tab Introduction to \code{grid} (\url{../doc/grid.pdf})\cr
> \code{displaylist} \tab Display Lists in \code{grid} 
> (\url{../doc/displaylist.pdf})\cr

So I decided to follow your advice: mcmc_0.7-3.tar.gz, just uploaded to CRAN,
has such a vignette and such links in the appropriate Rd files.

Thanks

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] location of Sweave.sty in R devel (2010-06-15 r52280)

2010-06-15 Thread Charles Geyer
Sorry if I was supposed to file a bug report, but I don't know whether
we're supposed to do that on R devel.  I just built R devel from source
(2010-06-15 r52280) and tried to check a package with it and Sweave
failed on the vignette.  It puts the line

\usepackage{/HOME/faculty/charlie/local/devel/lib64/R/share/texmf/Sweave}

in the *.tex file, but that's not where Sweave.sty is:

oak$ find ~/local/devel -name Sweave.sty
/HOME/faculty/charlie/local/devel/lib64/R/share/texmf/tex/latex/Sweave.sty

All I did was configure; make; make install, and the only configure
flags were --prefix=/HOME/faculty/charlie/local/devel
and --with-valgrind-instrumentation=2.  I don't see why either would mess
up Sweave.

On a linux box (openSUSE 11.1 (x86_64)).

The day before I built R-2.11.1 the same way and it works fine.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] location of Sweave.sty in R devel (2010-06-15 r52280)

2010-06-15 Thread Charles Geyer
On Tue, Jun 15, 2010 at 11:23:09PM +0100, Prof Brian Ripley wrote:
> That is not the default behaviour of Sweave, which is to use 
> \usepackage{Sweave}.  Do you have SWEAVE_STYLEPATH_DEFAULT set, or 
> some other non-standard setting that would cause this?  (That is even 
> less needed than before, and it may be removed before 2.12.0 is 
> released.)

oak$ ~/local/devel/bin/R --vanilla --quiet
> Sys.getenv()["SWEAVE_STYLEPATH_DEFAULT"]
SWEAVE_STYLEPATH_DEFAULT
  "TRUE"

Sorry.  I forgot about that.

I should take that out of my .bashrc I suppose?  Done.
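
In case anyone else trips over this, a quick sketch of checking and clearing
the variable inside a running session (the permanent fix is removing the
export from .bashrc):

 Sys.getenv("SWEAVE_STYLEPATH_DEFAULT")    # "TRUE" if it is set
 Sys.unsetenv("SWEAVE_STYLEPATH_DEFAULT")  # clear it for this session only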

I apologize for bothering the list about a stupid luser problem.

> 
> All of CRAN and BioC have been checked in the last 24 hours with 
> standard settings, so this does look like a problem with some 
> non-standard setup.
> 
> If we can reproduce it we can address the problem, but there is 
> nothing here that I can do more than guess at.
> 
> On Tue, 15 Jun 2010, Charles Geyer wrote:
> 
> >Sorry if I was supposed to file a bug report, but I don't know whether
> >we're supposed to do that on R devel.  I just built R devel from source
> >(2010-06-15 r52280) and tried to check a package with it and Sweave
> >failed on the vignette.  It puts the line
> >
> >\usepackage{/HOME/faculty/charlie/local/devel/lib64/R/share/texmf/Sweave}
> >
> >in the *.tex file but, that's not where Sweave.sty is
> >
> >oak$ find ~/local/devel -name Sweave.sty
> >/HOME/faculty/charlie/local/devel/lib64/R/share/texmf/tex/latex/Sweave.sty
> >
> >All I did was configure; make; make install and the only configure
> >flags were --prefix=/HOME/faculty/charlie/local/devel
> >and --with-valgrind-instrumentation=2   Don't see why either would mess
> >up Sweave.
> >
> >On a linux box (openSUSE 11.1 (x86_64)).
> >
> >The day before I built R-2.11.1 the same way and it works fine.
> >
> >-- 
> >Charles Geyer
> >Professor, School of Statistics
> >University of Minnesota
> >char...@stat.umn.edu
> >
> >__
> >R-devel@r-project.org mailing list
> >https://stat.ethz.ch/mailman/listinfo/r-devel
> >
> 
> -- 
> Brian D. Ripley,  rip...@stats.ox.ac.uk
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford, Tel:  +44 1865 272861 (self)
> 1 South Parks Road, +44 1865 272866 (PA)
> Oxford OX1 3TG, UKFax:  +44 1865 272595

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] new TR about when MLE exist in GLM and what to do if not

2008-09-29 Thread Charles Geyer
This is just to alert the list about a new technical report

http://www.stat.umn.edu/geyer/gdor/

that shows how to determine, using exact infinite-precision rational arithmetic,
when the MLE exists in a GLM (or for that matter in any exponential family,
such as aster models -- contributed package aster) and what to do when it
does not.

This suggests many changes to glm, glm.fit, predict.glm, and anova.glm.
Before any such changes could be implemented, the contributed package
rcdd would need to be made recommended, which in turn would require that
it be made available on Macintosh (the problem is getting the GNU bignum
library GMP to compile with gcc, which shouldn't be hard but apparently is
on Macs; Jan de Leeuw has done it, so it's not impossible).

The tech report shows how correct data analysis is done in GLMs when
the MLE doesn't exist.  I haven't thought about how this could be worked
into the glm suite of functions, but it clearly could be done.  I would
be glad to help out any way I can if the core team think this should be done.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
[EMAIL PROTECTED]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Apparant bug in binomial model in GLM (PR#13434)

2009-01-07 Thread Charles Geyer
> > 
> > 
> > There appear to be a bug in the estimation of significance in the binomial 
> > model
> > in GLM. This bug apparently appears when the correlation between two 
> > variables
> > is to strong.
> > 
> > Such as this dummy example
> > c(0,0,0,0,0,1,1,1,1,1)->a
> > a->b
> > m1<-glm(a~b, binomial)
> > summary(m1)
> > 
> > It is sufficient that all 1's correspond to 1's such as this example
> > 
> > c(0,0,0,0,0,1,1,1,1,1)->a
> > c(0,0,0,0,1,1,1,1,1,1)->c
> > m1<-glm(a~c, binomial)
> > summary(m1)
> 
> That's not a bug, just the way things work. When the algorithm diverges,
>  as seen by the huge Std.Error, Wald tests (z) are unreliable. (Notice
> that the log OR in an a vs. c table is infinite whichever way you turn
> it.) The likelihood ratio test (as in drop1(m1, test="Chisq")) is
> somewhat less unreliable, but in these small examples, still quite some
> distance from the table based approaches of fisher.test(a,c) and
> chisq.test(a,c).
> 
> 
> > 
> > I hope that this message is understandable. 
> > 
> > Kind regards, Søren
> > 
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> 
> 
> -- 
> >O__   Peter Dalgaard Øster Farimagsgade 5, Entr.B
>   c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
>  (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
> ~~ - (p.dalga...@biostat.ku.dk)  FAX: (+45) 35327907

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Apparant bug in binomial model in GLM (PR#13434)

2009-01-07 Thread Charles Geyer
On Wed, Jan 07, 2009 at 04:48:03PM +0100, Peter Dalgaard wrote:
> Charles Geyer wrote:
> ...
> > BTW the particular example given doesn't make clear WHAT question cannot
> > be answered correctly.  Some questions can be without fuss, for example
> > 
> >> y <- c(0,0,0,0,0,1,1,1,1,1)
> >> x <- seq(along = y)
> >> out1 <- glm(y ~ x, family = binomial)
> > Warning messages:
> > 1: In glm.fit(x = X, y = Y, weights = weights, start = start, etastart = 
> > etastart,  :
> >   algorithm did not converge
> > 2: In glm.fit(x = X, y = Y, weights = weights, start = start, etastart = 
> > etastart,  :
> >   fitted probabilities numerically 0 or 1 occurred
> >> out0 <- glm(y ~ 1, family = binomial)
> >> anova(out0, out1, test = "Chisq")
> > Analysis of Deviance Table
> > 
> > Model 1: y ~ 1
> > Model 2: y ~ x
> >   Resid. Df Resid. Dev Df Deviance P(>|Chi|)
> > 1         9    13.8629
> > 2         8  7.865e-10  1  13.8629    0.0002
> > 
> > This P-value (P = 0.0002) is valid, because the MLE does exist for the null
> > hypothesis.  Hence we see that we have to use the model y ~ x for which
> > the MLE does not exist in the conventional sense.
> 
> 
> It may be  valid in some senses, but I can't help notice that it is off
> by a factor of at least 10, since the experiment has only 1024 outcomes,
> two of which are as extreme as the one observed, and where all outcomes
> are equally likely under the corresponding y~1 model.

It's valid in the sense that all of the P-values R produces for GLM are valid.
It's an asymptotic approximation.  As with all asymptotic approximations,
at best only the absolute error is small, not the relative error.  This
is no worse than any other P-value produced by anova.glm.  And the conclusion
that P < 0.05 or P < 0.01 is correct.

You already know all this, so why the e-mail?

Worst case, a P-value produced by anova.glm can be very questionable when
"n" is too small.  With n = 10 here, it's amazing that it does as well as it
does.  That's because the MLE in the null hypothesis says all of the
response variables have the symmetric binomial distribution, and the CLT does
work, more or less, down to n = 10 for the symmetric binomial distribution.

If you don't trust the asymptotics, then do a parametric bootstrap.  That's
trivial in R.
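
For concreteness, here is a rough sketch (not carefully checked, just an
illustration) of such a parametric bootstrap for the n = 10 example above:

 set.seed(42)
 y <- c(0,0,0,0,0,1,1,1,1,1)
 x <- seq(along = y)
 out1 <- suppressWarnings(glm(y ~ x, family = binomial))
 out0 <- glm(y ~ 1, family = binomial)
 dev.obs <- deviance(out0) - deviance(out1)
 p.hat <- fitted(out0)   # null MLE: all probabilities 1/2
 nboot <- 999
 dev.star <- double(nboot)
 for (i in 1:nboot) {
     y.star <- rbinom(length(y), 1, p.hat)
     d1 <- suppressWarnings(glm(y.star ~ x, family = binomial))
     d0 <- suppressWarnings(glm(y.star ~ 1, family = binomial))
     dev.star[i] <- deviance(d0) - deviance(d1)
 }
 mean(c(dev.obs, dev.star) >= dev.obs)   # parametric bootstrap P-value
 ## compare: the exact calculation 2 / 2^10 = 0.00195 and the asymptotic 0.0002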

My point wasn't about the validity of asymptotics.  My point was that either
the asymptotic test done by anova.glm or the parametric bootstrap makes sense
only when the MLE exists for the null hypothesis.  Otherwise one has to follow
the procedures in my tech report.

-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota
char...@stat.umn.edu

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] documenting datasets with more than one object

2009-03-21 Thread Charles Geyer
I am trying to put an rda (R save image) file that contains multiple R
objects in a contributed package.  From the example of the BJsales data
in package "datasets" it seems this is o. k.  But I am puzzled by some
of what "Writing R Extensions" says.

Section 1.1.3 mentions datalist files.  Do I have to have one in this case?

Section 2.1.2 says

The \usage entry is always bar or (for packages which do not use
lazy-loading of data) data(bar). (In particular, only document a
single data object per Rd file.) 

What does that mean?  I have to describe all of the objects in the file,
but I only "document" one of them?  Is the Rd file for BJsales just wrong?

What if the rda file is sim.rda and so data(sim) is used to load it, but
"sim" is not the name of one of the R objects in rda file?

Is there a better example for how to do this?

My package passes R CMD check but I don't want to ship it to CRAN if it
is doing something considered harmful.
-- 
Charles Geyer
Professor, School of Statistics
University of Minnesota

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel