Wojciech Gryc wrote:
> Hi,
>
> I'm currently working with data that has values as large as 99,000,000
> but is accurate to 6 decimal places. Unfortunately, when I load the
> data using read.table(), it rounds everything to the nearest integer.
> Is there any way for me to preserve the information or work with
> arbitrarily large floating point numbers?
>
> Thank you,
> Wojciech
>
>   
Are you sure?

To my knowledge, read.table doesn't round anything, except when it runs
out of bits to store the values, and 14 significant digits (8 before the
decimal point plus 6 after) fit comfortably in an ordinary double
precision variable, which carries about 15-16.
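
For instance, with a made-up value of the sort you describe:

  x <- as.numeric("99000000.123456")   # 14 significant digits
  sprintf("%.6f", x)                   # "99000000.123456" -- nothing lost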

Printing the result is another matter. Try playing with
print(mydata, digits=15) and the like.
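
A small self-contained example (using a throwaway file, just to
illustrate the point):

  ## write a tiny file with values like yours, then read it back
  writeLines(c("value", "99000000.123456", "12345678.654321"), "tmp.txt")
  mydata <- read.table("tmp.txt", header = TRUE)

  print(mydata)               # default of ~7 significant digits: looks rounded
  print(mydata, digits = 15)  # the full precision was there all along
  options(digits = 15)        # or raise the default for the whole session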

-- 
   O__  ---- Peter Dalgaard             Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics     PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark          Ph:  (+45) 35327918
~~~~~~~~~~ - ([EMAIL PROTECTED])                  FAX: (+45) 35327907
