Dear Experts,
I have data spanning 56 years, from 1963 to 2018.
The datetime format is year, day of year (DOY), and hour:
1963 335 0
1963 335 1
1963 335 2
1963 335 3
1963 335 4
1963 335 5
1963 335 6
1963 335 7
1963 335 8
1963 335 9
1996 202 20
1996 202 21
1996 202 22
1996 202 23
1996 203 0
1996 203 1
1996 203 2
1996 20
Hi Ogbos,
Try this:
oodates<-read.table(text="1963 335 0
1963 335 1
1963 335 2
1963 335 3
1963 335 4
1963 335 5
1963 335 6
1963 335 7
1963 335 8
1963 335 9
1996 202 20
1996 202 21
1996 202 22
1996 202 23
1996 203 0
1996 203 1
1996 203 2
1996 203 3
2018 365 20
2018 365 21
2018 365 22
2018 365 23")
Hi
as.Date converts to dates (without hours), which is clearly stated in the
first line of its documentation.
If you want to include the hour you should use ?strptime
Something like
> strptime("1963 335 1", format="%Y %j %H")
[1] "1963-12-01 01:00:00 CET"
You definitely do not need to do it one by one.
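For instance, a vectorised sketch of the same idea, applied to the oodates
data frame from Jim's read.table call (the V1/V2/V3 column names are the
read.table defaults and are assumed here):

# paste year, DOY and hour together and parse all rows at once;
# tz = "UTC" sidesteps daylight-saving gaps in the local timezone
oodates$datetime <- as.POSIXct(paste(oodates$V1, oodates$V2, oodates$V3),
                               format = "%Y %j %H", tz = "UTC")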
Here's an idea:
> as.POSIXct(paste0("1963","-1-1"))+as.difftime(335,units="days") +
> as.difftime(3, units="hours")
[1] "1963-12-02 03:00:00 CET"
However, 2 caveats
(a) I think you need to subtract 1 from the DOY (1 should be Jan 1, right?)
(b) Beware Daylight Savings time:
> as.POSIXct(paste0
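Folding in both caveats, a vectorised sketch of this idea could look like the
following (again assuming the oodates data frame with default V1/V2/V3 column
names; the stamp column name is only for illustration):

# Jan 1 of each year in UTC (sidesteps the DST caveat), then add
# DOY - 1 days (caveat (a)) and the hour
origin <- as.POSIXct(paste0(oodates$V1, "-01-01"), tz = "UTC")
oodates$stamp <- origin +
  as.difftime(oodates$V2 - 1, units = "days") +
  as.difftime(oodates$V3, units = "hours")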
Jim's (and Petr's) solution wins
-pd
> On 23 Jan 2020, at 10:56 , peter dalgaard wrote:
>
> Here's an idea:
>
>> as.POSIXct(paste0("1963","-1-1"))+as.difftime(335,units="days") +
>> as.difftime(3, units="hours")
> [1] "1963-12-02 03:00:00 CET"
>
> However, 2 caveats
>
> (a) I think you
Dear Gurus,
I am so happy to thank you all for your great help!!!
Jim's code was the first that did it.
Thanks again to everyone.
Warmest regards
Ogbos
On Thu, Jan 23, 2020 at 10:42 AM Jim Lemon wrote:
>
> Hi Ogbos,
> Try this:
>
> oodates<-read.table(text="1963 335 0
> 1963 335 1
> 1963 335 2
> 1963 335 3
> 1
Thanks Rui and everyone for your help!
Using `constraint = "increase"` and `lambda = 0.1` did it.
If we ignore the first point, the `lambda = -1` option is also possible: `cobs`
will then automatically choose a lambda value.
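For context, a minimal, hypothetical reconstruction of the kind of call being
discussed (the x and y data below are made up purely for illustration):

library(cobs)
set.seed(1)
x <- seq_len(50)
y <- x / 10 + rnorm(50, sd = 0.3)   # made-up, roughly increasing data
# monotone-increasing fit; lambda = 0.1 fixes the smoothing penalty,
# lambda = -1 would let cobs pick lambda automatically
fit_result <- cobs(x, y, constraint = "increase", lambda = 0.1)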
plot(fit_result)
summary(fit_result)
# COBS smoothing spline
Hello,
I have a data frame which looks like this:
> head(a,20)
rs pvalue
1: rs185642176 0.267407
2: rs184120752 0.787681
3: rs10904045 0.508162
4: rs35849539 0.875910
5: rs141633513 0.787759
6: rs4468273 0.542171
7: rs4567378 0.539484
8: rs7084251 0.126445
9: rs181
Hi Ana,
You seem to be working on an identification or classification problem.
Your sample plot didn't come through, perhaps try converting it to a
PDF or PNG.
I may be missing something, but I can't see how randomly selecting 30
values from almost 4 million is going to mean anything in terms of
st
Hi Jim,
thanks for getting back to me.
Can you please confirm whether you can see the attached plot?
Thanks
Ana
On Thu, Jan 23, 2020 at 8:06 PM Jim Lemon wrote:
>
> Hi Ana,
> You seem to be working on an identification or classification problem.
> Your sample plot didn't come through, perhaps try
Hi Ana,
Yes, this makes more sense: a bar plot of the number of simulations
performed against the proportion of SNPs with q-value < 0.05. I would
expect the axes to be swapped, but my notion that you would want to
know the proportion (DV) for a given number of simulations (IV) may
well be wrong.
Jim
O
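A minimal sketch of that kind of bar plot, with the proportion on the y-axis
as described above; both vectors below are entirely made-up, hypothetical
numbers:

# hypothetical numbers of simulations and the corresponding proportions
n_sims   <- c(100, 500, 1000, 5000)
prop_sig <- c(0.02, 0.03, 0.04, 0.05)
barplot(prop_sig, names.arg = n_sims,
        xlab = "Number of simulations",
        ylab = "Proportion of SNPs with q-value < 0.05")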
Hi, I'm not experienced with R at all. I'm using some canned code to produce
RSA models in the RSA package. I'd like to be able to adjust the alpha for the
a1-a4 confidence intervals.
This doesn't appear to be a native option for the RSA package, unless I'm
missing something because I'm a novice.