On Sun, Dec 13, 2009 at 3:31 AM, Pierre GM wrote:
> On Dec 13, 2009, at 12:11 AM, Robert Ferrell wrote:
>> Have you considered creating a TimeSeries for each data series, and
>> then putting them all together in a dict, keyed by symbol?
>
> That's an idea

As far as I understand, that's what pandas ...
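A minimal sketch of what josef seems to be pointing at, written against
today's pandas API rather than the 2009 one; the symbols, prices, and
staggered start dates are invented purely to show how a dict of
per-symbol Series collapses into one date-aligned table:

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Roughly 5 years of business days, as in the original question.
dates = pd.date_range("2005-01-03", periods=1800, freq="B")

series_by_symbol = {}
for i, symbol in enumerate(["AAA", "BBB", "CCC"]):   # imagine ~1000 symbols
    idx = dates[i * 100:]                            # stagger the histories
    prices = 100 + rng.standard_normal(len(idx)).cumsum()
    series_by_symbol[symbol] = pd.Series(prices, index=idx)

# The DataFrame constructor aligns every Series on a common date index;
# dates a symbol is missing simply become NaN.
frame = pd.DataFrame(series_by_symbol)
print(frame.shape)    # (1800, 3)

Column selection (frame["AAA"]) then gives back the individual series, so
the dict-of-series view and the one-big-table view coexist.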
On Dec 13, 2009, at 12:11 AM, Robert Ferrell wrote:
> Have you considered creating a TimeSeries for each data series, and
> then putting them all together in a dict, keyed by symbol?

That's an idea

> One disadvantage of one big monster numpy array for all the series is
> that not all series may have a full set of 1800 data points ...
Have you considered creating a TimeSeries for each data series, and
then putting them all together in a dict, keyed by symbol?

One disadvantage of one big monster numpy array for all the series is
that not all series may have a full set of 1800 data points. So the
array isn't really nicely ...
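A rough sketch of this dict-of-TimeSeries layout, assuming the
scikits.timeseries package under discussion; load_prices, the symbols,
and the business-day start date are all stand-ins, and the constructor
arguments should be double-checked against the scikits.timeseries docs:

import numpy as np
import scikits.timeseries as ts

np.random.seed(0)

def load_prices(symbol):
    # Hypothetical stand-in for however the per-symbol data really gets
    # loaded; not every symbol has the full 1800 observations.
    n = np.random.randint(1500, 1801)
    return 100 + np.random.randn(n).cumsum()

start = ts.Date('B', year=2005, month=1, day=3)   # business-day frequency

series_by_symbol = {}
for symbol in ['AAA', 'BBB', 'CCC']:              # imagine ~1000 symbols here
    series_by_symbol[symbol] = ts.time_series(load_prices(symbol),
                                              start_date=start)

# Each TimeSeries carries its own DateArray, so ragged lengths are fine,
# and lookup by symbol is an ordinary dict access.
aaa = series_by_symbol['AAA']
print(len(aaa))         # observations available for this symbol
print(aaa.dates[-1])    # last date in this symbol's own calendar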
On Sat, Dec 12, 2009 at 8:08 PM, THOMAS BROWNE wrote:
> Hello all,
>
> Quite new to numpy / timeseries module, please forgive the elementary
> question.
>
> I wish to do a bunch of multivariate analysis on 1000 different
> financial market series, each holding about 1800 data points (5 years
> of daily data).
>
> What's the best way to put this into a ...
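For contrast, a sketch of the "one big array" layout the replies above
are weighing this against, using a plain numpy masked array so that
symbols with shorter histories can be padded and masked; the symbols,
prices, and front-padding scheme are invented for illustration:

import numpy as np

np.random.seed(0)
symbols = ['AAA', 'BBB', 'CCC']   # imagine ~1000 symbols
n_days = 1800                     # ~5 years of daily data

data = np.full((len(symbols), n_days), np.nan)
for row, symbol in enumerate(symbols):
    n_obs = np.random.randint(1500, n_days + 1)   # some histories are shorter
    prices = 100 + np.random.randn(n_obs).cumsum()
    data[row, n_days - n_obs:] = prices           # align on the most recent day

# Masking the missing leading observations lets reductions skip them.
panel = np.ma.masked_invalid(data)
print(panel.shape)            # (3, 1800)
print(panel.count(axis=1))    # observations actually present per symbol

This keeps everything in one 2-D block for fast vectorized work, at the
cost of the alignment bookkeeping that Robert's caveat about incomplete
series points to.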