Everyone,

We are trying to architect a new system that will have to ingest several
large data streams (roughly 200,000 parsed files per second in total) and
store them in a database.  I am trying to figure out the best way to import
data at that rate into Postgres.

I keep thinking I can't be the first to have this problem, and that there
are common solutions, but I can't find any.  Does anyone know of a method,
third-party program, etc., that can accept data from a number of different
sources and push it into Postgres as fast as possible?
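For context, the baseline approach I would otherwise fall back to is
batching parsed records through COPY rather than individual INSERTs.  A
minimal sketch of what I mean, assuming psycopg2 and a hypothetical
"events" table (the table name, columns, and sample records are just
placeholders):

```python
import io

# Placeholder parsed records; in the real system these would come from
# the incoming data streams.
records = [(1, "sensor-a", 0.5), (2, "sensor-b", 0.7)]

def to_copy_buffer(rows):
    """Render rows as tab-separated text, the format COPY ... FROM STDIN expects."""
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(v) for v in row) + "\n")
    buf.seek(0)
    return buf

print(to_copy_buffer(records).read())

# With a live connection (omitted here), the load itself would look like:
# import psycopg2
# conn = psycopg2.connect("dbname=mydb")
# with conn.cursor() as cur:
#     cur.copy_expert("COPY events (id, source, value) FROM STDIN",
#                     to_copy_buffer(records))
# conn.commit()
```

My worry is whether even batched COPY can keep up at this rate, or
whether something should sit in front of Postgres to buffer and fan in
the streams.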

Thanks in advance,

Mark Moellering
