Re: Bulk Insert/Update Scenario

2018-01-31 Thread Peter J. Holzer
On 2018-01-04 13:47:49 -0800, Mana M wrote:
> I was thinking about dumping everything into a TEMP table and using that
> as the source for INSERT ... ON CONFLICT. However, I was not sure how
> to get thousands of rows from my Python application into the TEMP table in
> one shot. Or is there any better alternative?
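
Peter's own answer is cut off in this digest. Purely as an illustration of the question he quotes (and not necessarily what he suggested), here is a minimal Python sketch that pushes many rows into a TEMP table in few round trips using psycopg2's execute_values; the connection string and the table and column names (staging, id, payload) are invented:

    # Hypothetical sketch: load thousands of rows into a TEMP table with a
    # handful of multi-row INSERTs instead of one statement per row.
    import psycopg2
    from psycopg2.extras import execute_values

    rows = [(1, 'a'), (2, 'b'), (3, 'c')]      # thousands of tuples in practice

    conn = psycopg2.connect("dbname=test")     # assumed connection string
    with conn, conn.cursor() as cur:
        # The temp table lives for the rest of the session, so a later
        # INSERT ... ON CONFLICT can read from it.
        cur.execute("CREATE TEMP TABLE staging (id int, payload text)")
        execute_values(cur,
                       "INSERT INTO staging (id, payload) VALUES %s",
                       rows, page_size=1000)

execute_values batches the rows into multi-row INSERT statements of page_size rows each, so thousands of rows need only a few statements.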

Re: Bulk Insert/Update Scenario

2018-01-04 Thread Jordan Deitch
INSERT ... ON CONFLICT can be run as a bulk operation:

create table test(id int);
insert into test(id) values (1), (2), (3), (4);

Unless you mean you don't want to run this for every table?

Thanks,
Jordan Deitch
http://id.rsa.pub
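
For what it's worth, ON CONFLICT with a conflict target such as (id) needs a unique index or constraint to act as the arbiter, which the test table above does not have. A small hedged sketch of the same multi-row insert run from Python, with id made a primary key and an assumed connection string:

    # Hedged sketch: Jordan's multi-row insert with an ON CONFLICT clause.
    # ON CONFLICT (id) requires a unique index/constraint on id, hence the
    # primary key here.
    import psycopg2

    conn = psycopg2.connect("dbname=test")     # assumed connection string
    with conn, conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS test (id int PRIMARY KEY)")
        cur.execute("""
            INSERT INTO test (id) VALUES (1), (2), (3), (4)
            ON CONFLICT (id) DO NOTHING
        """)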

Re: Bulk Insert/Update Scenario

2018-01-04 Thread Mana M
Thanks Jordan. One more question I had was - is there any way to avoid doing individual INSERT ... ON CONFLICT statements? I was thinking about dumping everything into a TEMP table and using that as the source for INSERT ... ON CONFLICT. However, I was not sure how to get thousands of rows from my Python application into the TEMP table in one shot. Or is there any better alternative?
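
As a rough illustration of the TEMP-table-as-source idea: the table and column names (staging, target, id, payload) are invented, and staging is assumed to have been populated already (for example via COPY or execute_values, as sketched elsewhere in this thread):

    # Hedged sketch: one set-based upsert from a staging table instead of
    # one INSERT ... ON CONFLICT statement per row.
    import psycopg2

    conn = psycopg2.connect("dbname=test")     # assumed connection string
    with conn, conn.cursor() as cur:
        cur.execute("""
            INSERT INTO target (id, payload)
            SELECT id, payload FROM staging
            ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload
        """)

EXCLUDED refers to the row that was proposed for insertion, so rows with an existing id get the new payload while new ids are simply inserted.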

Re: Bulk Insert/Update Scenario

2018-01-04 Thread legrand legrand
Hi,

Check the documentation chapter "Populating a Database"; it explains how to create a dummy table, load it using the COPY command, and then INSERT / UPDATE the target tables (using ON CONFLICT if needed).

You can also investigate:
- file_fdw
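
A minimal sketch of that COPY step from Python, assuming psycopg2 and a made-up CSV file and staging table layout:

    # Hedged sketch: stream a local CSV file into a staging table with COPY.
    import psycopg2

    conn = psycopg2.connect("dbname=test")     # assumed connection string
    with conn, conn.cursor() as cur, open("data.csv") as f:
        cur.execute("CREATE TEMP TABLE staging (id int, payload text)")
        cur.copy_expert(
            "COPY staging (id, payload) FROM STDIN WITH (FORMAT csv)", f)

COPY is generally the fastest way to load large files; file_fdw instead exposes a server-side file as a foreign table that can be read with plain SELECTs.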

Re: Bulk Insert/Update Scenario

2018-01-04 Thread Jordan Deitch
Hi Mana,

A starting point would be reading about the batch upsert functionality: https://www.postgresql.org/docs/current/static/sql-insert.html

You would do something like: INSERT INTO table ON CONFLICT update... This operation would be atomic.

You can also look into deferrable constraints such
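
Concretely, the statement shape that docs page describes might look like the following from Python; the table items and columns id, name are invented, and the whole statement either fully succeeds or fully fails:

    # Hedged sketch of a parameterized INSERT ... ON CONFLICT ... DO UPDATE.
    import psycopg2

    conn = psycopg2.connect("dbname=test")     # assumed connection string
    with conn, conn.cursor() as cur:
        cur.execute("""
            INSERT INTO items (id, name) VALUES (%s, %s)
            ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name
        """, (42, 'widget'))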

Bulk Insert/Update Scenario

2018-01-04 Thread Mana M
I am setting up a data processing pipeline and the end result gets stored in Postgres. I have not been a heavy DB user in general and had a question regarding how best to handle a bulk insert/update scenario with Postgres. Here is my use case:

* I get a file with thousands of entries (lines) periodically