Thanks for highlighting this issue.
I have tested this using .NET 4.8 and .NET Core 3.1 against SQL Server; both
exhibit the same problem.
The best course of action is probably to identify a workaround, based on your
project, until the Microsoft team has time to fix the issue.
It would be good to know: under a Windows environment, is there a way of
compressing the network packets reaching the PostgreSQL server?
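For what it's worth, PostgreSQL's wire protocol has no built-in compression, so the usual workarounds compress at a lower layer instead. A rough sketch of two common options, assuming placeholder host, user, and database names:

```shell
# Workaround 1: tunnel the connection through SSH with -C, which compresses
# all traffic on the tunnel. "user@db-host" and "mydb" are placeholders.
ssh -C -N -L 6432:localhost:5432 user@db-host &
psql "host=localhost port=6432 dbname=mydb"

# Workaround 2: libpq's sslcompression connection parameter. This requires
# SSL plus OpenSSL compression support on both client and server, and is
# deprecated in recent PostgreSQL releases.
psql "host=db-host dbname=mydb sslmode=require sslcompression=1"
```

The SSH tunnel is the more portable choice on Windows (via OpenSSH or PuTTY's equivalent), since it needs no server-side configuration beyond SSH access.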
Many thanks.
Hi Arya,
Probably the easiest solution is to use cloud computing with a dedicated network.
Good luck.
From: Neil
Sent: 2019 July 28 01:33
To: Arya F
Cc: pgsql-general@lists.postgresql.org; farjad.farid; Alvaro Herrera; Tom Lane; Ron
Subject: Re: Hardware for writing/updating 12,000,000
Hi Arya,
It is not clear what the budget is, or why there is so much data. Is this a
real-time system, e.g. 24/7 operation? Even if each row takes up just 50 bytes,
that is a lot of data moving through your CPUs, memory, and hard disks, any one
of which could fail.
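To put a rough number on that, a back-of-envelope calculation using the 50-byte figure above:

```shell
# 12,000,000 rows at roughly 50 bytes each
bytes=$((12000000 * 50))
echo "$bytes bytes"                  # 600000000 bytes
echo "$((bytes / 1024 / 1024)) MiB"  # about 572 MiB per full pass over the data
```

And that is the raw row payload only; indexes, WAL traffic, and tuple overhead would multiply the actual I/O several times over.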
Personally I would recommend analyzing t
With this kind of design requirement, it is worth considering hardware failure
and recovery. Even SSDs can and do fail.
It is not just a matter of speed: RAID disks of some kind, depending on the
budget, are worth the effort.
-----Original Message-----
From: Alvaro Herrera
Sent: 2019
I have no problem with a code of conduct. After all, most people agree that
even a mundane thing like crossing a road needs rules,
so something as complex as human interaction also needs rules.
That's my two cents' worth of contribution.
Best Regards
Farjad Farid