Re: Is postorder tree traversal possible with recursive CTE's?

2018-06-19 Thread Asif Ali
how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day

Bye


From: Paul Jungwirth 
Sent: Wednesday, June 20, 2018 2:31 AM
To: pgsql-general@lists.postgresql.org
Subject: Re: Is postorder tree traversal possible with recursive CTE's?

On 06/19/2018 02:05 PM, Alban Hertroys wrote:
> On the more theoretical front: The question remains whether it is possible to 
> calculate fields in post-order tree traversal. I think that would be a 
> semantically proper way to express this type of problem and it wouldn't need 
> the kinds of pre/post-processing that after-the-fact aggregation (like in 
> above solution) requires. So, leaner, and probably faster.
> That implies that the SQL committee thought of the possibility in the first 
> place though, which I'm beginning to doubt...

If this interests you, you might enjoy this StackOverflow question:

https://stackoverflow.com/questions/35956486/generate-nested-json-with-couting-in-postgresql

Briefly, how do you construct a nested JSON structure from a recursive
CTE? The only answers at that link rely on plpgsql, but of course that
is cheating. :-) I took a stab at it a couple years ago but couldn't
figure it out, and it seemed like post-order processing was exactly the
missing piece.

If anyone has any ideas I'd be intrigued to hear them!
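In the meantime, the usual workaround is to let the recursive CTE do the traversal and assemble the nested JSON client-side, where the post-order step is easy. A minimal sketch (the `categories` table, its columns, and sqlite3 standing in for Postgres are all illustrative assumptions, not the pure-SQL answer the question asks for):

```python
import json
import sqlite3

# Hypothetical adjacency-list table; any engine with recursive CTEs works.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE categories (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT);
    INSERT INTO categories VALUES
        (1, NULL, 'root'), (2, 1, 'fruit'), (3, 1, 'veg'), (4, 2, 'apple');
""")

# The recursive CTE walks the tree top-down; pre-order is all SQL gives us.
rows = conn.execute("""
    WITH RECURSIVE tree(id, parent_id, name) AS (
        SELECT id, parent_id, name FROM categories WHERE parent_id IS NULL
        UNION ALL
        SELECT c.id, c.parent_id, c.name
          FROM categories c JOIN tree t ON c.parent_id = t.id
    )
    SELECT id, parent_id, name FROM tree
""").fetchall()

# The post-order part happens client-side: children are attached to their
# parents only after all rows are known -- exactly the step plain SQL lacks.
nodes = {id_: {"name": name, "children": []} for id_, _, name in rows}
root = None
for id_, parent_id, _ in rows:
    if parent_id is None:
        root = nodes[id_]
    else:
        nodes[parent_id]["children"].append(nodes[id_])

print(json.dumps(root))
```

Doing that last loop inside the CTE itself is the part that still seems to need post-order support.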

--
Paul  ~{:-)
p...@illuminatedcomputing.com



Re: Load data from a csv file without using COPY

2018-06-19 Thread Asif Ali


how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day


From: Tim Cross 
Sent: Wednesday, June 20, 2018 2:59 AM
To: Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY


Ravi Krishna  writes:

> In order to test a real life scenario (and use it for benchmarking) I want to 
> load large number of data from csv files.
> The requirement is that the load should happen like an application writing to 
> the database ( that is, no COPY command).
> Is there a tool which can do the job.  Basically parse the csv file and 
> insert it to the database row by row.
>

Not clear what you mean by 'real world scenario', but you could possibly
use PG's foreign table support: define the csv file as a foreign table
and then have scripts which read from there and do whatever
insert/update etc. you need. However, this has a high level of 'fakery'
going on and is probably not testing what you really want.

There are lots of ways that applications write to the database -
different drivers (e.g. jdbc, odbc, pg etc), different commit
and transaction strategies and even different ways to handle things like
an update or insert process. You can even use streams and copy from an
application.

To get 'real world' equivalence, you really need to use the same
interface as the application you are comparing. Most languages have
support for processing CSV files, so you may be better off writing a
small 'wrapper' app which uses the same drivers and assuming your
database connectivity has been abstracted into some sort of
module/library/class, use the same interface to write to the database
that the application uses.
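A minimal version of such a wrapper might look like the sketch below, with Python's csv module and the stdlib sqlite3 driver standing in for whatever driver the real application uses (psycopg2, JDBC, etc.); the table name, columns, and in-memory CSV are made up for illustration:

```python
import csv
import io
import sqlite3

# Stand-in for the application's real connection (e.g. psycopg2.connect(...)).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")

# In practice this would be open('data.csv'); an in-memory file keeps the
# sketch self-contained.
csv_file = io.StringIO("id,name\n1,alice\n2,bob\n")

# Row-by-row inserts through the driver, the way an application issues them.
# Parameter placeholders ('?' here, '%s' for psycopg2) keep quoting safe.
reader = csv.DictReader(csv_file)
with conn:  # one transaction; the commit strategy is part of what you benchmark
    for row in reader:
        conn.execute("INSERT INTO accounts (id, name) VALUES (?, ?)",
                     (row["id"], row["name"]))

count = conn.execute("SELECT count(*) FROM accounts").fetchone()[0]
print(count)
```

Swapping the connection line for the application's own driver and connection settings is what makes the benchmark representative.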

Tim
--
Tim Cross



Re: Load data from a csv file without using COPY

2018-06-19 Thread Asif Ali
how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day

Bye


From: Alban Hertroys 
Sent: Wednesday, June 20, 2018 2:10 AM
To: Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY


> On 19 Jun 2018, at 22:16, Ravi Krishna  wrote:
>
> In order to test a real life scenario (and use it for benchmarking) I want to 
> load large number of data from csv files.
> The requirement is that the load should happen like an application writing to 
> the database ( that is, no COPY command).
> Is there a tool which can do the job.  Basically parse the csv file and 
> insert it to the database row by row.
>
> thanks

I think an easy approach would be to COPY the CSV files into a separate
database using psql's \copy command and then pg_dump that as separate insert
statements with pg_dump --inserts.
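Spelled out, that two-step approach looks roughly like this (database, table, and file names are placeholders; it assumes a running PostgreSQL server):

```shell
# 1. Load the CSV into a scratch database with COPY (via psql's \copy).
createdb scratch
psql scratch -c "CREATE TABLE t (id int, name text)"
psql scratch -c "\copy t FROM 'data.csv' WITH (FORMAT csv, HEADER)"

# 2. Dump the table back out as individual INSERT statements.
pg_dump scratch --table=t --data-only --inserts > inserts.sql
```

Replaying inserts.sql through the application's driver then exercises the row-by-row path.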

Alban Hertroys
--
If you can't see the forest for the trees,
cut the trees and you'll find there is no forest.




Re: Is postorder tree traversal possible with recursive CTE's?

2018-06-19 Thread Asif Ali
how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day

Bye


From: Alban Hertroys 
Sent: Wednesday, June 20, 2018 2:05 AM
To: Hellmuth Vargas
Cc: pgsql-gene...@postgresql.org
Subject: Re: Is postorder tree traversal possible with recursive CTE's?


> On 19 Jun 2018, at 21:14, Hellmuth Vargas  wrote:
>
>
> Hi
>
> with partial sum:
>
> with recursive pizza (name, step, ingredient, quantity, unit, rel_qty, path, weight)
> as (
> select
>         name, step, ingredient, quantity, unit
> ,       quantity::numeric(10,2)
> ,       step::text
> ,       case when unit = 'g' then quantity::numeric(10,2) else null end
>   from recipe
>  where name = 'pizza'
> union all
> select
>         recipe.name, recipe.step, recipe.ingredient, recipe.quantity, recipe.unit
> ,       (pizza.rel_qty * recipe.quantity)::numeric(10,2)
> ,       pizza.path || '.' || recipe.step
> ,       case when recipe.unit = 'g' then (pizza.rel_qty * recipe.quantity)::numeric(10,2) else null end
>   from pizza
>   join recipe on (recipe.name = pizza.ingredient)
> )
> select path, ingredient, quantity, rel_qty, unit, weight,
>        sum(weight) over(partition by split_part(path,'.',1)) as parcial_weight,
>        sum(weight) over() as total_weight
>   from pizza
>  order by path;
>
>  path  |  ingredient  | quantity | rel_qty | unit  | weight | parcial_weight | total_weight
> -------+--------------+----------+---------+-------+--------+----------------+--------------
>  1     | tomato sauce |     1.00 |    1.00 | pcs   |        |         113.00 |       313.00
>  1.1   | tomato       |   100.00 |  100.00 | g     | 100.00 |         113.00 |       313.00
>  1.2   | basil        |    10.00 |   10.00 | g     |  10.00 |         113.00 |       313.00
>  1.3   | salt         |     3.00 |    3.00 | g     |   3.00 |         113.00 |       313.00
>  2     | pizza bottom |     1.00 |    1.00 | pcs   |        |         200.00 |       313.00
>  2.2   | dough        |     1.00 |    1.00 | pcs   |        |         200.00 |       313.00
>  2.2.1 | flour        |   150.00 |  150.00 | g     | 150.00 |         200.00 |       313.00
>  2.2.2 | water        |    50.00 |   50.00 | g     |  50.00 |         200.00 |       313.00
>  2.2.3 | salt         |     1.00 |    1.00 | pinch |        |         200.00 |       313.00
> (9 rows)

That is certainly an interesting solution, and it raises the question whether a
text field ('path') is actually the right representation of the hierarchy (some
type of array would seem to be a better fit). Good out-of-the-box thinking!
This is probably usable for my actual case, so thanks for that; I wouldn't have
thought of it myself (even though I already had all the right "bits" in place!).

On the more theoretical front: The question remains whether it is possible to 
calculate fields in post-order tree traversal. I think that would be a 
semantically proper way to express this type of problem and it wouldn't need 
the kinds of pre/post-processing that after-the-fact aggregation (like in above 
solution) requires. So, leaner, and probably faster.
That implies that the SQL committee thought of the possibility in the first 
place though, which I'm beginning to doubt...


> On Tue, Jun 19, 2018 at 11:49, Hellmuth Vargas (hiv...@gmail.com)
> wrote:
> Hi
>
> with recursive pizza (name, step, ingredient, quantity, unit, rel_qty, path, weight)
> as (
> select
>         name, step, ingredient, quantity, unit
> ,       quantity::numeric(10,2)
> ,       step::text
> ,       case when unit = 'g' then quantity::numeric(10,2) else null end
>   from recipe
>  where name = 'pizza'
> union all
> select
>         recipe.name, recipe.step, recipe.ingredient, recipe.quantity, recipe.unit
> ,       (pizza.rel_qty * recipe.quantity)::numeric(10,2)
> ,       pizza.path || '.' || recipe.step
> ,       case when recipe.unit = 'g' then (pizza.rel_qty * recipe.quantity)::numeric(10,2) else null end
>   from pizza
>   join recipe on (recipe.name = pizza.ingredient)
> )
> select path, ingredient, quantity, rel_qty, unit, weight, sum(weight) over() as total_weight
>   from pizza
>  order by path;
>
>  path  |  ingredient  | quantity | rel_qty | unit  | weight | total_weight
> ---+--+--+-+---++--
>  1 | tomato sauce | 1.00 |1.00 | pcs   ||   313.00
>  1.1   | tomato   |   100.00 |  100.00 | g | 100.00 |   313.00
>  1.2   | basil|10.00 |   10.00 | g |  10.00 |   313.00
>  1.3   | salt | 3.00 |3.00 | g |   3.00 |   313.00
>  2 | pizza bottom | 1.00 |1.00 | pcs   || 

Re: Run Stored procedure - function from VBA

2018-06-19 Thread Asif Ali
how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day

Bye


From: Rob Sargent 
Sent: Wednesday, June 20, 2018 12:54 AM
To: Łukasz Jarych
Cc: pgsql-gene...@postgresql.org
Subject: Re: Run Stored procedure - function from VBA



On 06/18/2018 09:51 PM, Łukasz Jarych wrote:
Thank you Rob,

the question is: is this the optimal way to run a SP from VBA?
Or not?

Best,
Jacek

2018-06-19 1:34 GMT+02:00 Rob Sargent <robjsarg...@gmail.com>:


On Jun 18, 2018, at 9:47 AM, Łukasz Jarych <jarys...@gmail.com> wrote:

Hi Guys,

i have example function :

CREATE OR REPLACE FUNCTION totalRecords ()
RETURNS integer AS $total$
declare
    total integer;
BEGIN
    SELECT count(*) into total FROM COMPANY;
    RETURN total;
END;
$total$ LANGUAGE plpgsql;

and i want to run it from VBA using odbc connection.

What is the best way to use it ?

something like this:


Dim dbCon As New ADODB.Connection
Dim rst As New ADODB.Recordset

dbCon.ConnectionString = "Your connection string goes here!"
dbCon.Open

rst.Open strsql, dbCon

where strsql is "Select * from totalRecords" or is this not a good solution?

Best,
Jacek

You need the parentheses after the function name: “select * from 
totalrecords();"



Depends on the usage pattern.  I'm sure there is an ODBC construct for stored
procedures/functions, which you could build once and re-use with new parameter
values if you're going to call this repeatedly.


Re: Load data from a csv file without using COPY

2018-06-19 Thread Asif Ali
please just tell me the site i will do it right away and i have marked it 
junked so many times , i will keep spamming it until my email address is 
removed from the list

Bye


From: James Keener 
Sent: Wednesday, June 20, 2018 3:11 AM
To: pgsql-general@lists.postgresql.org; Asif Ali; Alban Hertroys; Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY

Seriously, stop spamming the list and stop cursing and acting like a petulant 
child. Go to the site and unsubscribe or use a mail client that understands the 
standard list headers.

On June 19, 2018 6:06:59 PM EDT, Asif Ali  wrote:
how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day

Bye


From: Alban Hertroys 
Sent: Wednesday, June 20, 2018 2:10 AM
To: Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY


> On 19 Jun 2018, at 22:16, Ravi Krishna  wrote:
>
> In order to test a real life scenario (and use it for benchmarking) I want to 
> load large number of data from csv files.
> The requirement is that the load should happen like an application writing to 
> the database ( that is, no COPY command).
> Is there a tool which can do the job.  Basically parse the csv file and 
> insert it to the database row by row.
>
> thanks

I think an easy approach would be to COPY the CSV files into a separate
database using psql's \copy command and then pg_dump that as separate insert
statements with pg_dump --inserts.

Alban Hertroys
--
If you can't see the forest for the trees,
cut the trees and you'll find there is no forest.



--
Sent from my Android device with K-9 Mail. Please excuse my brevity.


Re: Load data from a csv file without using COPY

2018-06-19 Thread Asif Ali
just tell me the site , i dont have time to waste on shitty things , i will 
program a spammer to send email to this list

Bye


From: James Keener 
Sent: Wednesday, June 20, 2018 3:16 AM
To: Asif Ali; pgsql-general@lists.postgresql.org; Alban Hertroys; Ravi Krishna
Subject: Re: Load data from a csv file without using COPY

It's people like you who make spam filters worse for the rest of us to the 
point they need to be checked daily for false positives. I'm sure you could 
have found it in less time than it took to spam the list with obscenities.

On June 19, 2018 6:13:49 PM EDT, Asif Ali  wrote:
please just tell me the site i will do it right away and i have marked it 
junked so many times , i will keep spamming it until my email address is 
removed from the list

Bye


From: James Keener 
Sent: Wednesday, June 20, 2018 3:11 AM
To: pgsql-general@lists.postgresql.org; Asif Ali; Alban Hertroys; Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY

Seriously, stop spamming the list and stop cursing and acting like a petulant 
child. Go to the site and unsubscribe or use a mail client that understands the 
standard list headers.

On June 19, 2018 6:06:59 PM EDT, Asif Ali  wrote:
how the fuck i unsubscribe to this mailing list , i get more than 100 emails a 
day

Bye


From: Alban Hertroys 
Sent: Wednesday, June 20, 2018 2:10 AM
To: Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY


> On 19 Jun 2018, at 22:16, Ravi Krishna  wrote:
>
> In order to test a real life scenario (and use it for benchmarking) I want to 
> load large number of data from csv files.
> The requirement is that the load should happen like an application writing to 
> the database ( that is, no COPY command).
> Is there a tool which can do the job.  Basically parse the csv file and 
> insert it to the database row by row.
>
> thanks

I think an easy approach would be to COPY the CSV files into a separate
database using psql's \copy command and then pg_dump that as separate insert
statements with pg_dump --inserts.

Alban Hertroys
--
If you can't see the forest for the trees,
cut the trees and you'll find there is no forest.



--
Sent from my Android device with K-9 Mail. Please excuse my brevity.

