Re: [PERFORM] POWA doesn't show queries executed

2017-11-21 Thread phb07

Hi,
You should probably report your issue at 
https://github.com/dalibo/powa/issues

KR

On 18/11/2017 at 02:52, Neto pr wrote:

Dear all

I have successfully installed POWA (http://dalibo.github.io/powa), 
including all required extensions; see the screenshots of it running 
at the end of this email.


But when executing queries at the psql command line, these queries are 
not monitored by POWA. I have checked that only PostgreSQL internal 
catalog queries are shown.
I need the Optimize Query functionality, and mainly the index 
suggestions. But that does not work: clicking the optimize query 
option returns zero suggestions.


See below: I created a scenario with a table containing a large amount 
of data to check whether the tool would suggest an index, and when 
running a complex query, no index is suggested.


Does anyone who uses POWA know whether something has to be configured 
so that queries are monitored and suggestions are shown?


-------- Screenshots of my environment, partially working: --------


https://sites.google.com/site/eletrolareshop/repositorio/powa1.jpeg
https://sites.google.com/site/eletrolareshop/repositorio/powa2.jpeg
https://sites.google.com/site/eletrolareshop/repositorio/powa3.jpeg

-------- scenario to verify index suggestions --------

postgres=# create table city_habitant (number_habitant text);
CREATE TABLE
postgres=# insert into city_habitant (number_habitant) select 'São Paulo' from (select generate_series(1, 400)) a;
INSERT 0 400
postgres=# insert into city_habitant (number_habitant) select 'Rio de Janeiro' from (select generate_series(1, 800)) a;
INSERT 0 800
postgres=# insert into city_habitant (number_habitant) select 'Recife' from (select generate_series(1, 600)) a;
INSERT 0 600
postgres=# insert into city_habitant (number_habitant) select 'Santos' from (select generate_series(1, 200)) a;
INSERT 0 200
postgres=# insert into city_habitant (number_habitant) select 'Chui' from (select generate_series(1, 6)) a;
INSERT 0 6
postgres=# SELECT number_habitant, count(number_habitant) FROM city_habitant GROUP BY number_habitant;

 number_habitant |  count
-----------------+--------
 Rio de Janeiro  |    800
 Recife          |    600
 Santos          |    200
 São Paulo       |    400
 Chui            |      6
(5 rows)
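For what it's worth, POWA only records what pg_stat_statements captures, and the index suggestions come from the pg_qualstats extension. A minimal postgresql.conf sketch of a typical setup (assuming the POWA extensions are installed; the exact list depends on your install, and a server restart is needed afterwards):

```
# postgresql.conf -- minimal sketch; pg_stat_statements ships with PostgreSQL,
# while powa, pg_qualstats and pg_stat_kcache are separate extensions
shared_preload_libraries = 'pg_stat_statements,powa,pg_qualstats,pg_stat_kcache'
pg_stat_statements.track = all      # also capture statements inside functions
powa.frequency = '5min'             # snapshot interval (assumed default)
```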








Re: Performance

2018-02-25 Thread phb07


On 23/02/2018 at 22:20, Andreas Kretschmer wrote:



On 23.02.2018 at 20:29, Daulat Ram wrote:
  We have the following requirements; a single query or any other 
proper solution would do. Please help with this.

- How many sessions are currently open?

ask pg_stat_activity, via select * from pg_stat_activity
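For instance, a couple of simple sketches against that view:

```sql
-- total number of sessions currently connected
SELECT count(*) FROM pg_stat_activity;

-- the same, broken down by state (active, idle, idle in transaction, ...)
SELECT state, count(*) FROM pg_stat_activity GROUP BY state;
```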



- and if open, then how many queries have been executed in that session.


What? There isn't a counter for that, AFAIK.



- and we also have to trace how much time each query takes.


You can use auto_explain for that.


- and we also have to find the cost of each query.


the same, auto_explain

please keep in mind: costs are imaginary.
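As a sketch, auto_explain is typically enabled in postgresql.conf along these lines (the threshold below is an arbitrary example, not a recommendation):

```
shared_preload_libraries = 'auto_explain'
auto_explain.log_min_duration = '250ms'   # log plans of statements slower than this
auto_explain.log_analyze = on             # include actual timings (adds some overhead)
```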


You can also have a look at PoWA : https://github.com/dalibo/powa



Regards, Andreas






Re: Oracle to postgres migration

2019-04-08 Thread phb07


On 08/04/2019 at 14:24, Rick Otten wrote:



On Mon, Apr 8, 2019 at 8:04 AM Julien Rouhaud wrote:

On Mon, Apr 8, 2019 at 1:49 PM Daulat Ram <[email protected]>
wrote:
>
> Please confirm: can we migrate an Oracle 12c database (12.1.0.1.0)
> running on Solaris to PostgreSQL 11.2 on Linux (Ubuntu)? Also,
> please suggest the tools and prerequisites.
A database migration is likely feasible, but might require quite a lot
of work depending on what features you're using, and the amount of PL
code.  Also, obviously migrating the database is only a part of the
overall migration process, as you'll also need to take care of the
application(s), the backup/restore, monitoring and all other tools you
need.

Concerning the database migration, the best tool is probably Gilles
Darold's ora2pg.  The tool also provides a migration cost assessment
report, to evaluate the difficulty of the database migration.  More
information on http://ora2pg.darold.net/
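As an illustration, the assessment and a first export might be run along these lines (assuming ora2pg is installed and config/ora2pg.conf points at the Oracle instance; the output file names are placeholders):

```shell
# migration assessment report with cost estimates
ora2pg -t SHOW_REPORT --estimate_cost -c config/ora2pg.conf

# export the schema DDL, then the table data
ora2pg -t TABLE -o schema.sql -c config/ora2pg.conf
ora2pg -t COPY  -o data.sql   -c config/ora2pg.conf
```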



The last big Oracle to PG migration that I did was several years ago.  
We stood up the PostgreSQL instance(s) and then used SymmetricDS to 
synchronize the Oracle and PG databases.   After tuning and testing 
the postgresql side, we cut over the applications live - with minimal 
downtime - by releasing the updated application code and 
configuration.   If we needed to fail back, it was also pretty easy to 
undo the release and configuration changes.


Another approach you can play with is to leverage Foreign Data 
Wrappers.  In that scenario, you can run queries on your Oracle 
database from within PostgreSQL.  You can use those queries to copy 
data directly into new tables without any interim files, or as a 
hybrid transition while you get the new database set up.
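A minimal sketch of that FDW scenario with the oracle_fdw extension (the server name, connection string, credentials and table names below are all placeholders):

```sql
CREATE EXTENSION oracle_fdw;

CREATE SERVER ora_src FOREIGN DATA WRAPPER oracle_fdw
    OPTIONS (dbserver '//orahost:1521/ORCL');      -- placeholder connect string

CREATE USER MAPPING FOR current_user SERVER ora_src
    OPTIONS (user 'scott', password 'secret');     -- placeholder credentials

-- pull the table definitions across, then copy data with no interim files
CREATE SCHEMA ora_stage;
IMPORT FOREIGN SCHEMA "SCOTT" FROM SERVER ora_src INTO ora_stage;
CREATE TABLE local_emp AS SELECT * FROM ora_stage.emp;
```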


At the time I was working on that migration, we had too many data 
edge cases for ora2pg to be very useful. It has come a long way 
since then. I'm not sure it can do a live cutover, so you may need to 
plan a bit of downtime if you have a lot of data to move into the new 
database.


Note that you will also almost certainly want to use a connection 
pooler such as PgBouncer or Pgpool-II (or both at the same time), so 
be sure to include that in your plans from the beginning.


That said, none of this is on topic for the performance mailing list.  
Please try to direct your questions to the right group next time.



Just a few additional pieces of information.
1) a migration from one DBMS to another must always be led as a project 
(because your data is always important ;-)

2) a migration project always has the following main tasks:
- setting up a proper Postgres platform (with all the software, procedures 
and documentation needed to provide a good PostgreSQL service to your 
applications/clients) (you may already have such a platform).
- migrating the data. This concerns both the structure (DDL) and the 
data content.
- migrating the stored procedures, if any. In Oracle migrations, this is 
often a big part of the project's workload.
- adapting the client application. The effort needed here can be huge or 
... nil, depending on the languages used, whether the data access APIs 
are compatible, and whether an ORM is used.
- when all this has been prepared, a test phase can start. This is very 
often the most costly part of the project, in particular for mission-critical 
databases.

- then, you are ready to switch to Postgres.
3) do not hesitate to invest in education and external professional support.
4) before launching such a project, it is highly recommended to perform 
a preliminary study. For this purpose, as Julien said, ora2pg is a 
big help in analysing the Oracle database content. The cost estimates 
are computed fairly well, which quickly gives you an idea of the 
overall cost of the database migration. For the application side, you may 
also have a look at code2pg.


KR. Philippe.