Does logging a SLOW query impact postgres performance?

by Ratatouille   Last Updated February 13, 2018 14:06 PM


Our production system occasionally reports queries taking longer than usual.

I have already run EXPLAIN ANALYZE against the SQL statement, but nothing in its output explains the longer-than-usual timing.

cprdb=# explain ANALYZE select ror from crns where crn='8000440008' limit 1;
                                                     QUERY PLAN
 ------------------------------------------------------------------------------------------------------------------------
 Limit  (cost=0.29..8.30 rows=1 width=6) (actual time=0.092..0.092 rows=1 loops=1)
   ->  Index Scan using crn_idx on crns  (cost=0.29..8.30 rows=1 width=6) (actual time=0.091..0.091 rows=1 loops=1)
         Index Cond: (crn = '8000440008'::bpchar)
 Planning time: 0.200 ms
 Execution time: 0.132 ms

My next bet is to log PostgreSQL (DML) query times using log_min_duration_statement = 0 and gather conclusive evidence of where the problem lies.
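As a minimal sketch (the 250 ms threshold is an arbitrary example, not a recommendation), the setting can also be changed and reloaded without restarting the server; note that a value of 0 logs every statement, which can generate heavy log I/O on a busy system:

```sql
-- Log only statements slower than 250 ms (example threshold; tune to taste).
-- Setting this to 0 would log every completed statement instead.
ALTER SYSTEM SET log_min_duration_statement = '250ms';

-- Apply the change without a server restart.
SELECT pg_reload_conf();
```

Reverting afterwards is a matter of `ALTER SYSTEM RESET log_min_duration_statement;` followed by another reload.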

Before doing that, I want to cross-check one thing:

Enabling query logging would not degrade overall PostgreSQL performance, right?

Tags: postgresql
