Another way to avoid multi-line query issues is to use CSV logging, added to PostgreSQL in version 8.3. This writes every statement in a delimited format that clearly marks line breaks, so multi-line queries can be imported into analysis tools without a problem. You can even import these log files into the database itself for analysis; the documentation at http://www.postgresql.org/docs/current/static/runtime-config-logging.html shows a sample table for that purpose in the Using CSV-format log output section.
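The exact column list for that table changes between server versions (application_name, for example, only appears as of 9.0), so treat the following as a rough sketch of the 9.0-era definition and copy the exact one from the documentation matching your release:
CREATE TABLE postgres_log
(
    log_time timestamp(3) with time zone,
    user_name text,
    database_name text,
    process_id integer,
    connection_from text,
    session_id text,
    session_line_num bigint,
    command_tag text,
    session_start_time timestamp with time zone,
    virtual_transaction_id text,
    transaction_id bigint,
    error_severity text,
    sql_state_code text,
    message text,
    detail text,
    hint text,
    internal_query text,
    internal_query_pos integer,
    context text,
    query text,
    query_pos integer,
    location text,
    application_name text,
    -- one row per log line; this pair uniquely identifies a line
    PRIMARY KEY (session_id, session_line_num)
);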
To turn on this feature, you need to adjust the log destination and make sure the collector is running:
log_destination = 'csvlog'
logging_collector = on
The server must be completely restarted after this change for it to take effect. Afterwards, the log files saved into the usual directory structure that the collector uses will end with .csv rather than .log. If you followed the documentation for your version to create the postgres_log table, such as the sketch shown earlier, you would import one of these files like this:
postgres=# COPY postgres_log FROM
'/home/postgres/data/pg_log/postgresql-2010-03-28_215404.csv' WITH CSV;
Having all of the log data in the database allows you to write all sorts of queries to analyze your logs. Here's a simple example that shows the first and last commit among the logs imported:
SELECT min(log_time), max(log_time) FROM postgres_log
WHERE command_tag='COMMIT';
You might instead ask how much time had elapsed since the start of the session when each command happened:
SELECT log_time, (log_time - session_start_time) AS elapsed
FROM postgres_log WHERE command_tag='COMMIT';
Using this particular example against an imported pgbench run will show you that pgbench keeps one session open for all the work each client does.
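If you want to confirm that yourself, one way (just a sketch against the postgres_log table imported above) is to group the commits by session:
SELECT session_id, count(*) AS commits,
    min(log_time) AS first_commit,
    max(log_time) AS last_commit
FROM postgres_log
WHERE command_tag='COMMIT'
GROUP BY session_id
ORDER BY commits DESC;
A pgbench import will show only as many sessions as there were clients, each with a large number of commits.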
You can use this log to figure out what statement executed before a given commit too, by looking for entries with the same session_id but a lower session_line_num value. When people run into situations where a COMMIT statement is the one taking a long time, for example because a large queue of CHECK constraints has built up, knowing what happened before the commit is critical information that is otherwise difficult to discover. The CSV format logs make this easy to determine because they contain every bit of session information.
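For example, to see the few log entries leading up to a slow commit, you could run something like the following; the session_id value and line number cutoff here are only placeholders, which you would take from the slow COMMIT's own log entry:
SELECT session_line_num, command_tag, message
FROM postgres_log
WHERE session_id = '4bafc168.52a4'  -- placeholder: session of the slow COMMIT
AND session_line_num < 100          -- placeholder: that COMMIT's line number
ORDER BY session_line_num DESC
LIMIT 5;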
The main disadvantage of the CSV log format is that you're writing out a lot of logging information, and that can be too intensive on exactly the type of system that needs better logging the most: one that is already overloaded by bad queries.