1122

What is the easiest way to save PL/pgSQL output from a PostgreSQL database to a CSV file?

I'm using PostgreSQL 8.4 with pgAdmin III and the PSQL plugin, which is where I run queries from.

1

21 Answers

1638

Do you want the resulting file on the server, or on the client?

Server side

If you want something easy to re-use or automate, you can use PostgreSQL's built-in COPY command, e.g.

Copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',' HEADER;

This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally called "postgres"), because Postgres can't stop it doing nasty things with that machine's local filesystem.

That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.

The crucial part is that your function is there to perform additional checks, not just bypass the security - so you could write a function which exports the exact data you need, or you could write something which can accept various options as long as they meet a strict whitelist. You need to check two things (a sketch follows this list):

  1. Which files should the user be allowed to read/write on disk? This might be a particular directory, for instance, and the filename might have to have a suitable prefix or extension.
  2. Which tables should the user be able to read/write in the database? This would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don’t want to let someone invoke your function and add rows on the end of your “users” table…
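
As a sketch only (the function name, export directory, and table are hypothetical), a SECURITY DEFINER function enforcing both checks might look like this:

CREATE OR REPLACE FUNCTION export_foo(filename text) RETURNS void
LANGUAGE plpgsql SECURITY DEFINER AS $$
BEGIN
    -- Check 1: only simple filenames, always inside one fixed directory
    IF filename !~ '^[A-Za-z0-9_]+\.csv$' THEN
        RAISE EXCEPTION 'invalid filename: %', filename;
    END IF;
    -- Check 2: only this one query is ever exported
    EXECUTE format('COPY (SELECT * FROM foo) TO %L WITH CSV HEADER',
                   '/srv/exports/' || filename);
END;
$$;

Create it as a superuser and GRANT EXECUTE to the users who need it; they can then run SELECT export_foo('test.csv'); without any superuser rights of their own.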

I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.


Client side

The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to, it just spits out the data and the client puts it somewhere.

The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.

The psql command-line client has a special "meta-command" called \copy, which takes all the same options as the "real" COPY, but is run inside the client:

\copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',' HEADER

Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.

From the docs:

Do not confuse COPY with the psql instruction \copy. \copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when \copy is used.

Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.
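
For instance, in Python with psycopg2 (a sketch; the connection string, table, and output path are placeholders), the client-side stream can be captured with copy_expert:

import psycopg2

conn = psycopg2.connect("dbname=mydb")  # placeholder connection details
with conn.cursor() as cur, open("output.csv", "w") as f:
    # COPY ... TO STDOUT streams the rows over the connection;
    # psycopg2 writes that stream into the local file object
    cur.copy_expert("COPY (SELECT * FROM foo) TO STDOUT WITH CSV HEADER", f)
conn.close()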

13
  • 154
    Obviously the above example sometimes requires the user to be a superuser; here's a version for ordinary people ;) echo "COPY (SELECT * from foo) TO STDOUT with CSV HEADER" | psql -o '/tmp/test.csv' database_name Apr 17, 2012 at 17:26
  • 10
    @Drachenfels: \copy works, too -- there, the paths are relative to the client, and no semicolon is needed/allowed. See my edit.
    – krlmlr
    Feb 13, 2013 at 10:12
  • 3
    @IMSoP: How would you add a COPY statement to an sql (on postgres 9.3) function? So the query gets saved to a .csv file?
    – jO.
    Nov 12, 2013 at 21:24
  • 14
    It looks like \copy needs to be a one-liner. So you don't get the beauty of formatting the sql the way you want, and just putting a copy/function around it.
    – isaaclw
    Jan 17, 2014 at 13:49
  • 1
    @AndreSilva As the answer states, \copy is a special meta-command in the psql command-line client. It won't work in other clients, like pgAdmin; they will probably have their own tools, such as graphical wizards, for doing this job.
    – IMSoP
    May 2, 2018 at 17:49
642

There are several solutions:

1 psql command

psql -d dbname -t -A -F"," -c "select * from users" > output.csv

This has the big advantage that you can use it via SSH, like ssh postgres@host command, enabling you to get the resulting CSV onto your local machine in one step.
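
A minimal sketch (the host, database, and table names are placeholders):

ssh postgres@host 'psql -d dbname -t -A -F"," -c "select * from users"' > output.csv

The redirection happens on your side of the SSH connection, so output.csv ends up on your local machine.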

2 postgres copy command

COPY (SELECT * from users) To '/tmp/output.csv' With CSV;

3 psql interactive (or not)

>psql dbname
psql>\f ','
psql>\a
psql>\o '/tmp/output.csv'
psql>SELECT * from users;
psql>\q

All of them can be used in scripts, but I prefer #1.

4 pgAdmin, but that's not scriptable.

17
  • 37
    IMHO the first option is error prone, because it doesn't include proper escaping of commas in exported data.
    – Piohen
    May 6, 2013 at 21:07
  • 4
    Also, psql doesn't quote cell values, so if ANY of your data uses the delimiter, your file will be corrupted.
    – Cerin
    Apr 8, 2014 at 21:39
  • 7
    @Cerin -t is a synonym for --tuples-only (turn off printing of column names and result row count footers, etc.) - omit it to get column headers
    – ic3b3rg
    Jun 5, 2014 at 21:40
  • 25
    Just tested the comma-escaping claim—it’s true, method #1 does not escape commas in values.
    – MrColes
    Sep 17, 2014 at 21:07
  • 1
    also use "\pset footer" so the row counts don't show up in the file May 8, 2018 at 21:20
110

In the terminal (while connected to the db), set output to the csv file

1) Set the field separator to ',':

\f ','

2) Set output format unaligned:

\a

3) Show only tuples:

\t

4) Set output:

\o '/tmp/yourOutputFile.csv'

5) Execute your query:

select * from YOUR_TABLE;

6) Output:

\o

You will then be able to find your csv file in this location:

cd /tmp

Copy it using the scp command or edit using nano:

nano /tmp/yourOutputFile.csv
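
Put together, the whole session looks like this (the path and table name are placeholders):

\f ','
\a
\t
\o '/tmp/yourOutputFile.csv'
select * from YOUR_TABLE;
\o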
6
  • 4
    and \o in order to print to the console again
    – metdos
    Aug 6, 2012 at 14:57
  • 2
    This will not produce a CSV file; it will just record the command output to the text file (which does not make it comma-separated). Nov 29, 2012 at 16:39
  • @RuslanKabalin yes I have just noticed that and amended the instruction to create comma-separated output (csv) Nov 30, 2012 at 11:01
  • 5
    I'd improve this answer by noting that the "csv" output will not be properly escaped and each time a sql command is executed the results are concatenated to the output file. Feb 6, 2014 at 23:50
  • What about newlines in field values? The COPY or \copy approaches handle them correctly (converting to standard CSV format); does this?
    – Wildcard
    Jan 7, 2017 at 4:19
67

CSV Export Unification

This information isn't really well represented. As this is the second time I've needed to derive this, I'll put this here to remind myself if nothing else.

Really the best way to do this (get CSV out of postgres) is to use the COPY ... TO STDOUT command. Though you don't want to do it the way shown in the answers here. The correct way to use the command is:

COPY (select id, name from groups) TO STDOUT WITH CSV HEADER

Remember just one command!

It's great for use over ssh:

$ ssh psqlserver.example.com 'psql -d mydb -c "COPY (select id, name from groups) TO STDOUT WITH CSV HEADER"' > groups.csv

It's great for use inside docker over ssh:

$ ssh pgserver.example.com 'docker exec -tu postgres postgres psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv

It's even great on the local machine:

$ psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv

Or inside docker on the local machine?:

docker exec -tu postgres postgres psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv

Or on a kubernetes cluster, in docker, over HTTPS??:

kubectl exec -t postgres-2592991581-ws2td -- psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER" > groups.csv

So versatile, much commas!

Do you even?

Yes I did, here are my notes:

The COPYses

Using \copy effectively executes file operations on whatever system the psql command is running on, as the user who is executing it. If you connect to a remote server, it's simple to copy data files on the system executing psql to/from the remote server.

COPY executes file operations on the server as the backend process user account (default postgres); file paths and permissions are checked and applied accordingly. If using TO STDOUT, file permission checks are bypassed.

Both of these options require subsequent file movement if psql is not executing on the system where you want the resultant CSV to ultimately reside. This is the most likely case, in my experience, when you mostly work with remote servers.

It is more complex to configure something like a TCP/IP tunnel over ssh to a remote system for simple CSV output, but for other output formats (binary) it may be better to \copy over a tunneled connection, executing a local psql. In a similar vein, for large imports, moving the source file to the server and using COPY is probably the highest-performance option.
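
A rough sketch of that tunneled approach (host names and the local port are placeholders):

ssh -N -L 5433:localhost:5432 user@pgserver.example.com &
psql -h localhost -p 5433 -d mydb -c "\copy groups TO 'groups.csv' WITH CSV HEADER"

Here psql runs locally, so \copy writes the file on your machine while the data travels through the tunnel.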

PSQL Parameters

With psql parameters you can produce output formatted like CSV, but there are downsides, such as having to remember to disable the pager and not getting headers:

$ psql -P pager=off -d mydb -t -A -F',' -c 'select * from groups;'
2,Technician,Test 2,,,t,,0,,                                                                                                                                                                   
3,Truck,1,2017-10-02,,t,,0,,                                                                                                                                                                   
4,Truck,2,2017-10-02,,t,,0,,

Other Tools

No, I just want to get CSV out of my server without compiling and/or installing a tool.

4
  • 2
    Where do the results get saved to ? My query runs but the file doesn't show up anywhere on my computer. This is what I'm doing : COPY (select a,b from c where d = '1') TO STDOUT WITH CSVHEADER > abcd.csv
    – kRazzy R
    Apr 25, 2018 at 17:00
  • 1
    @kRazzyR The output goes to stdout of the psql command, so ultimately whatever you do with stdout is where the data goes. In my examples I use '> file.csv' to redirect to a file. You want to make sure that is outside the command being sent to the server through the psql -c parameter. See the 'local machine' example.
    – joshperry
    Apr 26, 2018 at 2:02
  • 1
    Thanks for the complete explanation. The copy command is hopelessly complex with psql. I usually end up using a free database client (DBeaver Community Edition) to import and export data files. It provides nice mapping and formatting tools. Your answer provides great detailed examples for copying from remote systems. Nov 28, 2019 at 5:44
  • 1
    This is an amazing solution. Thanks a lot.
    – harryghgim
    Sep 17, 2020 at 6:50
52

psql 12 and newer support --csv.

psql - devel

--csv

Switches to CSV (Comma-Separated Values) output mode. This is equivalent to \pset format csv.


csv_fieldsep

Specifies the field separator to be used in CSV output format. If the separator character appears in a field's value, that field is output within double quotes, following standard CSV rules. The default is a comma.

Usage:

psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres

psql -c "SELECT * FROM pg_catalog.pg_tables" --csv -P csv_fieldsep='^'  postgres

psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres > output.csv
40

If you're interested in all the columns of a particular table along with headers, you can use

COPY table TO '/some_destdir/mycsv.csv' WITH CSV HEADER;

This is a tiny bit simpler than

COPY (SELECT * FROM table) TO '/some_destdir/mycsv.csv' WITH CSV HEADER;

which, to the best of my knowledge, is equivalent.

1
  • 1
    If the query is custom (i.e. having column aliases or joining different tables), the header will print out the column aliases just as they display on the screen.
    – Devy
    Nov 13, 2013 at 21:58
28

I had to use the \COPY because I received the error message:

ERROR:  could not open file "/filepath/places.csv" for writing: Permission denied

So I used:

\Copy (Select address, zip  From manjadata) To '/filepath/places.csv' With CSV;

and it works.

1
  • 2
    I had the permission denied error as well. Fixed it by sending to the /tmp folder first. For example: \copy (SELECT * FROM messages) TO '/tmp/messages.csv' With CSV HEADER;
    – Somto
    Jan 1, 2021 at 2:40
24

I'm working on AWS Redshift, which does not support the COPY TO feature.

My BI tool supports tab-delimited CSVs though, so I used the following:

 psql -h dblocation -p port -U user -d dbname -F $'\t' --no-align -c "SELECT * FROM TABLE" > outfile.csv
3
  • Great, thanks! I've used ` psql -h dblocation -p port -U user -d dbname -F $',' --no-align -c "SELECT * FROM TABLE" > outfile.csv` to get CSVs. There's no quoting the fields, but it serves well enough for my purposes Jun 16, 2020 at 18:24
  • FYI, you can configure .pg_service.conf to alias the connection params to like psql service=default -F $'\t' ... . Jul 29, 2021 at 19:40
  • Redshift supports UNLOAD
    – Himanshu
    Dec 13, 2021 at 9:24
23

psql can do this for you:

edd@ron:~$ psql -d beancounter -t -A -F"," \
                -c "select date, symbol, day_close \
                    from stockprices where symbol like 'I%' \
                    and date >= '2009-10-02'"
2009-10-02,IBM,119.02
2009-10-02,IEF,92.77
2009-10-02,IEV,37.05
2009-10-02,IJH,66.18
2009-10-02,IJR,50.33
2009-10-02,ILF,42.24
2009-10-02,INTC,18.97
2009-10-02,IP,21.39
edd@ron:~$

See man psql for help on the options used here.

1
  • 13
    This isn't a true CSV file--watch it burn if there are commas in the data--so using the built-in COPY support is preferred. But this general technique is handy as a quick hack for exporting from Postgres in other delimited formats besides CSV.
    – Greg Smith
    Oct 6, 2009 at 5:19
13

In pgAdmin III there is an option to export to file from the query window. In the main menu it's Query -> Execute to file or there's a button that does the same thing (it's a green triangle with a blue floppy disk as opposed to the plain green triangle which just runs the query). If you're not running the query from the query window then I'd do what IMSoP suggested and use the copy command.

1
  • IMSoP's answer didn't work for me as I needed to be a super admin. This worked a treat. Thanks! Jan 31, 2012 at 22:08
11

I've written a little tool called psql2csv that encapsulates the COPY query TO STDOUT pattern, resulting in proper CSV. Its interface is similar to psql's.

psql2csv [OPTIONS] < QUERY
psql2csv [OPTIONS] QUERY

The query is assumed to be the contents of STDIN, if present, or the last argument. All other arguments are forwarded to psql except for these:

-h, --help           show help, then exit
--encoding=ENCODING  use a different encoding than UTF8 (Excel likes LATIN1)
--no-header          do not output a header
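
For example (the database and query are placeholders; as noted above, unrecognized options are forwarded to psql):

psql2csv -d mydb "select id, name from groups" > groups.csv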
0
10

I tried several things but few of them were able to give me the desired CSV with header details.

Here is what worked for me.

psql -d dbname -U username \
  -c "COPY ( SELECT * FROM TABLE ) TO STDOUT WITH CSV HEADER " > \
  OUTPUT_CSV_FILE.csv
9

If you have a longer query and you'd like to use psql, put your query in a file and use the following command:

psql -d my_db_name -t -A -F";" -f input-file.sql -o output-file.csv
1
  • 3
    FWIW, I had to use -F"," instead of -F";" to generate a CSV file that would open correctly in MS Excel
    – CFL_Jeff
    May 31, 2018 at 19:44
9

Since Postgres 12, you can change the output format:

\pset format csv

The following formats are allowed:

aligned, asciidoc, csv, html, latex, latex-longtable, troff-ms, unaligned, wrapped

If you want to export the result of a query, you can use the \o filename feature.

Example:

\pset format csv

\o file.csv
SELECT * FROM table LIMIT 10;
\o

\pset format aligned
5

To download a CSV file with column names as the header, use this command:

Copy (Select * From tableName) To '/tmp/fileName.csv' With CSV HEADER;
2

I found that psql --csv creates a CSV file with UTF8 characters, but it is missing the UTF8 Byte Order Mark (0xEF 0xBB 0xBF). Without the BOM, applications that rely on it for encoding detection (notably Excel's default import) will corrupt international characters such as CJK characters.

To fix it, I devised the following script:

# Define a connection to the Postgres database through environment variables
export PGHOST=your.pg.host
export PGPORT=5432
export PGDATABASE=your_pg_database
export PGUSER=your_pg_user

# Place credentials in $HOME/.pgpass with the format:
# ${PGHOST}:${PGPORT}:${PGDATABASE}:${PGUSER}:${PGPASSWORD}

# Populate long SQL query in a text file:
cat > /tmp/query.sql <<EOF
SELECT item.item_no,item_descrip,
invoice.invoice_no,invoice.sold_qty
FROM item
LEFT JOIN invoice
ON item.item_no=invoice.item_no;
EOF

# Generate CSV report with UTF8 BOM mark
printf '\xEF\xBB\xBF' > report.csv
psql -f /tmp/query.sql --csv | tee -a report.csv

Doing it this way lets me script the CSV creation process for automation and allows me to succinctly maintain the script in a single source file.

0

When your query is too long to write inline, you can use a temporary table like this:

CREATE TABLE tmp_table AS (
    SELECT *
    FROM my_table mt
    WHERE ...
);

\COPY tmp_table TO '~/Desktop/tmp_table.csv' DELIMITER ';' CSV HEADER;
DROP TABLE tmp_table;
-1

If you are using AWS, such as AWS RDS, then you can use pgAdmin. What I do is create a temporary table with the desired output:

CREATE TABLE export_descriptions AS SELECT description FROM products WHERE id = 406;

Then in pgAdmin, use the export to CSV option. There, you can specify where to save the file and in what format.

And then it saves right to your computer. Remember, AWS RDS hides the underlying compute it runs on from you, so you do not have access to the underlying server (EC2 or Fargate instance). In other words, you cannot ssh into it. You can still connect to Postgres from the CLI or from pgAdmin, though, and the new pgAdmin interface makes it easy to export to CSV.

-3
import json
import psycopg2  # assumed driver; the original snippet did not show how conn was created

conn = psycopg2.connect("dbname=test")  # adjust connection parameters as needed
cursor = conn.cursor()
qry = """ SELECT details FROM test_csvfile """
cursor.execute(qry)
rows = cursor.fetchall()

# Serialize the fetched rows as JSON (note: this writes JSON, not CSV)
value = json.dumps(rows)

with open("/home/asha/Desktop/Income_output.json", "w+") as f:
    f.write(value)
print('Saved to File Successfully')
3
  • 3
    Please explain what you did when editing the answer; avoid code-only answers
    – GGO
    Feb 27, 2018 at 12:09
  • 3
    Thank you for this code snippet, which might provide some limited short-term help. A proper explanation would greatly improve its long-term value by showing why this is a good solution to the problem, and would make it more useful to future readers with other, similar questions. Please edit your answer to add some explanation, including the assumptions you've made. Feb 27, 2018 at 12:48
  • 2
    This will produce a json file, not a csv file.
    – nvoigt
    Feb 27, 2018 at 13:23
-4

JackDB, a database client in your web browser, makes this really easy. Especially if you're on Heroku.

It lets you connect to remote databases and run SQL queries on them.



Once your DB is connected, you can run a query and export to CSV or TXT (see bottom right).



Note: I'm in no way affiliated with JackDB. I currently use their free services and think it's a great product.

0
-4

Per the request of @skeller88, I am reposting my comment as an answer so that it doesn't get lost by people who don't read every response...

The problem with DataGrip is that it puts a grip on your wallet. It is not free. Try the community edition of DBeaver at dbeaver.io. It is a FOSS multi-platform database tool for SQL programmers, DBAs and analysts that supports all popular databases: MySQL, PostgreSQL, SQLite, Oracle, DB2, SQL Server, Sybase, MS Access, Teradata, Firebird, Hive, Presto, etc.

DBeaver Community Edition makes it trivial to connect to a database, issue queries to retrieve data, and then download the result set to save it to CSV, JSON, SQL, or other common data formats. It's a viable FOSS competitor to TOAD for Postgres, TOAD for SQL Server, or Toad for Oracle.

I have no affiliation with DBeaver. I love the price and functionality, but I wish they would open up the DBeaver/Eclipse application more and make it easy to add analytics widgets to DBeaver/Eclipse, rather than requiring users to pay for the annual subscription to create graphs and charts directly within the application. My Java coding skills are rusty and I don't feel like taking weeks to relearn how to build Eclipse widgets, only to find that DBeaver has disabled the ability to add third-party widgets to the DBeaver Community Edition.

Do DBeaver users have insight as to the steps to create analytics widgets to add into the Community Edition of DBeaver?

0
