Sometimes you need to export a database, for backups or to transfer the data into another DB. The usual way of exporting a Postgres database is to use pg_dump, but if you’re using a shared hosting provider, you might not have the necessary permissions to do so. In that case, any pg_dump command will fail with an error like this:

pg_dump: error: query failed: ERROR: permission denied for view pg_roles

I don’t know why pg_dump needs to access the pg_roles view, but it does, and it fails if it can’t. Maybe we can find a way to skip this roles access? I don’t need any roles, I just want the data.
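For context, this is the kind of pg_dump call that failed for me (the database and file names here are just placeholders):

pg_dump mydatabase > mydatabase.sql
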
There are multiple promising options in pg_dump’s help output that I tried:

-O, --no-owner               skip restoration of object ownership in plain-text format
-a, --data-only              dump only the data, not the schema
-t, --table=PATTERN          dump the specified table(s) only

Unfortunately, none of them solved the problem; the error persisted.
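
Even a combination of these flags, something like the following (the table and database names are placeholders), hit the same permission error:

pg_dump --data-only --no-owner --table=mytable mydatabase > mytable.sql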

So I searched for a different way to export the data and found one. It’s not as elegant as pg_dump, but it works, and that’s what counts. We can use the COPY command to move data between a table and a file.

Here’s the Stack Overflow way to export a table from a Postgres database into a CSV file:

echo "\COPY (select * from table) TO 'table.csv' (format CSV);" > my_query.sql
psql <database> -af my_query.sql

We use the backslash to make it a client-side command, so psql writes the file relative to the current directory on our machine. If you use COPY without the backslash, the command runs on the server, and the file path is interpreted on the server’s filesystem.
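
For comparison, the server-side variant would look something like this (table name and path are placeholders); writing files on the server normally requires superuser rights or membership in pg_write_server_files, which is exactly what we don’t have on shared hosting:

psql <database> -c "COPY (select * from mytable) TO '/tmp/mytable.csv' (format CSV);"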

An even shorter answer, which doesn’t need a helper file, can be found in the comments:

psql <database> -c 'select * from table limit 5' -tAF , > table.csv
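
One caveat: -F , just joins the columns with commas and doesn’t quote values that themselves contain commas or newlines. If you want proper CSV quoting in a single command without a helper file, \COPY can also be passed directly with -c (table name is a placeholder):

psql <database> -c "\COPY (select * from mytable) TO 'mytable.csv' (format CSV, HEADER)"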

To import the data back into a database, you can then use the \COPY command again:

psql <database> -a -c "\COPY table FROM 'table.csv' CSV"
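
Keep in mind that \COPY only loads data, so the target table has to exist already with matching columns (and add CSV HEADER if the file was exported with a header row). A minimal sketch, with made-up column definitions you’d adapt to your data:

psql <database> -c "CREATE TABLE mytable (id integer, name text);"
psql <database> -a -c "\COPY mytable FROM 'mytable.csv' CSV"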