This page provides you with instructions on how to extract data from PostgreSQL and load it into Google BigQuery. (If this manual process sounds onerous, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)
What is PostgreSQL?
PostgreSQL, also called Postgres, is an open source object-relational database management system that runs on all major operating systems. It's known for its stability and its ability to handle high volumes of transactions.
What is Google BigQuery?
Google BigQuery is a data warehouse that delivers super-fast results from SQL queries, which it accomplishes using a powerful engine dubbed Dremel. With BigQuery, there's no spinning up (and down) clusters of machines as you work with your data. In short, BigQuery prioritizes querying over administration: it's fast, and that speed is the main reason most folks use it.
Getting data out of PostgreSQL
Most people retrieve data from relational databases by writing SQL queries. If you're just looking to export data in bulk, however, you can use the command-line tool pg_dump to export a database as a SQL script that you can run to restore it on any PostgreSQL server, or PostgreSQL's COPY command (or psql's \copy) to export individual tables as CSV files.
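As a minimal sketch, an export might look something like this (the database, table, and file names here are hypothetical, so substitute your own):

    # Dump the whole database as a SQL script that can recreate it elsewhere
    pg_dump --no-owner --format=plain mydb > mydb_backup.sql

    # Export a single table as a CSV file using psql's \copy
    psql mydb -c "\copy orders TO 'orders.csv' WITH (FORMAT csv, HEADER)"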
Loading data into Google BigQuery
Google Cloud Platform offers a helpful guide for loading data into BigQuery. You can use the bq command-line tool, and in particular the bq load command, to upload files to your datasets, adding schema and data type information along the way. You can find the syntax in the Quickstart guide for bq. Iterate through this process as many times as it takes to load all of your tables into BigQuery.
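As a rough sketch, loading the CSV exported above could look like the following (the dataset name, table name, and inline schema are hypothetical, so adjust them to match your data):

    # Create a dataset, then load the CSV with an inline schema definition
    bq mk my_dataset
    bq load --source_format=CSV --skip_leading_rows=1 \
      my_dataset.orders ./orders.csv \
      id:INTEGER,customer_id:INTEGER,total:FLOAT,updated_at:TIMESTAMP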
Keeping PostgreSQL data up to date
The script you have now should satisfy all your data needs for PostgreSQL – right? Not yet. How do you load new or updated data? It's not a good idea to replicate all of your data each time you have updated records. That process would be painfully slow; if latency is important to you, it's not a viable option.
Instead, you can identify some key fields that your script can use to bookmark its progression through the data, and pick up where it left off as it looks for updated data. Timestamp fields such as updated_at or created_at, or auto-incrementing primary keys, work best for this. When you've built in this functionality, you can set up your script as a cron job or continuous loop to get new data as it appears in PostgreSQL.
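As a sketch of the idea, an incremental export keyed on a hypothetical updated_at column (with the bookmark timestamp supplied by your script from its previous run) might look like this:

    # Export only rows changed since the last run; the timestamp is the bookmark
    # your script saved after the previous extraction.
    psql mydb -c "\copy (SELECT * FROM orders WHERE updated_at > '2024-01-01 00:00:00') TO 'orders_delta.csv' WITH (FORMAT csv, HEADER)"

    # bq load appends to an existing table by default, so this adds the new rows
    bq load --source_format=CSV --skip_leading_rows=1 \
      my_dataset.orders ./orders_delta.csv

Note that appending updated rows this way leaves the older versions of those rows in the table, so you'd typically deduplicate downstream, for example in a view or with a MERGE statement.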
Other data warehouse options
BigQuery is great, but sometimes you need to optimize for different things when you're choosing a data warehouse. Some folks choose to go with Amazon Redshift, PostgreSQL, Snowflake, or Microsoft Azure SQL Data Warehouse, which are RDBMSes that use similar SQL syntax, or Panoply, which works with Redshift instances. Others choose a data lake, like Amazon S3. If you're interested in seeing the relevant steps for loading data into one of these platforms, check out To Redshift, To Postgres, To Snowflake, To Panoply, To Azure SQL Data Warehouse, and To S3.
Easier and faster alternatives
If all this sounds a bit overwhelming, don’t be alarmed. If you have all the skills necessary to go through this process, chances are building and maintaining a script like this isn’t a very high-leverage use of your time.
Thankfully, products like Stitch were built to move data from PostgreSQL to Google BigQuery automatically. With just a few clicks, Stitch starts extracting your PostgreSQL data, structuring it in a way that's optimized for analysis, and loading that data into your Google BigQuery data warehouse.