Checkpoints
- Create a Cloud Composer environment (25 points)
- Create two Cloud Storage buckets (25 points)
- Create a dataset (25 points)
- Upload the DAG and dependencies to Cloud Storage (25 points)
Cloud Composer: Copying BigQuery Tables Across Different Locations
GSP283
Overview
In this advanced lab, you will learn how to create and run an Apache Airflow workflow in Cloud Composer that completes the following tasks (a minimal example DAG is sketched after the list):
- Reads the list of tables to copy from a config file
- Exports the listed tables from a BigQuery dataset located in the US to Cloud Storage
- Copies the exported tables from the US Cloud Storage bucket to the EU Cloud Storage bucket
- Imports the listed tables into the target BigQuery dataset in the EU
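To give a sense of the workflow's shape before you build it, here is a minimal sketch of such a DAG. It is not the DAG used in the lab; it uses the transfer operators from the apache-airflow-providers-google package (BigQueryToGCSOperator, GCSToGCSOperator, GCSToBigQueryOperator), and the project ID, dataset names, bucket names, and the table_list_config.json file are placeholder assumptions for illustration.

```python
# Hedged sketch only: project, dataset, bucket, and config-file names below
# are assumptions, not values supplied by the lab.
import json
from datetime import datetime

from airflow import models
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

PROJECT = "my-project"              # assumed project ID
SOURCE_DATASET = "nyc_tlc_US"       # assumed source dataset in the US
DEST_DATASET = "nyc_tlc_EU"         # assumed target dataset in the EU
US_BUCKET = "my-project-us-export"  # assumed US Cloud Storage bucket
EU_BUCKET = "my-project-eu-import"  # assumed EU Cloud Storage bucket

# Read the list of tables to copy from a config file deployed next to the DAG.
# /home/airflow/gcs/dags is where Cloud Composer mounts the DAGs bucket.
with open("/home/airflow/gcs/dags/table_list_config.json") as f:
    TABLES = json.load(f)           # e.g. ["trips", "zones"]

with models.DAG(
    dag_id="bq_copy_us_to_eu",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,         # trigger manually
    catchup=False,
) as dag:
    for table in TABLES:
        # 1. Export the US table to the US bucket as sharded CSV files.
        export = BigQueryToGCSOperator(
            task_id=f"export_{table}",
            source_project_dataset_table=f"{PROJECT}.{SOURCE_DATASET}.{table}",
            destination_cloud_storage_uris=[f"gs://{US_BUCKET}/{table}/*.csv"],
            export_format="CSV",
        )

        # 2. Copy the exported files from the US bucket to the EU bucket,
        #    preserving the per-table prefix.
        copy = GCSToGCSOperator(
            task_id=f"copy_{table}",
            source_bucket=US_BUCKET,
            source_object=f"{table}/*.csv",
            destination_bucket=EU_BUCKET,
            destination_object=f"{table}/",
        )

        # 3. Load the copied files into the target EU dataset.
        load = GCSToBigQueryOperator(
            task_id=f"load_{table}",
            bucket=EU_BUCKET,
            source_objects=[f"{table}/*.csv"],
            destination_project_dataset_table=f"{PROJECT}.{DEST_DATASET}.{table}",
            source_format="CSV",
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_TRUNCATE",
        )

        export >> copy >> load
```

In the lab itself you upload a prebuilt DAG file and its dependencies to the environment's Cloud Storage dags/ folder, and Airflow picks them up automatically; the sketch above is only meant to show how the export, copy, and import steps chain together for each table in the config file.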