Cloud Composer: Copying BigQuery Tables Across Different Locations

Checkpoints

Create a Cloud Composer environment.

Create two Cloud Storage buckets.

Create a dataset.

Upload the DAG and dependencies to Cloud Storage.

1 hour 7 credits

GSP283

Google Cloud Self-Paced Labs

Overview

In this advanced lab, you will learn how to create and run an Apache Airflow workflow in Cloud Composer that completes the following tasks:

  • Reads the list of tables to copy from a config file
  • Exports the listed tables from a BigQuery dataset located in the US to Cloud Storage
  • Copies the exported tables from the US bucket to an EU Cloud Storage bucket
  • Imports the tables into the target BigQuery dataset in the EU
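The four steps above can be sketched as plain Python that builds the per-table export/copy/load plan. This is a minimal illustration only: the bucket names, dataset names, and the CSV config format (`source_table,target_table` per row) are assumptions, and in the actual DAG each step would map to an Airflow operator rather than a dictionary entry.

```python
import csv
import io

def build_copy_plan(config_csv, source_bucket, dest_bucket,
                    source_dataset, dest_dataset):
    """Build the per-table steps of the cross-location copy.

    Each CSV row is assumed to hold 'source_table,target_table'
    (hypothetical format; the lab's real config file may differ).
    """
    plan = []
    for source_table, target_table in csv.reader(io.StringIO(config_csv)):
        export_uri = f"gs://{source_bucket}/{source_table}.avro"
        import_uri = f"gs://{dest_bucket}/{source_table}.avro"
        plan.append({
            # 1-2. Export the US table to the US bucket.
            "export": (f"{source_dataset}.{source_table}", export_uri),
            # 3. Copy the exported file from the US bucket to the EU bucket.
            "copy": (export_uri, import_uri),
            # 4. Load the file into the target table in the EU dataset.
            "load": (import_uri, f"{dest_dataset}.{target_table}"),
        })
    return plan

# Example usage with hypothetical names:
plan = build_copy_plan(
    "table1,table1\ntable2,table2",
    source_bucket="my-us-bucket",
    dest_bucket="my-eu-bucket",
    source_dataset="dataset_us",
    dest_dataset="dataset_eu",
)
```

In the real workflow these tuples would correspond to BigQuery export, Cloud Storage copy, and BigQuery load tasks chained per table, so each table moves through the pipeline independently.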
