Cloud Composer: Copying BigQuery Tables Across Different Locations


Checkpoints

Create a Cloud Composer environment.

Create two Cloud Storage buckets.

Create a BigQuery dataset.

Upload the DAG and dependencies to Cloud Storage.

1 hour 7 credits

GSP283

Google Cloud Self-Paced Labs

Overview

In this advanced lab, you will learn how to create and run an Apache Airflow workflow in Cloud Composer that completes the following tasks:

  • Reads the list of tables to copy from a config file
  • Exports those tables from a BigQuery dataset located in the US to Cloud Storage
  • Copies the exported tables from the US bucket to an EU Cloud Storage bucket
  • Imports the tables into the target BigQuery dataset in the EU
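The four steps above can be sketched as plain Python that mirrors the per-table plan such a DAG builds. This is a minimal illustration, not the lab's actual DAG file: the config format, bucket names, dataset name, and the Avro export format are assumptions made for the example.

```python
# Sketch of the per-table copy plan the workflow follows. All names and
# paths here are illustrative assumptions, not the lab's real files.
from typing import Dict, List


def read_table_list(config_lines: List[str]) -> List[str]:
    """Parse a config file: one source table name per line, blanks ignored."""
    return [line.strip() for line in config_lines if line.strip()]


def build_copy_plan(
    tables: List[str],
    source_bucket: str,
    dest_bucket: str,
    dest_dataset: str,
) -> List[Dict[str, str]]:
    """Derive the three steps performed for each table:
    BigQuery export (US) -> cross-region bucket copy -> BigQuery import (EU)."""
    plan = []
    for table in tables:
        plan.append({
            # Step 1: export the table from the US dataset to Cloud Storage
            "export_to": f"gs://{source_bucket}/{table}.avro",
            # Step 2: copy the exported file from the US bucket to the EU bucket
            "copy_to": f"gs://{dest_bucket}/{table}.avro",
            # Step 3: load the file into the target EU dataset
            "import_as": f"{dest_dataset}.{table}",
        })
    return plan


tables = read_table_list(["table1\n", "table2\n", "\n"])
plan = build_copy_plan(tables, "us-bucket", "eu-bucket", "nyc_tlc_EU")
print(plan[0]["import_as"])  # nyc_tlc_EU.table1
```

In the actual Airflow DAG, each entry of such a plan would typically map to a chain of operators (a BigQuery-to-GCS export task, a GCS copy task, and a GCS-to-BigQuery load task) rather than plain function calls.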

