Cloud Composer: Qwik Start - Command Line

1 hour 1 credit

GSP606

Google Cloud Self-Paced Labs

Overview

Workflows are a common theme in data analytics: they involve ingesting, transforming, and analyzing data to extract the meaningful information within it. In Google Cloud Platform (GCP), the tool for hosting workflows is Cloud Composer, a hosted version of the popular open source workflow tool Apache Airflow.

In this lab, you use the Cloud Shell command line to set up a Cloud Composer environment. You then use Cloud Composer to run a simple workflow that verifies the existence of a data file, creates a Cloud Dataproc cluster, runs an Apache Hadoop wordcount job on that cluster, and then deletes the cluster.
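
For reference, creating an environment from Cloud Shell looks roughly like the sketch below. The environment name and location are placeholders rather than values from this lab, and environment creation can take a while to complete.

  # Minimal sketch: "my-composer-env" and "us-central1" are placeholder values.
  gcloud composer environments create my-composer-env \
      --location us-central1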

This lab also shows you how to access your Cloud Composer environment through the GCP Console and the Airflow web interface.
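If you prefer the command line, the Airflow web interface URL can also be read from the environment's configuration; the sketch below assumes the same placeholder environment name and location as above.

  # Minimal sketch: prints the Airflow web UI URL for the environment.
  gcloud composer environments describe my-composer-env \
      --location us-central1 \
      --format="value(config.airflowUri)"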

What you'll do

  • Use the Cloud Shell command line to create the Cloud Composer environment and set up the Composer environment variables (see the command sketch after this list)

  • Verify the environment configuration in the GCP Console

  • Run an Apache Airflow workflow in Cloud Composer that executes an Apache Hadoop wordcount job on the cluster

  • View and run the DAG (Directed Acyclic Graph) in the Airflow web interface

  • View the results of the wordcount job in Cloud Storage
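
As a rough preview of the command-line steps above, setting an Airflow variable and triggering a DAG from Cloud Shell can look like the sketch below. The environment name, location, variable, bucket, and DAG ID are all placeholders, and the Airflow CLI arguments after the "--" separator depend on the Airflow version running in your environment (Airflow 1-style syntax shown).

  # Sketch only: all names below are placeholders, not values from this lab.
  # Set an Airflow variable that a DAG can read at run time.
  gcloud composer environments run my-composer-env \
      --location us-central1 \
      variables -- --set gcs_bucket gs://my-output-bucket

  # Trigger a DAG after it has been uploaded to the environment's DAGs folder.
  gcloud composer environments run my-composer-env \
      --location us-central1 \
      trigger_dag -- my_wordcount_dag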

Suggested experience

The following experience can help maximize your learning:

  • Basic CLI knowledge

  • Basic understanding of Python
