Processing Data with Google Cloud Dataflow

Checkpoints

Create a BigQuery Dataset

Copy the airport geolocation file to your Cloud Storage bucket

Process the Data using Cloud Dataflow (submit Dataflow job)

Run Query

1 hour 15 minutes · 7 credits

GSP198

Google Cloud Self-Paced Labs

Overview

In this lab you will simulate a real-time, real-world data set from a historical one. The simulated data set will be generated from a set of text files using Python and Google Cloud Dataflow, and the resulting simulated real-time data will be stored in BigQuery. You will then use BigQuery to analyse some features of the real-time data set.

Cloud Dataflow is a fully managed service for transforming and enriching data in stream (real-time) and batch (historical) modes via Java and Python APIs with the Apache Beam SDK. Cloud Dataflow provides a serverless architecture that can be used to shard and process very large batch data sets, or high-volume live streams of data, in parallel.

BigQuery is a RESTful web service that enables interactive analysis of very large data sets, working in conjunction with Google Cloud Storage.

The data set used provides historical information about internal flights in the United States, retrieved from the US Bureau of Transportation Statistics website. This data set can be used to demonstrate a wide range of data science concepts and techniques, and it is used in all of the other labs in the Data Science on Google Cloud Platform quest.
