
Data Engineering

Advanced · 5 Steps · 37 Credits

This advanced-level quest is unique among the other Qwiklabs offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer certification. From BigQuery, to Dataproc, to TensorFlow, this quest is composed of specific labs that will put your GCP data engineering knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, you will need other preparation too. The exam is quite challenging, so additional study, experience, and/or a background in cloud data engineering is recommended.

Data · Machine Learning · Business Transformation

Prerequisites:

This Quest requires proficiency with GCP services, particularly those used for working with large datasets. It is recommended that students first earn a badge by completing the hands-on labs in the Baseline: Data, ML, and AI and/or GCP Essentials Quests before beginning. Additional lab experience with the Scientific Data Processing and Machine Learning APIs Quests will also be useful.

Quest Outline

Lab

Creating a Data Transformation Pipeline with Cloud Dataprep

Cloud Dataprep by Trifacta is an intelligent data service for visually exploring, cleaning, and preparing structured and unstructured data for analysis. In this lab you will explore the Cloud Dataprep UI to build a data transformation pipeline.

Lab

Run a Big Data Text Processing Pipeline in Cloud Dataflow

In this lab you will use Google Cloud Dataflow to create a Maven project with the Cloud Dataflow SDK, and run a distributed word count pipeline using the Google Cloud Platform Console.
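The lab itself scaffolds a Java project with Maven and the Cloud Dataflow SDK. As a rough illustration only, here is a minimal word-count sketch using the Apache Beam Python SDK (the open-source SDK behind Cloud Dataflow), showing the same read, split, count, and write stages; the project, region, and bucket values are placeholders.

```python
# Minimal word-count sketch with the Apache Beam Python SDK.
# Illustration only: the lab uses the Java SDK via Maven. Paths are placeholders.
import re

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",               # or "DirectRunner" to test locally
    project="your-gcp-project",            # placeholder
    region="us-central1",
    temp_location="gs://your-bucket/tmp",  # placeholder
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://dataflow-samples/shakespeare/kinglear.txt")
        | "Split" >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
        | "Count" >> beam.combiners.Count.PerElement()
        | "Format" >> beam.MapTuple(lambda word, count: f"{word}: {count}")
        | "Write" >> beam.io.WriteToText("gs://your-bucket/wordcount/output")
    )
```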

Lab

Building an IoT Analytics Pipeline on Google Cloud Platform

This lab shows you how to connect and manage devices using Cloud IoT Core; ingest the stream of information using Cloud Pub/Sub; process the IoT data using Cloud Dataflow; and use BigQuery to analyze the IoT data. Watch this short video, Easily Build an IoT Analytics Pipeline.
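As context for the ingest step, the sketch below publishes one simulated device reading to a Cloud Pub/Sub topic with the Python client library. The project and topic names are assumptions for illustration; in the lab itself the telemetry arrives through Cloud IoT Core rather than a script like this.

```python
# Sketch of the ingest step only: publish a simulated sensor reading to a
# Pub/Sub topic with the google-cloud-pubsub client. Project and topic names
# are placeholders; in the lab the readings arrive via Cloud IoT Core.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("your-gcp-project", "iot-telemetry")  # placeholders

reading = {"device_id": "sensor-001", "temperature_c": 21.7, "humidity": 0.43}
future = publisher.publish(topic_path, json.dumps(reading).encode("utf-8"))
print("Published message id:", future.result())
```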

Lab

Streaming IoT Kafka to Google Cloud Pub/Sub

In this lab you will create a Confluent Kafka instance and connect it to Google Cloud Pub/Sub using source and sink connectors.
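For orientation, the sketch below produces a test record to a local Kafka topic with the confluent-kafka Python client; a Pub/Sub sink connector configured in Kafka Connect would then forward such records to a Cloud Pub/Sub topic. The broker address and topic name are placeholders, not the lab's exact values.

```python
# Sketch: produce a test record to a local Kafka topic using the
# confluent-kafka client. A Pub/Sub sink connector (configured separately in
# Kafka Connect) would forward records from this topic to Cloud Pub/Sub.
# Broker address and topic name are placeholders.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def on_delivery(err, msg):
    # Report whether the broker accepted the record.
    if err is not None:
        print("Delivery failed:", err)
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

producer.produce("to-pubsub", value=b'{"sensor": "demo", "value": 42}', callback=on_delivery)
producer.flush()
```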

Lab

ETL Processing on GCP Using Dataflow and BigQuery

In this lab you will build several data pipelines that ingest data from a publicly available dataset into BigQuery.
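A minimal sketch of one such pipeline, assuming a simple two-column CSV in Cloud Storage: it reads the file, parses each line, and appends the rows to a BigQuery table with the Beam Python SDK. Bucket, dataset, table, and schema names are placeholders rather than the lab's actual resources.

```python
# Sketch of an ingest pipeline: read CSV lines from Cloud Storage, parse them,
# and load the rows into a BigQuery table with the Beam Python SDK.
# Bucket, dataset, table, and schema are placeholders, not the lab's exact ones.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv_line(line):
    # Assumed two-column layout: name,value
    name, value = line.split(",")
    return {"name": name, "value": int(value)}

options = PipelineOptions(
    runner="DataflowRunner",
    project="your-gcp-project",            # placeholder
    region="us-central1",
    temp_location="gs://your-bucket/tmp",  # placeholder
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://your-bucket/input/data.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_csv_line)
        | "Load" >> beam.io.WriteToBigQuery(
            "your-gcp-project:example_dataset.example_table",  # placeholder table
            schema="name:STRING,value:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```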

Lab

Predict Visitor Purchases with a Classification Model in BQML

In this lab you will use a newly available ecommerce dataset to run typical queries that answer questions businesses have about their customers' purchasing habits.
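As a hedged illustration of the BigQuery ML workflow this lab walks through, the sketch below runs a CREATE MODEL statement for a logistic-regression classifier from Python via the BigQuery client. The project, dataset, label, and feature choices are simplified placeholders, not the lab's exact query.

```python
# Sketch: train a BigQuery ML logistic regression model from Python by running
# a CREATE MODEL statement through the BigQuery client. The target dataset,
# label, and feature columns below are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # placeholder project

create_model_sql = """
CREATE OR REPLACE MODEL `your-gcp-project.ecommerce.purchase_classifier`
OPTIONS(model_type = 'logistic_reg', input_label_cols = ['made_purchase']) AS
SELECT
  IF(totals.transactions IS NULL, 0, 1) AS made_purchase,
  IFNULL(totals.bounces, 0) AS bounces,
  IFNULL(totals.timeOnSite, 0) AS time_on_site
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '20160801' AND '20170430'
"""

client.query(create_model_sql).result()  # blocks until training finishes
print("Model trained.")
```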

Lab

Predict Taxi Fare with a BigQuery ML Forecasting Model

In this lab you will explore millions of New York City yellow taxi cab trips available in a BigQuery public dataset, create an ML model inside BigQuery to predict the fare, and evaluate the model's performance at making those predictions.
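As a companion sketch (not the lab's exact queries), the snippet below trains a linear-regression fare model with BigQuery ML and then evaluates it with ML.EVALUATE, driven from the Python client. Table and column choices are simplified assumptions based on the public NYC yellow-taxi data.

```python
# Sketch: create a linear-regression fare model in BigQuery ML and evaluate it,
# driven from Python. Dataset and feature choices are simplified placeholders,
# not the lab's exact feature engineering.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # placeholder project

client.query("""
CREATE OR REPLACE MODEL `your-gcp-project.taxi.fare_model`
OPTIONS(model_type = 'linear_reg', input_label_cols = ['fare_amount']) AS
SELECT
  fare_amount,
  passenger_count,
  trip_distance
FROM `bigquery-public-data.new_york_taxi_trips.tlc_yellow_trips_2018`
WHERE fare_amount > 0 AND trip_distance > 0
LIMIT 100000
""").result()

evaluation = client.query("""
SELECT * FROM ML.EVALUATE(MODEL `your-gcp-project.taxi.fare_model`)
""").result()

for row in evaluation:
    print(dict(row))  # regression metrics such as mean_absolute_error, r2_score
```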

Lab

Predict Housing Prices with TensorFlow and AI Platform

In this lab you will build an end-to-end machine learning solution using TensorFlow and AI Platform, and leverage the cloud for distributed training and online prediction.
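For a sense of the modeling step, here is a minimal Keras regression sketch of the kind of model such a lab trains before exporting it for AI Platform training and online prediction. The features and data below are synthetic placeholders, not the lab's housing dataset.

```python
# Sketch: a minimal Keras regression model of the sort trained in the lab and
# later submitted to AI Platform for distributed training and online prediction.
# Feature columns and data here are synthetic placeholders.
import numpy as np
import tensorflow as tf

# Synthetic stand-in for housing features (e.g. rooms, age, income) and prices.
features = np.random.rand(1000, 3).astype("float32")
prices = (features @ np.array([300.0, -20.0, 150.0], dtype="float32")) + 50.0

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # single continuous output: predicted price
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

model.fit(features, prices, epochs=10, batch_size=32, verbose=0)

# In the lab, the trained model is exported and deployed for online prediction;
# here we just predict locally on a few synthetic examples.
print(model.predict(features[:3]))
```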


Enroll Now

Enroll in this quest to track your progress toward earning a badge.