
Engineer Data in Google Cloud

Advanced · 7 steps · 8 hours · 51 credits

Earn a skill badge by completing the Engineer Data in Google Cloud quest, where you will learn how to:

1. Build data pipelines using Cloud Dataprep by Trifacta, Pub/Sub, and Dataflow.
2. Use Cloud IoT Core to collect and manage MQTT-based devices.
3. Use BigQuery to analyze IoT data.
4. Use Cloud Storage, Dataflow, and BigQuery to perform ETL.
5. Build a machine learning model using BigQuery ML.
6. Build a machine learning model using TensorFlow 1.x and AI Platform.

This quest is a great resource for understanding topics that will appear in the Google Cloud Certified Professional Data Engineer Certification.

A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services. It tests your ability to apply your knowledge in an interactive, hands-on environment. Complete this skill badge quest and the final assessment challenge lab to receive a digital badge that you can share with your network.

Data, Machine Learning, Business Transformation

Prerequisites:

Prior to enrolling in this skill badge quest, it is recommended that you complete the following quests:

Quest Outline

Lab

Creating a Data Transformation Pipeline with Cloud Dataprep

Cloud Dataprep by Trifacta is an intelligent data service for visually exploring, cleaning, and preparing structured and unstructured data for analysis. In this lab you will explore the Cloud Dataprep UI to build a data transformation pipeline.

Lab

Building an IoT Analytics Pipeline on Google Cloud

This lab shows you how to connect and manage devices using Cloud IoT Core, ingest the stream of information using Cloud Pub/Sub, process the IoT data using Cloud Dataflow, and use BigQuery to analyze the IoT data. Watch the short video Easily Build an IoT Analytics Pipeline.
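
The ingestion step can be pictured with a short sketch. The following is a minimal Python example, assuming a hypothetical project ID and Pub/Sub topic, that publishes a simulated sensor reading directly to Pub/Sub; in the lab itself, Cloud IoT Core bridges MQTT device traffic into the topic for you.

    # Minimal sketch: publish a simulated device reading to a Pub/Sub topic.
    # PROJECT_ID and TOPIC_ID are placeholders; in the lab, Cloud IoT Core
    # forwards MQTT telemetry into the topic instead of this direct publish.
    import json
    import time

    from google.cloud import pubsub_v1

    PROJECT_ID = "your-project-id"   # placeholder
    TOPIC_ID = "iot-telemetry"       # placeholder

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

    reading = {"device_id": "sensor-001", "timestamp": time.time(), "temperature": 21.7}
    future = publisher.publish(topic_path, json.dumps(reading).encode("utf-8"))
    print("Published message ID:", future.result())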

Lab

ETL Processing on Google Cloud Using Dataflow and BigQuery

In this lab you will build several data pipelines that ingest data from a publicly available dataset into BigQuery.
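
As a rough illustration of the Dataflow-based ETL pattern, here is a minimal Apache Beam sketch in Python, assuming hypothetical bucket, dataset, and table names rather than the lab's exact pipeline. It reads CSV lines from Cloud Storage, parses them into rows, and writes them to BigQuery.

    # Minimal Apache Beam sketch: read a CSV from Cloud Storage, transform each
    # line into a row dictionary, and load the rows into BigQuery.
    # All resource names below are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_csv(line):
        # Expects lines of the form "name,value".
        name, value = line.split(",")
        return {"name": name, "value": int(value)}

    options = PipelineOptions(
        runner="DataflowRunner",          # or "DirectRunner" for local testing
        project="your-project-id",
        region="us-central1",
        temp_location="gs://your-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://your-bucket/input.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_csv)
            | "Write" >> beam.io.WriteToBigQuery(
                "your-project-id:lake.sample_table",
                schema="name:STRING,value:INTEGER",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )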

Lab

Predict Visitor Purchases with a Classification Model in BQML

In this lab you will use a newly available ecommerce dataset to run some typical queries that businesses would want answered about their customers' purchasing habits.
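
For a sense of what BigQuery ML training looks like, here is a hedged Python sketch that submits a CREATE MODEL statement through the BigQuery client library. The project ID, dataset, model name, and simplified label definition are placeholders loosely modeled on the public Google Analytics sample data, not the lab's exact query.

    # Minimal sketch: train a BigQuery ML logistic regression model from Python.
    # Project, dataset, model name, and the simplified label are placeholders
    # loosely based on the public Google Analytics sample tables.
    from google.cloud import bigquery

    client = bigquery.Client(project="your-project-id")  # placeholder project

    create_model_sql = """
    CREATE OR REPLACE MODEL `ecommerce.classification_model`
    OPTIONS (model_type = 'logistic_reg',
             input_label_cols = ['made_purchase']) AS
    SELECT
      IFNULL(totals.bounces, 0) AS bounces,
      IFNULL(totals.timeOnSite, 0) AS time_on_site,
      IF(totals.transactions IS NULL, 0, 1) AS made_purchase
    FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*`
    WHERE _TABLE_SUFFIX BETWEEN '20160801' AND '20170630'
    """

    client.query(create_model_sql).result()  # blocks until training finishes
    print("Model trained: ecommerce.classification_model")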

Lab

Predict Housing Prices with TensorFlow and AI Platform

In this lab you will build an end-to-end machine learning solution using TensorFlow and AI Platform, and leverage the cloud for distributed training and online prediction.
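
The lab's training code is based on TensorFlow 1.x Estimators running on AI Platform; the snippet below is only a minimal TensorFlow 2.x Keras sketch of the underlying regression task, using made-up features and prices to keep it self-contained.

    # Minimal TensorFlow 2.x sketch of a housing-price style regression.
    # The features, prices, and architecture are invented for illustration;
    # the lab trains a TensorFlow 1.x Estimator on AI Platform instead.
    import numpy as np
    import tensorflow as tf

    # Synthetic features (e.g. rooms, age, income) and target prices.
    X = np.random.rand(1000, 3).astype("float32")
    y = X @ np.array([300.0, -20.0, 150.0], dtype="float32") + 50.0

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(3,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)

    print("Sample prediction:", float(model.predict(X[:1])[0, 0]))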

Lab

Cloud Composer: Copying BigQuery Tables Across Different Locations

In this advanced lab you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset located in the US to Cloud Storage buckets, copies those exports to buckets in Europe, and then imports the tables into a BigQuery dataset in Europe.
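
A stripped-down DAG for this cross-region copy pattern might look like the sketch below, assuming Airflow 2.4 or later with the Google provider package installed; bucket, dataset, and table names are placeholders, and the lab's actual workflow differs in detail.

    # Minimal Airflow DAG sketch: export a BigQuery table to a US bucket, copy
    # the file to an EU bucket, then load it into an EU dataset.
    # All resource names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
    from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

    with DAG("bq_copy_us_to_eu", start_date=datetime(2024, 1, 1),
             schedule=None, catchup=False) as dag:

        export_us = BigQueryToGCSOperator(
            task_id="export_us_table",
            source_project_dataset_table="your-project.us_dataset.sample_table",
            destination_cloud_storage_uris=["gs://us-bucket/sample_table.csv"],
            export_format="CSV",
        )

        copy_to_eu = GCSToGCSOperator(
            task_id="copy_to_eu_bucket",
            source_bucket="us-bucket",
            source_object="sample_table.csv",
            destination_bucket="eu-bucket",
        )

        import_eu = GCSToBigQueryOperator(
            task_id="import_eu_table",
            bucket="eu-bucket",
            source_objects=["sample_table.csv"],
            destination_project_dataset_table="your-project.eu_dataset.sample_table",
            source_format="CSV",
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_TRUNCATE",
        )

        export_us >> copy_to_eu >> import_eu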

Lab

Engineer Data in Google Cloud: Challenge Lab

This challenge lab tests your skills and knowledge from the labs in the Engineer Data in Google Cloud quest. You should be familiar with the content of those labs before attempting this one.


Enroll now

Enroll in this Quest to track your progress toward earning a badge.