ETL Processing on Google Cloud Using Dataflow and BigQuery

7813 Reviews

Some steps are outdated (couldn't find usa_names.csv). Also, when I tried some of the "apache beam pipeline" steps, my machine never stopped processing, so I couldn't finish the lab :c.

Dante -. · Reviewed 26 minutes ago

data_transformation.py looks almost empty!!

KOJI I. · Reviewed about 3 hours ago

Antonio B. · Reviewed about 8 hours ago

The command provided does not match what is in the environment, and there is no direction for troubleshooting.

Mohammad Kamar U. · Reviewed about 14 hours ago

$ gsutil cp gs://spls/gsp290/data_files/usa_names.csv gs://$PROJECT/data_files/

미란 진. · Reviewed about 16 hours ago

The copy data files step for usa_names.csv and head_usa_names.csv is not working due to insufficient permissions.

Ilung P. · Reviewed about 17 hours ago

Yogita V. · Reviewed about 23 hours ago

Some commands needed the "--region us-central1" parameter added.

Jimmy M. · Reviewed about 24 hours ago
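A minimal sketch of the corrected invocation, assuming the data_ingestion.py command as posted in the reviews below and that us-central1 is an acceptable Dataflow region for the lab project:

$ python dataflow_python_examples/data_ingestion.py \
    --project=$PROJECT \
    --region=us-central1 \
    --runner=DataflowRunner \
    --staging_location=gs://$PROJECT/test \
    --temp_location=gs://$PROJECT/test \
    --input=gs://$PROJECT/data_files/head_usa_names.csv \
    --save_main_session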

There was an issue. Chat support told me it will be resolved later.

Siraj S. · Reviewed 1 day ago

This lab is full of errors and cannot be solved even with the help of chat support. It is very annoying, as this is not a free lab.

Oscar L. · Reviewed 1 day ago

Errors encountered:

(venv) student_04_2e873d239108@cloudshell:~ (qwiklabs-gcp-04-d9e92eb486d7)$ gsutil cp gs://spls/gsp290/usa_names.csv gs://$PROJECT/data_files/
CommandException: No URLs matched: gs://spls/gsp290/usa_names.csv

(venv) student_04_2e873d239108@cloudshell:~/professional-services/examples/dataflow-python-examples (qwiklabs-gcp-04-d9e92eb486d7)$ python dataflow_python_examples/data_ingestion.py --project=$PROJECT --runner=DataflowRunner --staging_location=gs://$PROJECT/test --temp_location gs://$PROJECT/test --input gs://$PROJECT/data_files/head_usa_names.csv --save_main_session
Traceback (most recent call last):
  File "dataflow_python_examples/data_ingestion.py", line 134, in <module>
    run()
  File "dataflow_python_examples/data_ingestion.py", line 100, in run
    p = beam.Pipeline(options=PipelineOptions(pipeline_args))
  File "/home/student_04_2e873d239108/professional-services/examples/dataflow-python-examples/venv/lib/python3.7/site-packages/apache_beam/pipeline.py", line 198, in __init__
    'Pipeline has validations errors: \n' + '\n'.join(errors))
ValueError: Pipeline has validations errors:
Missing required option: region.

Paul R. · Reviewed 1 day ago
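The two failures above look like (1) a source path missing the data_files/ prefix and (2) the missing --region option noted in Jimmy M.'s review above. A hedged guess at the copy step, assuming both objects live under gs://spls/gsp290/data_files/ as 미란 진's review suggests:

$ gsutil cp gs://spls/gsp290/data_files/usa_names.csv gs://$PROJECT/data_files/
$ gsutil cp gs://spls/gsp290/data_files/head_usa_names.csv gs://$PROJECT/data_files/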

俊樹 金. · Reviewed 1 day ago

The code is incomplete and doesn't run in the latest version of GCP.

George T. · Reviewed 2 days ago

Ari W. · Reviewed 2 days ago

Could not find the Python files.

irfan a. · Reviewed 2 days ago

Durgesh M. · Reviewed 2 days ago

Diptaparna B. · Reviewed 2 days ago

Durgesh M. · Reviewed 2 days ago

TATSUYA K. · Reviewed 2 days ago

The instructions are not clear and some paths are wrong.

Fajri R. · Reviewed 2 days ago

TATSUYA K. · Reviewed 2 days ago

There was an error in the command below:

python dataflow_python_examples/data_ingestion.py --project=$PROJECT --runner=DataflowRunner --staging_location=gs://$PROJECT/test --temp_location gs://$PROJECT/test --input gs://$PROJECT/data_files/head_usa_names.csv --save_main_session

KOJI I. · Reviewed 2 days ago

Adarsh K. · Reviewed 3 days ago