Learn Beyond Consulting LLC
Remote

Data Engineer - Google Cloud Platform

Project-Based

Description

Title: Mid-Level Data Engineer - Google Cloud Platform
Location: Remote
Visa: Any visa

Mandatory Skills: Google Cloud Platform, AWS, Databricks, PySpark, DBT, Airflow or similar tools

Overview

We are hiring multiple Data Engineers (Contractors) to support the buildout of a next-generation data platform. These engineers will focus on pipeline development, data transformation, and supporting the overall platform modernization effort.

Key Responsibilities

- Build and maintain data pipelines using Databricks and PySpark
- Work with structured and unstructured data sources (JSON, CSV, XML, etc.)
- Support data ingestion, transformation, and validation processes
- Collaborate with Lead Engineers on implementation and delivery
- Ensure data quality and consistency across pipelines

Required Qualifications

- Hands-on experience with PySpark and Databricks
- Strong SQL and Python skills
- Experience building and supporting data pipelines
- Familiarity with Airflow or similar orchestration tools

Nice to Have

- Experience with Google Cloud Platform / BigQuery
- Exposure to DBT
- Experience working in evolving or messy data environments

Skills

BigQuery, Airflow, GCP, dbt, SQL, AWS, Python, Databricks
