Company (Remote)

Dynamic Apache Airflow DAGs

Deadline: 2026-04-10
Project-Based

Description

Budget: ₹750 - ₹1250/hr

We are looking for an experienced data engineer with strong expertise in Apache Airflow, especially in dynamic DAG generation and scalable workflow design.

Key Requirements:

Strong hands-on experience with Apache Airflow
Proven experience in dynamic DAG generation
Solid experience working with AWS, especially MWAA (Managed Workflows for Apache Airflow)
Good understanding of modern data architecture (data pipelines, orchestration, ETL/ELT workflows)
Ability to design scalable, maintainable, and efficient workflows
Experience with debugging and optimizing Airflow performance
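Dynamic DAG generation, the core requirement above, typically means building one DAG per config entry in a loop at module parse time. A minimal sketch of that pattern follows; the `Dag`/`Task` stubs stand in for Airflow's `DAG` and operator classes so the sketch runs without an Airflow installation, and the table names and schedules are hypothetical:

```python
# Sketch of the dynamic DAG generation pattern: one pipeline per
# config entry, registered in module globals (where Airflow's DAG
# processor would discover real DAG objects). Dag/Task are stand-ins
# for airflow.DAG and operators; TABLES is an invented example config.
from dataclasses import dataclass, field

@dataclass
class Task:
    task_id: str

@dataclass
class Dag:
    dag_id: str
    schedule: str
    tasks: list = field(default_factory=list)

def build_ingest_dag(table: str, schedule: str) -> Dag:
    """Factory: build an extract -> load -> validate DAG for one table."""
    dag = Dag(dag_id=f"ingest_{table}", schedule=schedule)
    for step in ("extract", "load", "validate"):
        dag.tasks.append(Task(task_id=f"{step}_{table}"))
    return dag

# Config-driven loop: adding a table here adds a whole new DAG,
# without copy-pasting pipeline code.
TABLES = {"orders": "@hourly", "customers": "@daily"}
for table, schedule in TABLES.items():
    globals()[f"ingest_{table}"] = build_ingest_dag(table, schedule)
```

In a real Airflow deployment the factory would return an `airflow.DAG` with operators wired via dependencies; the structure (factory plus config loop at module top level) is the same.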

Nice to Have:

Experience with data lakes / warehouses (e.g., S3, Redshift, Snowflake)
Infrastructure as Code (Terraform/CloudFormation)
CI/CD for data pipelines

Project Scope:

Design and implement dynamic DAGs
Optimize existing Airflow workflows
Set up / improve the MWAA environment
Provide best practices for scalable data pipeline architecture

Skills

Data Processing, Debugging, Amazon Web Services, CI/CD, ETL, Terraform, Data Pipeline, Airflow, Data Integration, Apache, AWS, Snowflake, Linux, Redshift, Data Governance, CloudFormation
