Data Engineer
Description
Confidential company
This role has been designed as hybrid, with the expectation that you will work on average 2 days per week from an office.
We are looking for a talented Data Engineer to help build and enhance the data platform that supports analytics, operations, and data-driven decision-making across the organization. You will work hands-on to develop scalable data pipelines, improve data models, ensure data quality, and contribute to the continuous evolution of our modern data ecosystem.
You'll collaborate closely with senior engineers, analysts, data scientists, and stakeholders across the business to deliver reliable, well-structured, and well-governed data solutions.
What You'll Do:
Engineering & Delivery
Build, maintain, and optimize data pipelines for batch and streaming workloads.
Develop reliable data models and transformations to support analytics, reporting, and operational use cases.
Integrate new data sources, APIs, and event streams into the platform.
Implement data quality checks, testing, documentation, and monitoring.
Write clean, performant SQL and Python code.
Contribute to improving performance, scalability, and cost-efficiency across the data platform.
Collaboration & Teamwork
Work closely with senior engineers to implement architectural patterns and best practices.
Collaborate with analysts and data scientists to translate requirements into technical solutions.
Participate in code reviews, design discussions, and continuous improvement initiatives.
Help maintain clear documentation of data flows, models, and processes.
Platform & Process
Support the adoption and roll-out of new data tools, standards, and workflows.
Contribute to DataOps processes such as CI/CD, testing, and automation.
Assist in monitoring pipeline health and resolving data-related issues.
Requirements: What We're Looking For
2-5+ years of experience as a Data Engineer or similar role.
Hands-on experience with Snowflake (mandatory), including SQL, modeling, and basic optimization.
Experience with dbt (or a similar tool): model development, tests, documentation, and version-control workflows.
Strong SQL skills for data modeling and analysis.
Proficiency with Python for pipeline development and automation.
Experience working with orchestration tools (Airflow, Dagster, Prefect, or equivalent).
Understanding of ETL/ELT design patterns, data lifecycle, and data modeling best practices.
Familiarity with cloud environments (AWS, GCP, or Azure).
Knowledge of data quality, observability, or monitoring concepts.
Good communication skills and the ability to collaborate with cross-functional teams.
Nice to Have:
Exposure to streaming/event technologies (Kafka, Kinesis, Pub/Sub).
Experience with data governance or cataloging tools.
Basic understanding of ML workflows or MLOps concepts.
Experience with infrastructure-as-code tools (Terraform, CloudFormation).
Familiarity with testing frameworks or data validation tools.
Additional Skills:
Cloud Architectures, Cross-Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full-Stack Development, Security-First Mindset, User Experience (UX).
This position is open to all candidates.