Data Engineer
Assignment Description
We are seeking a Data Engineer for an assignment delivered in a hybrid setup, where approximately half of the work is expected to be carried out on-site. In this role, you will work with technologies such as Databricks, including Apache Spark, Delta Lake, notebooks, job orchestration, and performance optimization, as well as Microsoft Azure services such as Azure Data Lake Storage. The work also involves DevOps and CI/CD practices, including version control, automated testing, deployment pipelines, and infrastructure as code.
As a person, you are comfortable working in the early stages of solution design and contributing throughout the full data engineering lifecycle: ideation, high-level and low-level architecture, requirements specification, functional and technical design, estimation, sprint planning, development, testing, documentation, deployment, and operational follow-up. You are passionate about creating clean, scalable, resilient, and cost-efficient data solutions on modern cloud-based platforms, with a strong focus on maintainability and reusability. You also have strong communication skills and can clearly explain data architectures, pipelines, and trade-offs to stakeholders without a technical background.
Qualifications
- Proven experience applying DevOps methodologies, including version control (for example Git), CI/CD workflows, automated testing, promotion across environments, and infrastructure as code within data platform contexts
- Strong familiarity with Microsoft Azure, especially services typically used in data platforms such as Azure Data Lake Storage
- Practical experience working with Databricks, including Apache Spark, Delta Lake, orchestration of jobs, performance tuning, and management of environments
- Demonstrated experience in developing and delivering complete data pipelines, both batch and streaming, covering ingestion, transformation, and serving layers, following established practices in data modeling and lakehouse architecture
- Solid understanding of principles and standards within data engineering, including release workflows and quality requirements such as data validation, performance tuning, monitoring, and issue resolution in production environments
Location Requirement: 50% onsite in Malmö
Language Requirement: Swedish or English
Details
Reference: 170366
Location: Malmö, SE
Remote work: Hybrid
Workload: 100%
Start date: 2026-04-01
End date: 2026-08-31
Apply by: 2026-03-24
Publication date: 2026-03-18