Data Analytics Engineer (IRC292594)
Description
Job Description
- Working knowledge of SQL and data transformation
- Understanding of data warehouses and business intelligence tools
- 2+ years of experience as a data analyst
- Problem-solving and analytical skills to troubleshoot data issues and technical challenges
- Good English skills
Nice to have:
- Experience with Looker
- Experience with dbt
- Experience writing analytical Python
Technologies:
- AWS
- Databases: Snowflake, PostgreSQL, BigQuery
- dbt
- BI tools: Looker, Tableau (optional)
- Metaplane
- Git

Job Responsibilities
- Model and analyze data
- Work on data quality validation
- Leverage various data warehouses, ETL tools, and BI platforms for client projects
- Maintain deliverables, projects, and timelines for your client engagements
- Communicate findings clearly to a broad range of stakeholders
- Write SQL scripts to validate data in the data warehouse against the data in the source system(s), and to validate data surfaced in BI assets against the underlying data sources
- Provide training and support to users, improving their experience with BI tools

Department/Project Description

About the client: Our client is a fast-paced digital healthcare company that builds a growing portfolio of online health communities for people with chronic conditions. The company's mission is to improve patients' quality of life by connecting them with each other, with caregivers, and with healthcare industry partners, fostering beneficial social interactions and meaningful health conversations.

About the project: The core of the project is working with data; the goal is to redesign data-driven solutions, transition them to the cloud, and enhance them. The team is responsible for expanding and optimizing the data flow and data pipeline architecture, modeling and analyzing data, and interpreting trends and patterns in complex data sets to translate them into product and marketing insights.

About the position: The Data Analytics Engineer is responsible for bringing in data from different sources, modeling it with dbt, and developing and performing data quality checks across all data assets, including ETL jobs, reports, dashboards, data pipelines, and data applications. The primary goal of this role is to ensure high-quality data is delivered to internal stakeholders and customers.
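As a rough illustration of the SQL-based validation responsibility described above (not part of the posting itself), a count-and-checksum comparison between a source system and the warehouse might look like the following Python sketch. The table and column names are hypothetical, and in-memory SQLite databases stand in for the real source system and Snowflake:

```python
import sqlite3

def validate_table(source_conn, warehouse_conn, table, column):
    """Compare row counts and a column checksum between the source-system
    and warehouse copies of a table. Returns a small reconciliation report."""
    query = f"SELECT COUNT(*), COALESCE(SUM({column}), 0) FROM {table}"
    src_count, src_sum = source_conn.execute(query).fetchone()
    wh_count, wh_sum = warehouse_conn.execute(query).fetchone()
    return {
        "table": table,
        "count_match": src_count == wh_count,
        "sum_match": src_sum == wh_sum,
        "source_count": src_count,
        "warehouse_count": wh_count,
    }

# Demo: in-memory SQLite stands in for the source system and the warehouse.
source = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")
rows = [(1, 10.0), (2, 20.0)]
for conn in (source, warehouse):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

report = validate_table(source, warehouse, "orders", "amount")
```

In practice the two connections would point at different systems (e.g. PostgreSQL and Snowflake), and the same pattern extends to per-partition counts or column-level aggregates rather than whole-table sums.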