A7 Recruitment
Data Pipelines
Design, build, and maintain end-to-end ETL pipelines for batch and streaming workloads using Azure Data Factory, Azure SQL, Synapse Analytics, and Databricks (see the sketch after this list).
Automate and orchestrate workflows to ensure efficient data ingestion, transformation, and loading into the data warehouse and marts.
Integrate pipelines with CI/CD practices (Git, Azure DevOps, automated testing, deployment pipelines).
Monitor, debug, and optimize pipelines to meet SLAs for data timeliness, performance, and reliability.
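For illustration only, the sketch below shows what one batch step of such a pipeline could look like in PySpark; the lake path, table name, and business rules are placeholder assumptions rather than details from this posting, and in practice the script would run as a Databricks job or an Azure Data Factory activity within an orchestrated pipeline.

    # Minimal batch ETL step (illustrative sketch). Paths, table names, and
    # rules are assumptions, not requirements of the role.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_batch").getOrCreate()

    # Extract: read raw order files landed in the data lake (hypothetical path).
    raw = (
        spark.read
        .option("header", True)
        .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/2024-06-01/")
    )

    # Transform: enforce types, drop obviously bad rows, add load metadata.
    orders = (
        raw
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
        .withColumn("load_date", F.current_date())
    )

    # Load: append the cleaned batch to a warehouse staging table.
    orders.write.mode("append").saveAsTable("staging.orders_daily")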
Data Modeling & Warehouse Development
Develop conceptual, logical, and physical data models to represent business entities and relationships.
Implement fact and dimension tables following best practices in dimensional modeling (e.g., star or snowflake schema); see the schema sketch after this list.
Translate business requirements into scalable warehouse schemas in Azure SQL Database and Azure Synapse.
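As an illustrative sketch (not a prescribed model), the snippet below creates one dimension table and one fact table of a star schema in Azure SQL Database from Python via pyodbc; the server, credentials, and column names are placeholders.

    # Sketch: one dimension and one fact table of a star schema in Azure SQL.
    # Server, database, credentials, and column names are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=example-server.database.windows.net;"
        "DATABASE=example_dw;UID=etl_user;PWD=<secret>"
    )
    cursor = conn.cursor()

    cursor.execute("""
        CREATE TABLE dim_customer (
            customer_key  INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
            customer_id   NVARCHAR(50) NOT NULL,          -- natural key
            customer_name NVARCHAR(200),
            segment       NVARCHAR(50)
        );
    """)

    cursor.execute("""
        CREATE TABLE fact_sales (
            sales_key    BIGINT IDENTITY(1,1) PRIMARY KEY,
            customer_key INT NOT NULL REFERENCES dim_customer(customer_key),
            date_key     INT NOT NULL,                     -- FK to a date dimension
            quantity     INT,
            revenue      DECIMAL(18,2)
        );
    """)

    conn.commit()
    conn.close()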
ETL Process
Extract: Build robust connectors for multiple sources (e.g., APIs, MariaDB, flat files, SaaS systems).
Validate & Cleanse: Apply business rules from the data model, standardize formats, enforce data types, and resolve anomalies.
Transform & Aggregate: Shape data into target schemas, enrich datasets, and summarize measures (e.g., revenue per customer, churn KPIs); see the sketch after this list.
Load: Populate warehouse and data marts with clean, transformed data aligned to the physical data model.
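The sketch below walks through these steps at small scale with pandas; the input file, column names, and business rules are assumed purely for illustration.

    # Sketch of Validate & Cleanse and Transform & Aggregate using pandas.
    # Input file, columns, and rules are illustrative assumptions.
    import pandas as pd

    # Extract: a flat-file source (hypothetical path).
    orders = pd.read_csv("orders_extract.csv")

    # Validate & Cleanse: enforce types, standardize formats, drop anomalies.
    orders["order_date"] = pd.to_datetime(orders["order_date"], errors="coerce")
    orders["amount"] = pd.to_numeric(orders["amount"], errors="coerce")
    orders = orders.dropna(subset=["order_id", "customer_id", "amount"])
    orders = orders[orders["amount"] >= 0]
    orders["customer_id"] = orders["customer_id"].str.strip().str.upper()

    # Transform & Aggregate: one example measure, revenue per customer.
    revenue_per_customer = (
        orders.groupby("customer_id", as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "total_revenue"})
    )

    # Load: write the curated output for loading into the warehouse.
    revenue_per_customer.to_csv("revenue_per_customer.csv", index=False)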
Data Quality & Governance
Implement automated data quality checks for accuracy, completeness, consistency, and lineage tracking (see the sketch after this list).
Collaborate with the BI team to define governance processes, including data ownership, documentation, and access guidelines.
Ensure compliance with security and regulatory standards (e.g., HIPAA) and with requirements for handling PII.
Maintain metadata catalogs and lineage tracking within Azure Purview or similar tools.
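As a minimal sketch of such checks, the function below reports completeness, uniqueness, and a simple consistency metric for a pandas DataFrame; the column names and sample data are assumptions, and a dedicated framework (e.g., Great Expectations) could serve the same purpose in practice.

    # Sketch: simple automated data quality checks (completeness, uniqueness,
    # consistency). Column names, thresholds, and sample data are illustrative.
    import pandas as pd

    def run_quality_checks(df: pd.DataFrame, key_col: str, required_cols: list[str]) -> dict:
        results = {}
        # Completeness: share of non-null values in each required column.
        for col in required_cols:
            results[f"completeness_{col}"] = float(df[col].notna().mean())
        # Uniqueness: the business key should not repeat.
        results["duplicate_keys"] = int(df[key_col].duplicated().sum())
        # Consistency example: no negative amounts, if that column exists.
        if "amount" in df.columns:
            results["negative_amounts"] = int((df["amount"] < 0).sum())
        return results

    if __name__ == "__main__":
        sample = pd.DataFrame(
            {"order_id": [1, 2, 2],
             "customer_id": ["A", None, "B"],
             "amount": [10.0, -5.0, 20.0]}
        )
        print(run_quality_checks(sample, "order_id",
                                 ["order_id", "customer_id", "amount"]))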
Data Marts & Business Enablement
Deliver curated data marts tailored for Sales, Finance, and Marketing analytics (see the sketch after this list).
Partner with BI developers to ensure marts meet reporting needs for Power BI and other visualization tools.
Provide datasets that support advanced analytics initiatives.
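For illustration, the sketch below materializes one curated Sales mart table from the hypothetical star schema shown earlier, assuming a Databricks/Delta environment; the schema, dim_date table, and measure names are placeholders rather than a prescribed design.

    # Sketch: materialize a curated Sales mart summary for BI consumption.
    # The mart_sales schema, dim_date table, and column names are assumed to
    # exist and follow the illustrative star schema above.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("build_sales_mart").getOrCreate()

    spark.sql("""
        CREATE OR REPLACE TABLE mart_sales.monthly_revenue AS
        SELECT
            d.segment,
            dt.year_month,
            SUM(f.revenue)                 AS total_revenue,
            COUNT(DISTINCT f.customer_key) AS active_customers
        FROM fact_sales f
        JOIN dim_customer d ON d.customer_key = f.customer_key
        JOIN dim_date dt    ON dt.date_key    = f.date_key
        GROUP BY d.segment, dt.year_month
    """)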
Requirements
Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
5-7 years of experience in data engineering.
Strong proficiency in SQL and Python for building ETL pipelines.
Proven experience with Azure Data Services (Data Factory, Azure SQL, Synapse, Databricks, Data Lake).
Solid understanding of ETL pipelines and data warehouse architectures.
Experience designing and implementing data models (conceptual, logical, and physical).
Knowledge of data quality frameworks and governance practices.
Familiarity with CI/CD workflows (Git, Azure DevOps, or equivalent).