G T Limited

Data Engineer (Azure) | KD Pharma

G T Limited, Washington, District of Columbia, US, 20022



GT was founded in 2019 by a former Apple, Nest, and Google executive. GT's mission is to connect the world's best talent with product careers offered by high-growth companies in the UK, USA, Canada, Germany, and the Netherlands. On behalf of KD Pharma, GT is looking for a Data Engineer (Azure) interested in building and scaling a modern data platform to support Finance, Operations/Supply Chain, and Quality/Manufacturing functions.

About the Client & the Project

Founded in 1988, KD Pharma is a pure-play, technology-driven CDMO (Contract Development & Manufacturing Organization) dedicated to revolutionizing pharmaceutical and nutraceutical production. They are uniquely recognized for offering ultra-pure Omega-3 concentrates at commercial scale through patented supercritical fluid chromatography technologies. With operations in Germany, Norway, the UK, the USA, Canada, and Peru, and a sales presence across Asia, they deliver end-to-end solutions from custom synthesis to finished dosage forms while adhering to cGMP and global certifications. The project is to establish a robust Azure-based data platform for business intelligence (BI). It includes assessing Microsoft Fabric vs. Azure Data Factory (ADF) and, if needed, re-platforming to a scalable ADF-led architecture.

About the Role

This role will focus on improving data quality, lineage, reliability, and time-to-insight, while helping reassess Fabric vs. an Azure Data Factory architecture. This is a high-impact role offering the long-term opportunity to own and shape the Azure data platform and grow as a trusted data leader in a global organization.

Success Measures:
- Audit the current estate and define a migration plan
- Build ADF pipelines for priority sources (Business Central, TrackWise)
- Achieve >98% dataset refresh success
- Establish baseline data quality checks & lineage
- Deliver a consolidated Lakehouse/Warehouse with governed semantic models, optimized for cost & performance, with documented controls
- Stakeholder CSAT/NPS ≥ 8/10

Responsibilities

- Own the Azure data platform architecture & roadmap (ADF vs. Fabric; Synapse/Databricks evaluation)
- Design, build, and operate ETL/ELT pipelines into ADLS/Warehouse
- Model data for Power BI (DAX/Tabular)
- Implement data quality, lineage, governance & security (Purview, RBAC, CI/CD)
- Partner with BI analysts to deliver reusable, trusted semantic models and dashboards
- Drive reliability & cost optimization (monitoring, alerting, SLAs)
- Support immediate projects: Business Central (ERP + MES), TrackWise (QMS), ECC6 extracts

Essential Knowledge, Skills & Experience

- Experience level: 4-6 years
- Strong expertise in Azure Data Factory & Azure Data Lake Storage Gen2
- Advanced SQL/T-SQL
- Power BI (DAX, Tabular modeling, deployment pipelines)
- Python or PySpark
- Git & Azure DevOps (CI/CD pipelines)
- Dimensional modeling
- Security & RBAC

Nice-to-Have

- Experience with Synapse, Databricks, and Delta Lake
- Knowledge of Microsoft Purview and IaC (Bicep/Terraform)
- Familiarity with ML basics
- Background in regulated manufacturing/pharma (GxP), which can be learned

Soft Skills

- Strong communication & collaboration abilities
- Pragmatic "architect-builder" mindset, able to balance strategy with hands-on delivery
- Comfort in leading technology choices and engaging stakeholders
- Results-driven, with a focus on data reliability, governance, and business value

Interview Steps

1. GT interview with Recruiter
2. Technical interview
3. Final interview
4. Reference check
5. Offer