About Us
Komodo Consulting is a technology and strategy firm specializing in Digital Transformation. Operating in Portugal and Poland, we provide IT Consulting & Nearshore services. We support both public and private sector organizations through two main areas:
Consulting with a focus on strategy, investment analysis, and digital process improvement;
IT Team Augmentation, helping clients scale and strengthen their tech teams.
The project
We are seeking a Senior AI Data Engineer (Microsoft Fabric & Azure) to work on a project for a software house.
You will have the following responsibilities:
Design, build, and maintain scalable data pipelines on Microsoft Fabric (Data Factory, Dataflows, Lakehouse, Warehouse, OneLake);
Develop and optimize data ingestion, transformation, and orchestration processes using Azure Data Factory, Azure Databricks, Azure Synapse, and Azure Functions;
Implement Delta Lake and lakehouse architectures and build reusable data products to support AI, machine learning, and analytics workloads;
Partner with Data Scientists and ML Engineers to prepare feature-rich datasets, operationalize models, and automate MLOps workflows using Azure Machine Learning, Fabric Data Science experiences, or Databricks;
Implement data quality, observability, monitoring, lineage, and governance frameworks using Fabric and Azure Purview, ensuring compliance with data privacy and security standards;
Manage CI/CD pipelines for data code using Azure DevOps or GitHub;
Collaborate with business stakeholders to understand requirements, deliver high-value data solutions, maintain documentation, and mentor junior developers.
You need to have the following skills/experience:
10+ years of experience as a Data Engineer, AI Engineer, or in a similar data-focused role;
Strong hands-on experience with Microsoft Fabric, including Lakehouse, Data Engineering, Data Factory, and Data Pipelines;
Expert knowledge of Azure data services such as Azure Data Factory, Azure Databricks/Spark, Azure Synapse/SQL Pools, Azure Storage (ADLS Gen2), Azure Functions, and Azure Machine Learning;
Solid proficiency in Python and SQL for data processing and transformation;
Proven experience designing and implementing data lakehouse architectures and Delta Lake pipelines;
Good understanding of modern DevOps/MLOps practices, including version control, testing, and automated deployments;
Familiarity with data governance and cataloging frameworks using Azure Purview or Fabric governance.
English level: C1 or higher.
Location: Full Remote - Portugal