Hive Financial Systems
We are seeking a skilled Data Engineer to design, build, and maintain robust cloud-based data pipelines and architectures. The ideal candidate will have hands-on experience working with Microsoft Fabric and the broader Azure data ecosystem. This role is focused on delivering reliable, high-quality data to power analytics, reporting, and operational use cases.

Key Responsibilities
- Design, develop, and maintain scalable and efficient pipelines in Microsoft Fabric and Azure.
- Implement ETL/ELT processes to integrate data from diverse sources.
- Collaborate with data architects and analysts to deliver solutions aligned with business objectives.
- Leverage Microsoft Fabric and Azure services to build integrated, cloud-native data platforms.
Data Modeling & Warehousing
- Develop data models to support reporting, analytics, and machine learning use cases.
- Optimize data lakehouse/warehouse solutions for performance and cost-efficiency.
Performance & Quality
- Monitor, troubleshoot, and optimize pipelines for reliability, performance, and data quality.
- Apply best practices in data governance, security, and compliance.
- Evaluate new features and tools within Azure / Microsoft Fabric to improve efficiency.
- Contribute to team knowledge-sharing and process improvements.
Required Qualifications
- 3–6+ years of experience in data engineering or a related field.
- Hands-on experience with Microsoft Fabric or Azure data platforms.
Technical Skills
- Strong SQL and data modeling skills.
- Experience with ETL/ELT pipelines and orchestration (Azure Data Factory, Fabric pipelines, or similar).
- Programming proficiency in Python and/or PySpark.
- Familiarity with Azure Data Lake, Synapse, SQL DB, and Key Vault.
- Exposure to NoSQL databases (e.g., MongoDB Atlas) is a plus.
Soft Skills
- Strong problem-solving and analytical mindset.
- Effective communicator who can collaborate across teams.
- Comfortable working in a fast-paced, cloud-first environment.
Preferred Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Familiarity with DevOps practices, CI/CD, or containerization is a plus.
Interested in building your career at Hive Financial Systems? Get future opportunities sent straight to your email.