CyberTec

Data Engineer

CyberTec, Houston, Texas, United States, 77246


Role:

Data Engineer, Houston, Texas (hybrid, local candidates only)
Rate: $65/hr C2C
Client: manufacturing company
Visa: no H-1B; genuine GC holders only
Duration: 6-12 months

Are you a skilled Data Engineer with expertise in Azure, ADF, AI/ML, and Microsoft Fabric? We're looking for a talented individual to join our team in a mid- to senior-level position.

About the Role: Join us in Houston, Texas, for a Hybrid/onsite position as a Data Engineer focusing on cloud technology, data pipelines, and intelligent systems. If you're passionate about building scalable data solutions using Azure Data Factory, integrating AI/ML models, and leveraging Microsoft Fabric, this role is perfect for you.

Key Responsibilities:
- Design, build, and maintain robust data pipelines with Azure Data Factory (ADF) and Azure Synapse.
- Develop and optimize data models and ETL/ELT processes for both structured and unstructured data.
- Collaborate with data scientists to deploy AI/ML models in production environments.
- Utilize Microsoft Fabric for unified data experiences in analytics, governance, and business intelligence.
- Ensure data quality, security, and compliance throughout the data lifecycle.
- Translate business requirements into scalable data solutions with cross-functional teams.

Required Skills & Experience:
- 5+ years of experience in data engineering or related roles.
- Hands-on expertise with Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure SQL.
- Proficiency in Python, SQL, and data transformation tools.
- Experience deploying AI/ML models in production environments.
- Familiarity with Microsoft Fabric components like OneLake, Power BI, and Data Activator.
- Strong understanding of data architecture, governance, and DevOps practices.

Qualifications:
- Microsoft certifications such as Azure Data Engineer Associate or Azure AI Engineer.
- Experience with real-time data streaming tools like Azure Stream Analytics or Kafka.
- Knowledge of MLOps and CI/CD pipelines for data workflows.