Cynet Systems Inc
Azure Data Engineer - Remote / Telecommute
Cynet Systems Inc, Dallas, Texas, United States, 75215
Job Description
- Develop, test, document, and support scalable data pipelines.
- Build and evolve data integrations, including APIs, to handle increasing data volume and complexity.
- Establish and follow data governance processes to ensure data availability, consistency, integrity, and security.
- Build, implement, and maintain scalable solutions aligned with data governance standards and architectural roadmaps.
- Collaborate with analytics and business teams to improve data models feeding business intelligence tools.
- Design and develop data integrations and a data quality framework; write unit, integration, and functional tests.
- Design, implement, and automate deployment of distributed systems for collecting and processing streaming events from multiple sources.
- Perform data analysis to troubleshoot and resolve data-related issues.
- Guide and mentor junior engineers on coding best practices and optimization.

Requirements / Must Have

- Bachelor’s degree in Computer Science, Mathematics, Statistics, or a related technical field, or equivalent experience.
- 5+ years of relevant experience in analytics, data engineering, business intelligence, or a related field.
- Strong programming skills in Python, PySpark, and SQL.
- Experience with Databricks.
- Experience developing integrations across multiple systems and APIs.
- Experience with cloud-based databases, specifically Azure technologies (Azure Data Lake, ADF, Azure DevOps, Azure Functions).
- Experience writing SQL queries for large-scale, complex datasets.
- Experience with data warehouse technologies and creating ETL/ELT jobs.
- Strong problem-solving and troubleshooting skills.
- Process-oriented with excellent documentation skills.

Preferred / Nice to Have

- Experience designing data schemas and operating SQL/NoSQL database systems.
- Experience with Kafka, Flink, Fivetran, and Matillion.
- Experience in Data Science and Machine Learning.
- Software engineering experience.
- Experience with Snowflake.
- Familiarity with Agile software development methodologies.

Skills

- Strong analytical and problem-solving skills.
- Ability to mentor and guide junior engineers.
- Effective collaboration and communication skills.
- Attention to detail and documentation discipline.