Trigyn Technologies Limited.

Senior Data Engineer (Azure Data Factory / Databricks)



Job Details: Senior Data Engineer (Azure Data Factory / Databricks)

Position Id: J0525-0081

Job Type: Contract (6-12 months)

Country: United States

Location: White Plains, NY

Pay Rate: Open

Job Description:

Our client - a major utility firm based in Westchester County, NY - has an immediate need for a Senior Data Engineer. The particulars of the position are as follows.

Job Functions & Responsibilities:

ETL & Data Integration:
• Design, develop, and optimize ETL pipelines using Azure Databricks, ADF, and Pentaho to support enterprise data workflows.
• Implement and maintain data movement, transformation, and integration across multiple systems.
• Ensure seamless data exchange between cloud, on-prem, and hybrid environments.
• Work with Globalscape FTP for secure file transfers and automation.

API Development and Integration:
• Develop, consume, and integrate RESTful and SOAP APIs to facilitate data exchange.
• Work with API gateways and authentication methods such as OAuth, JWT, certificates, and API keys.
• Implement and optimize API-based data extractions and real-time data integrations.

Data Quality & Governance:
• Implement data validation, cleansing, and enrichment techniques.
• Develop and execute data reconciliation processes to ensure accuracy and completeness.
• Adhere to data governance policies and security compliance standards.

BAU Support & Performance Optimization:
• Troubleshoot and resolve ETL failures, data load issues, and performance bottlenecks.
• Optimize SQL stored procedures and complex queries for better performance.
• Support ongoing enhancements and provide operational support for existing data pipelines.

Collaboration & Documentation:
• Work closely with Data Analysts, Business Analysts, and stakeholders to understand data needs.
• Document ETL processes, data mappings, and workflows for maintainability and knowledge sharing.
• Provide guidance and best practices to ensure scalability and efficiency of data solutions.

Required Skills & Experience:
• 7+ years of experience in ETL development, data integration, and SQL scripting.
• Strong expertise in Azure Databricks, ADF (Azure Data Factory), and Pentaho.
• Experience handling secure file transfers using Globalscape FTP.
• Hands-on experience in developing and consuming APIs (REST/SOAP).
• Experience working with API security protocols (OAuth, JWT, API keys, etc.).
• Proficiency in SQL, stored procedures, performance tuning, and query optimization.
• Understanding of data modeling, data warehousing, and data governance best practices.
• Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
• Strong problem-solving skills, troubleshooting abilities, and ability to work independently.
• Excellent communication skills and ability to work in a fast-paced environment.

Preferred Qualifications:
• Experience working in large-scale enterprise data integration projects.
• Knowledge of Python and PySpark for big data processing.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

Education & Certifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.

Nice-to-have certifications:
• Databricks Certified Data Engineer
• Azure Data Engineer Associate

TRIGYN TECHNOLOGIES, INC. is an EQUAL OPPORTUNITY EMPLOYER and has been in business for 35 years. TRIGYN is an ISO 27001:2022 and CMMI Level 5 certified company.
