RIT Solutions, Inc.
Senior (12+ years) Azure Data Analyst with extensive experience working with the Azure Data suite listed below. THE CLIENT WOULD LIKE TO SEE CERTIFICATIONS.
****CANDIDATES MUST HAVE RECENT CAPITAL MARKETS/TRADING AND/OR HEDGE FUND EXPERIENCE AND EXCELLENT COMMUNICATION SKILLS. HOT & MOVING FAST!!
*** Candidate must-haves on a resume and for submittal:
1. How many years working with: Azure Data Analyst
2. How many years working with: Azure Data Factory (ADF)
3. How many years working with: Azure Databricks (highlighted expertise)
4. How many years working with: Logic Apps
5. How many years working with: Capital Markets/Hedge Funds
Technical Skills: Programming & Tools:
- 10+ years of experience in SQL and Python; .NET is a plus.
- 5+ years of experience in Azure cloud services, including: Azure SQL Server, Azure Data Factory (ADF), Azure Databricks (highlighted expertise), Azure Data Lake Storage (ADLS), Azure Key Vault, Azure Functions, and Logic Apps.
- 5+ years of experience with Git and deploying code using CI/CD pipelines.
Certifications (Preferred):
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Certified Data Engineer Associate or Professional
Responsibilities:
Data Pipeline Development:
Create and manage scalable data pipelines to collect, process, and store large volumes of data from various sources.
Data Integration:
Integrate data from multiple sources, ensuring consistency, quality, and reliability.
Database Management:
Design, implement, and optimize database schemas and structures to support data storage and retrieval.
ETL Processes:
Develop and maintain ETL (Extract, Transform, Load) processes to ensure accurate and efficient data movement between systems.
Data Warehousing:
Build and maintain data warehouses to support business intelligence and analytics needs.
Performance Optimization:
Optimize data processing and storage performance for efficient resource utilization and quick data retrieval.
Documentation:
Create and maintain comprehensive documentation for data pipelines, ETL processes, and database schemas.
Monitoring and Troubleshooting:
Monitor data pipelines and systems for performance and reliability, troubleshooting and resolving issues as they arise.
Technology Evaluation:
Stay updated with emerging technologies and best practices in data engineering, evaluating and recommending