RIT Solutions, Inc.

Data Engineer

RIT Solutions, Inc., Atlanta, Georgia, United States, 30383


Job Title - Data Engineer (Snowflake, Azure, ETL, SQL)
Location - Remote
Duration - 6+ months
Interview - Video

Job Description:

Responsibilities:

- Lead the development and management of ETL processes for Program Integrity data marts in Snowflake and SQL Server.
- Design, build, and maintain data pipelines using Azure Data Factory (ADF) to support ETL operations and Peer Group Profiling procedures and outputs.
- Implement and manage DevOps practices to ensure efficient data loading and workload balancing within the Program Integrity data mart.
- Collaborate with cross-functional teams to ensure seamless data integration and consistent data flow across platforms.
- Monitor, optimize, and troubleshoot data pipelines and workflows to maintain performance and reliability.
- Independently identify and resolve complex technical issues to maintain operational efficiency.
- Communicate effectively and foster collaboration across teams to support project execution and alignment.
- Apply advanced analytical skills and attention to detail to uphold data accuracy and quality in all deliverables.
- Lead cloud-based project delivery, ensuring adherence to timelines, scope, and performance benchmarks.
- Ensure data security and compliance with relevant industry standards and regulatory requirements.

TOP 3 REQUIREMENTS:

1. Over 5 years of hands-on expertise in cloud-based data engineering and data warehousing, with a strong emphasis on building scalable and efficient data solutions.
2. More than 2 years of focused experience in Snowflake, including the design, development, and maintenance of automated data ingestion workflows using Snowflake SQL procedures and Python scripting (see the sketch after this list).
3. Practical experience with Azure data tools such as Azure Data Factory (ADF), SQL Server, and Blob Storage, alongside Snowflake.
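For illustration, a minimal sketch of the kind of automated ingestion workflow described above: Python scripting that invokes a Snowflake SQL stored procedure via the Snowflake Python connector. The connection parameters and the LOAD_DAILY_CLAIMS procedure are hypothetical placeholders, not part of this posting.

```python
import os
import snowflake.connector

# Hypothetical connection parameters -- substitute real account details.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_service",
    password=os.environ["SNOWFLAKE_PWD"],
    warehouse="ETL_WH",
    database="PI_DATAMART",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # LOAD_DAILY_CLAIMS is a hypothetical Snowflake SQL stored procedure
    # that ingests one day of claims data; %s is bound client-side.
    cur.execute("CALL LOAD_DAILY_CLAIMS(%s)", ("2024-06-01",))
    print(cur.fetchone()[0])  # a procedure call returns a single status row
finally:
    conn.close()
```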

Additional Requirements:

- Skilled in managing various data formats (CSV, JSON, VARIANT) and executing data loading/exporting tasks using SnowSQL, with orchestration via ADF (illustrated in the sketch after this list).
- Proficient in using data science and analytics tools like Snowpark, Apache Spark, pandas, NumPy, and Scikit-learn for complex data processing and modeling.
- Strong experience with Python and other scripting languages for data manipulation and automation.
- Proven ability to develop cloud-native ETL processes and data pipelines using modern technologies on Azure and Snowflake.
- Demonstrated excellence in analytical thinking and independent problem-solving.
- Strong interpersonal skills with a track record of effective communication and teamwork.
- Consistent success in delivering projects within cloud environments, meeting performance and quality standards.
- Working knowledge of medical claims processing systems, including familiarity with core functionalities, workflows, and data structures used in healthcare claims management.
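As a sketch of the loading/exporting tasks named above, the following runs the same COPY INTO statements that SnowSQL would accept, here issued through the Python connector. The CLAIMS_RAW table (a single VARIANT column), the CLAIMS_STAGE stage, and the CLAIMS_SUMMARY view are hypothetical placeholders.

```python
import os
import snowflake.connector

# Hypothetical connection; same placeholder parameters as the earlier sketch.
conn = snowflake.connector.connect(
    account="my_account", user="etl_service",
    password=os.environ["SNOWFLAKE_PWD"],
    warehouse="ETL_WH", database="PI_DATAMART", schema="CLAIMS",
)
cur = conn.cursor()

# Load JSON files from a named stage into a table with a single
# VARIANT column (table and stage are hypothetical).
cur.execute("""
    COPY INTO CLAIMS_RAW
    FROM @CLAIMS_STAGE/daily/
    FILE_FORMAT = (TYPE = 'JSON')
    ON_ERROR = 'SKIP_FILE'
""")

# Export a curated view back to the stage as CSV.
cur.execute("""
    COPY INTO @CLAIMS_STAGE/exports/claims_summary/
    FROM CLAIMS_SUMMARY
    FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    OVERWRITE = TRUE
""")

conn.close()
```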

Preferred:

- Certification in Azure or Snowflake
- Experience with data modeling and database design
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

Experience that will set candidates apart:

- Recent medical claim processing experience.
- Data science and analytics experience.

Ideal Background:

- Experience setting up DDL, mapping data, and building extract, transform, and load (ETL) procedures in Snowflake and SQL Server.
- Experience with Azure services, including Azure Data Factory pipelines for ETL and movement of data between Snowflake and SQL Server.
- Experience identifying and creating balancing procedures for medical claim data, and the ability to resolve balancing issues (a minimal sketch follows this list).
- Ability to clearly communicate processes to both technical and non-technical staff.
- Experience creating algorithms or models with tools like Snowpark, Apache Spark, pandas, NumPy, and Scikit-learn for complex data processing and modeling.
- Experience with Python for data manipulation and automation.
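For illustration, a minimal sketch of one kind of balancing procedure, assuming claims have already been moved from SQL Server to Snowflake: it runs the same count/sum query on both sides and flags any mismatch. All server names, credentials, and the CLAIMS table and its columns are hypothetical placeholders.

```python
import os
import pyodbc
import snowflake.connector

# Hypothetical servers, credentials, and CLAIMS schema -- placeholders only.
mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.com;DATABASE=PI_SOURCE;"
    "UID=etl_reader;PWD=" + os.environ["MSSQL_PWD"]
)
sf = snowflake.connector.connect(
    account="my_account",
    user="etl_service",
    password=os.environ["SNOWFLAKE_PWD"],
    warehouse="ETL_WH",
    database="PI_DATAMART",
    schema="CLAIMS",
)

# Run the same balancing query on both sides: row count and total
# paid amount for one service month.
balance_sql = (
    "SELECT COUNT(*), SUM(paid_amount) FROM CLAIMS "
    "WHERE service_month = '2024-06'"
)
src_rows, src_paid = mssql.cursor().execute(balance_sql).fetchone()
tgt_rows, tgt_paid = sf.cursor().execute(balance_sql).fetchone()

mssql.close()
sf.close()

if (src_rows, src_paid) != (tgt_rows, tgt_paid):
    raise RuntimeError(
        f"Balancing failure: source {src_rows} rows / {src_paid} paid "
        f"vs target {tgt_rows} rows / {tgt_paid} paid"
    )
print("Claims balance: counts and paid amounts match.")
```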