Jobs via Dice
Overview
Excellerate Consulting is seeking a Senior ETL Developer / Data Engineer to design, develop, and optimize ETL workflows and data pipelines using Snowflake and AWS services. Apply via Dice.
Responsibilities
- Design, develop, and optimize ETL workflows and data pipelines using Snowflake and AWS services.
- Implement data ingestion from various sources, including APIs, databases, and flat files.
- Ensure data quality, integrity, and consistency across all ETL processes.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements.
- Monitor and troubleshoot ETL jobs and performance issues.
- Automate data workflows and implement CI/CD practices for data pipeline deployment.
- Maintain documentation for ETL processes, data models, and data flow diagrams.
Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 9+ years of experience in ETL development and data engineering.
- Hands-on experience with Snowflake, including data modeling, performance tuning, and SQL scripting.
- Proficiency in AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch.
- Strong programming skills in Python or Scala for data processing.
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
Job Details
Employment type: Full-time
Location: San Diego, CA
Salary: $120,000-$135,000