Georgia Staffing
ETL Developer
The ETL Developer is responsible for designing, developing, and maintaining data integration solutions that ensure the accurate and efficient movement of data across enterprise systems. This role requires a strong background in database development, data warehousing concepts, and hands-on experience with ETL tools. The ideal candidate is analytical, detail-oriented, and comfortable collaborating with data engineers, analysts, and business stakeholders to deliver high-quality data solutions.

Key Responsibilities:
- Design, develop, and maintain ETL processes and data pipelines to extract, transform, and load data from multiple source systems into data warehouses or data lakes.
- Work with SQL, stored procedures, and ETL tools (such as Informatica, Talend, SSIS, or DataStage) to build scalable data workflows.
- Collaborate with business and technical teams to define data requirements, integration rules, and transformation logic.
- Ensure data quality, consistency, and accuracy through validation, error handling, and audit processes.
- Optimize ETL performance by tuning SQL queries and improving process efficiency.
- Support existing data integration jobs by troubleshooting and resolving production issues.
- Create and maintain detailed technical documentation of ETL processes, mappings, and data flow diagrams.
- Participate in code reviews, version control, and change management processes to ensure compliance with standards.
- Work closely with data architects to implement best practices for data modeling and pipeline design.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of hands-on experience in ETL development or data integration roles.
- Proficiency in SQL and experience with one or more ETL tools such as Informatica, SSIS, Talend, DataStage, or AWS Glue.
- Strong understanding of data warehousing concepts, dimensional modeling, and relational databases (SQL Server, Oracle, PostgreSQL, or Snowflake).
- Experience working with data in cloud environments such as AWS, Azure, or Google Cloud.
- Excellent troubleshooting, debugging, and performance-tuning skills.
- Ability to collaborate across teams and communicate complex technical issues clearly.

Preferred:
- Experience with Python or shell scripting for automation and data manipulation.
- Familiarity with big data technologies (Spark, Hadoop, Databricks) or modern ELT frameworks (dbt).
- Knowledge of CI/CD pipelines and source control tools such as Git.
- Experience with API integrations and JSON/XML parsing.