TPI Global Solutions
Let’s Talk | Certified Recruiter | Hiring for FinTech & Crypto Giants, State & Federal Clients | Cloud, AI/ML, Cybersecurity, SAP & Data Engineering…
Location:
Montgomery, AL (Onsite)
Duration:
Contract to Hire
Visa:
USC or GC Only
Experience Needed
5–7 years in data engineering or database development.
Hands‑on experience with SQL Server ETL/ELT pipelines.
Experience integrating pipelines with cloud services (AWS Glue, Azure Data Factory, GCP Dataflow).
Familiarity with streaming technologies (Kafka, Kinesis).
Experience in data modeling and architecture design.
Exposure to DevOps automation (Terraform, Ansible) and containerization (Docker, Kubernetes).
DevOps and automation maturity with certifications (HashiCorp Terraform Associate, AWS DevOps Engineer).
Preferred: Advanced programming depth with applied coursework or certifications (Python Institute PCPP, Scala Professional Certification).
Preferred: Data modeling specialization with advanced coursework or vendor‑specific training (Snowflake, AWS Big Data Specialty).
Education
Bachelor’s degree in Computer Science, Software Engineering, or related technical field.
Certifications (Preferred)
Google Professional Data Engineer
Software Use
SQL Server (ETL/ELT pipelines, stored procedures).
Orchestration tools (Airflow, DBT).
Cloud integration services (AWS Glue, Azure Data Factory, GCP Dataflow).
Observability tools (OpenLineage, Monte Carlo).
Seniority level
Mid-Senior level
Employment type
Contract
Job function
Information Technology
Industries
Staffing and Recruiting, IT Services and IT Consulting, and Data Infrastructure and Analytics