Purple Drive
Title: AWS Databricks Engineer
Location: Wilmington, DE - Onsite
Type: Contract Opportunity

Job Overview
We are seeking a highly skilled AWS Databricks Engineer to join our team for an onsite contract role in Wilmington, DE. The ideal candidate will have strong experience with AWS services, Terraform for infrastructure automation, and Databricks for large-scale data processing and analytics. Proficiency in Scala, Python, and Java is essential to design, develop, and optimize data solutions in a cloud-native environment.
Key Responsibilities
- Design, develop, and optimize data pipelines and ETL processes using Databricks (DBX).
- Work extensively with AWS services (EMR, EKS) to build scalable and secure data platforms.
- Implement and manage Infrastructure as Code (IaC) using Terraform.
- Develop and maintain solutions using Scala, Python, and Java for data processing and integration.
- Collaborate with cross-functional teams to ensure data solutions meet business requirements, scalability, and performance needs.
- Ensure data quality, governance, and security best practices are applied across the architecture.
- Troubleshoot, optimize, and monitor data workflows to ensure high performance and reliability.

Required Skills & Qualifications
- 8+ years of experience in Data Engineering / Cloud Engineering.
- Strong expertise with AWS EMR and EKS.
- Proven hands-on experience with Terraform for infrastructure automation.
- Advanced knowledge of Databricks (DBX) for building large-scale data solutions.
- Proficiency in programming languages: Scala, Python, and Java.
- Solid understanding of data engineering concepts, including data modeling, pipelines, and distributed computing.
- Experience with big data ecosystems and handling both structured and unstructured data.
- Strong problem-solving and analytical skills with excellent communication abilities.

Preferred Skills
- Experience with CI/CD pipelines and DevOps best practices.
- Familiarity with modern data warehousing solutions and cloud-native analytics platforms.
- Exposure to real-time data processing frameworks.