A fast-growing AI solutions company is looking for a Data Engineer to help power cutting-edge AI systems used by enterprises, governments, and top tech companies. You’ll play a key role in designing, optimizing, and maintaining data pipelines, working in a highly dynamic environment that thrives on innovation.

Responsibilities:
- Build, optimize, and maintain scalable ETL pipelines
- Develop and enhance data lakes and data warehouses (e.g., Snowflake, Databricks)
- Collaborate with engineers and clients to refine data processing workflows
- Conduct code and design reviews to maintain high-quality data infrastructure
- Mentor junior engineers and contribute to technical discussions
- Travel up to 30% for meetings with clients (as needed)

Minimum Requirements:
- 3+ years of experience in software development or Big Data
- Strong knowledge of Python and PySpark
- Hands-on experience with ETL frameworks (e.g., dbt)
- Experience with AWS services (IAM, S3, Security Groups)
- Proficiency in Infrastructure-as-Code (e.g., Terraform)
- Ability to communicate effectively and work in a fast-paced environment

Benefits & Salary:
- Salary: $109,000 - $121,000 (dependent on location and experience)
- Equity options for senior-level hires
- Remote-first with flexible work arrangements
- Comprehensive health, dental, and vision insurance
- Learning & development stipends
- Generous PTO & paid holidays