Motion Recruitment
A leading national automotive finance and warranty services provider headquartered in Scottsdale, Arizona, is seeking a highly skilled Senior Data Engineer to join our growing Data Engineering team. We're looking for a hands-on technical expert with a passion for architecting scalable data solutions that support enterprise-wide reporting, compliance, and analytics.
Qualifications:
- A bachelor's degree in Computer Science, Information Systems, or a related field.
- 10+ years of experience in data engineering or closely related roles.
- Extensive experience building and maintaining ETL/ELT pipelines for both batch and streaming data.
- Advanced SQL skills (complex joins, window functions, CTEs) and strong proficiency in Python for automation, pipeline development, and data validation; a brief illustrative query follows this list.
- Proven success with cloud-based data platforms such as Snowflake, Databricks, and AWS (e.g., S3, Glue, Redshift, Lambda, Secrets Manager).
- Solid knowledge of data warehousing principles, including dimensional modeling, star schemas, slowly changing dimensions, and aggregate strategies.
- Experience working with unstructured and semi-structured data (e.g., JSON, APIs, NoSQL databases).
- Skilled in using workflow orchestration tools like Airflow or Prefect.
- Familiar with data transformation and integration tools like dbt, Talend, or Fivetran.
- Strong understanding of data governance and regulatory compliance frameworks such as PCI DSS, GDPR, SOX, and CPRA.
- Bonus: Experience with lakehouse technologies such as Delta Lake, Apache Iceberg, or Hudi.
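For a sense of what "advanced SQL" means in this role, here is a minimal, purely illustrative sketch that combines a CTE with a window function. It runs against an in-memory SQLite database, and the warranty_contracts table and its columns are invented for the example; they do not reflect the client's actual schema.

```python
# Illustrative only: a CTE feeding a window function, run against an
# in-memory SQLite database. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE warranty_contracts (
        contract_id INTEGER PRIMARY KEY,
        dealer_id   INTEGER NOT NULL,
        sold_month  TEXT    NOT NULL,   -- e.g. '2024-06'
        premium     REAL    NOT NULL
    );
    INSERT INTO warranty_contracts VALUES
        (1, 10, '2024-06', 1200.0),
        (2, 10, '2024-06',  950.0),
        (3, 20, '2024-06', 1800.0),
        (4, 20, '2024-07', 1400.0);
""")

query = """
WITH monthly_totals AS (            -- CTE: total premium per dealer per month
    SELECT dealer_id,
           sold_month,
           SUM(premium) AS total_premium
    FROM warranty_contracts
    GROUP BY dealer_id, sold_month
)
SELECT dealer_id,
       sold_month,
       total_premium,
       RANK() OVER (                -- window function: rank dealers within each month
           PARTITION BY sold_month
           ORDER BY total_premium DESC
       ) AS month_rank
FROM monthly_totals
ORDER BY sold_month, month_rank;
"""

for row in conn.execute(query):
    print(row)
```

In practice a query like this would run in Snowflake, Redshift, or Databricks rather than SQLite; only the shape of the query matters here.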
Daily Duties:
- Design, develop, and maintain scalable data pipelines that support enterprise reporting, regulatory compliance, and analytics.
- Implement data validation frameworks to ensure high accuracy, consistency, and integrity across systems; a brief sketch follows this list.
- Administer and optimize cloud-native data platforms and services with a focus on performance, security, and cost-efficiency.
- Support both operational and analytical data needs across departments, including Finance, Audit, and Reporting.
- Collaborate with stakeholders to align data architecture with evolving business and compliance requirements.
- Build and maintain data models and structures to enable reliable self-service analytics through tools like Tableau.
- Create and iterate on proofs of concept to validate architectural improvements and new technologies.
- Take full ownership of projects from design through delivery, ensuring results are measurable and impactful.
- Document systems, pipelines, and architecture thoroughly to support long-term scalability.
- Proactively identify opportunities to improve data infrastructure and mentor junior engineers.
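As a rough, hypothetical sketch of the data validation duty above (not the client's actual tooling), the snippet below shows one rule-driven way a pipeline might check a batch of records before loading. The field names, rules, and sample data are invented for illustration.

```python
# Hypothetical sketch of a rule-driven validation step a pipeline might run
# before loading a batch. Field names and rules are illustrative only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]   # returns True when the record passes

RULES = [
    Rule("contract_id_present", lambda r: r.get("contract_id") is not None),
    Rule("premium_positive",    lambda r: isinstance(r.get("premium"), (int, float)) and r["premium"] > 0),
    Rule("state_is_two_chars",  lambda r: isinstance(r.get("state"), str) and len(r["state"]) == 2),
]

def validate_batch(records: list[dict]) -> dict[str, list[int]]:
    """Return a map of rule name -> indexes of records that failed it."""
    failures: dict[str, list[int]] = {rule.name: [] for rule in RULES}
    for i, record in enumerate(records):
        for rule in RULES:
            if not rule.check(record):
                failures[rule.name].append(i)
    return failures

if __name__ == "__main__":
    batch = [
        {"contract_id": 1, "premium": 1200.0, "state": "AZ"},
        {"contract_id": None, "premium": -50.0, "state": "Arizona"},
    ]
    for rule_name, bad_rows in validate_batch(batch).items():
        print(f"{rule_name}: {len(bad_rows)} failing record(s) {bad_rows}")
```

A production framework would typically quarantine failing records and report metrics back to the orchestrator (e.g., Airflow), but the pass/fail shape is the same.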
Posted by: Carter Smith
Specialization: Data Engineering