Addison Group
About the Role

The Senior Data Engineer will play a critical role in building and scaling an enterprise data platform to enable analytics, reporting, and operational insights across the organization. This position requires deep expertise in Snowflake and cloud technologies (AWS or Azure), along with strong upstream oil & gas domain experience. The engineer will design and optimize data pipelines, enforce data governance and quality standards, and collaborate with cross-functional teams to deliver reliable, scalable data solutions.
Key Responsibilities

Data Architecture & Engineering
- Design, develop, and maintain scalable data pipelines using Snowflake, AWS/Azure, and modern data engineering tools.
- Implement ETL/ELT processes integrating data from upstream systems (SCADA, production accounting, drilling, completions, etc.).
- Architect data models supporting both operational reporting and advanced analytics.
- Establish and maintain frameworks for data quality, validation, and lineage to ensure enterprise data trust.

Platform Development & Optimization
- Lead the build and optimization of Snowflake-based data warehouses for performance and cost efficiency.
- Design cloud-native data solutions leveraging AWS/Azure services (S3, Lambda, Azure Data Factory, Databricks).
- Manage large-scale time-series and operational data processing workflows.
- Implement strong security, access control, and governance practices.

Technical Leadership & Innovation
- Mentor junior data engineers and provide technical leadership across the data platform team.
- Research and introduce new technologies to enhance platform scalability and automation.
- Build reusable frameworks, components, and utilities to streamline delivery.
- Support AI/ML initiatives by delivering production-ready, high-quality data pipelines.

Business Partnership
- Collaborate with stakeholders across business units to translate requirements into technical solutions.
- Work with analysts and data scientists to enable self-service analytics and reporting.
- Ensure data integration supports regulatory and compliance reporting.
- Act as a bridge between business and technical teams to ensure alignment and impact.
Qualifications & Experience

Education
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- Advanced degree or relevant certifications (SnowPro, AWS/Azure Data Engineer, Databricks) preferred.

Experience
- 7+ years in data engineering roles, with at least 3 years on cloud data platforms.
- Proven expertise in Snowflake and at least one major cloud platform (AWS or Azure).
- Hands-on experience with upstream oil & gas data (wells, completions, SCADA, production, reserves, etc.).
- Demonstrated success delivering operational and analytical data pipelines.

Technical Skills
- Advanced SQL and Python programming skills.
- Strong background in data modeling, ETL/ELT, cataloging, lineage, and data security.
- Familiarity with Airflow, Azure Data Factory, or similar orchestration tools.
- Experience with CI/CD, Git, and automated testing.
- Knowledge of BI tools such as Power BI, Spotfire, or Tableau.
- Understanding of AI/ML data preparation and integration.