Piper Companies
Overview
Piper Companies is hiring an AVP Data Engineer for a global financial institution located in Cary, NC. The AVP Data Engineer will be hands-on and responsible for designing and implementing complex data pipelines using Google Cloud Platform (GCP) technologies. This is a full-time opportunity and requires the candidate to work on site 3 days per week in Cary, NC.

Responsibilities
- Design and develop scalable data pipelines using GCP services such as BigQuery, Cloud Composer, and Datastore
- Build and optimize ETL workflows using Airflow and Directed Acyclic Graphs (DAGs)
- Implement data governance, security, and compliance best practices
- Collaborate with business stakeholders to translate technical concepts into actionable insights
- Provide documentation and training to application teams to support integration and handoff

Qualifications
- 6+ years of experience in data engineering, data warehousing, and business intelligence
- Master-level proficiency in Python
- Strong experience with GCP and its data services (BigQuery, Cloud Storage, Dataflow)
- Expertise in SQL (T-SQL, PL/SQL, ANSI SQL)
- Experience with Airflow, DAGs, and ETL pipeline development
- Familiarity with BI tools such as Tableau or Looker
- Knowledge of relational and dimensional modeling techniques
- Experience with ESG Risk, CSRD, or regulatory reporting is a plus
- Terraform experience is a bonus

Compensation
- $125,000–$145,000 base salary + 4% bonus
- Full comprehensive benefits: Health, Vision, Dental, PTO, Paid Holidays, Sick Leave if Required by Law

Keywords:
Data Engineer, AVP, GCP, Google Cloud Platform, Python, Airflow, DAG, ETL, BigQuery, SQL, Dataflow, Datastore, Tableau, Looker, ESG, Regulatory Reporting, Terraform, Cary, Hybrid #LI-CL1 #LI-HYBRID

This job opens for applications on 10/17/2025. Applications will be accepted for at least 30 days from the posting date.