Piper Maddox
Asset Performance & Renewable Energy Analytics
The opportunity
An established renewable energy and digital solutions business is expanding its Asset Performance Management (APM) technology team and is hiring an experienced Data Engineer to support large-scale operational renewable assets.
This role sits within a product-focused engineering group responsible for building and scaling data platforms used to monitor, optimise, and improve the performance of wind, solar, and energy storage assets globally.
You will work closely with software engineers, data scientists, and platform teams to design and operate high-quality data pipelines that directly underpin operational decision-making and analytics for live energy assets.
Key responsibilities
Design, build, and maintain scalable data pipelines using Databricks (including Delta Live Tables).
Develop robust ETL/ELT workflows ingesting data from operational, telemetry, and third-party systems.
Optimise pipeline performance, reliability, and cost efficiency in cloud environments.
Ensure data quality, lineage, governance, and documentation across production systems.
Collaborate cross-functionally with analytics, product, and platform teams.
Contribute to reusable frameworks and engineering best practices within the team.
Candidates must have prior, hands-on experience working with at least one of the following APM platforms:
Power Factors
Bazefield
GPM
This experience is critical, as the role involves working directly with data models, integrations, and operational outputs from these platforms.
Technical requirements
Proven experience as a Data Engineer in production environments.
Strong Python and SQL skills.
Hands‑on Databricks experience (DLT, Delta Lake; Unity Catalog desirable).
Solid understanding of data modelling, data warehousing, and distributed systems.
Experience with cloud data platforms (Azure preferred; AWS or GCP acceptable).
Familiarity with Git-based workflows and CI/CD pipelines.
Exposure to analytics or ML-driven use cases is beneficial.
Nice to have
Databricks certifications (Associate or Professional).
Experience supporting asset-heavy or industrial environments.
Background in energy, utilities, or infrastructure data platforms.
Why this role
Work on live, utility-scale renewable assets rather than abstract datasets.
High-impact role within a mature but fast-evolving digital platform.
Strong engineering culture with real ownership and technical influence.
Long-term stability combined with ongoing platform growth and investment.