
Senior Analytics Engineer (Contract)



As an Analytics Engineer at the Uniswap Foundation, you'll own the transformation layer that turns raw onchain and offchain data into reliable, analytics-ready models. Today, you'll leverage Dune's public dbt Spellbook; tomorrow, you'll architect our own BigQuery/Snowflake pipelines, custom indexers or subgraphs. You'll enable Data Analysts and Growth Managers to generate insights that power our grants and liquidity-mining programs, collaborating across teams to guarantee data quality, consistency and accessibility.
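For illustration only, here is a minimal sketch of the Dune-based starting point described above: pulling the latest results of a saved Dune query into Python. The query ID, the shape of the response, and the endpoint/header conventions shown are assumptions to verify against Dune's current API documentation; none of these details come from this posting.

```python
# Sketch: fetch the latest results of a saved Dune query over HTTP.
# QUERY_ID is hypothetical; the endpoint path and X-Dune-API-Key header
# reflect Dune's public API as I understand it -- verify before use.
import os
import requests

DUNE_API_KEY = os.environ["DUNE_API_KEY"]  # assumed to be set in the environment
QUERY_ID = 1234567                         # hypothetical saved query, e.g. daily Uniswap volume

resp = requests.get(
    f"https://api.dune.com/api/v1/query/{QUERY_ID}/results",
    headers={"X-Dune-API-Key": DUNE_API_KEY},
    timeout=30,
)
resp.raise_for_status()
rows = resp.json()["result"]["rows"]       # assumed response shape: list of dicts, one per row

for row in rows[:5]:
    print(row)
```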

What You'll Do

- Build & optimize data models (dbt or equivalent) for Uniswap, hook protocols and broader DEX metrics, ensuring accuracy, consistency and performance
- Develop & maintain pipelines to ingest onchain events, API feeds and third-party sources into Dune/BigQuery/Snowflake, with monitoring and alerting
- Optimize pipeline health: Implement monitoring, alerting and root-cause workflows to quickly detect and resolve data issues (see the sketch after this list)
- Collaborate & iterate: Partner with Data Analysts, Growth and Research teams to refine schemas, metrics and dashboards, making data intuitive to query and interpret
- Centralize data sources: Merge disparate feeds into a unified repository while provisioning data to where it's needed
- Plan & build in-house models: As needed, gradually transition transformations into BigQuery or Snowflake; design schemas, materializations and deployment workflows
- Champion best practices: Contribute to open standards in the Uniswap and DEX communities
- Stay current: Evaluate emerging data-engineering tools and cloud services (BigQuery, Snowflake, AWS/GCP) and recommend enhancements to our stack
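As a concrete illustration of the monitoring-and-alerting work above, here is a minimal freshness-check sketch against BigQuery. The project/dataset/table name, the block_timestamp column and the two-hour SLA are hypothetical placeholders, not details from this posting.

```python
# Minimal pipeline freshness check: fail loudly if ingested events go stale.
# Table name, column name and SLA below are hypothetical.
import datetime as dt

from google.cloud import bigquery  # pip install google-cloud-bigquery

FRESHNESS_SLA = dt.timedelta(hours=2)                # assumed SLA: at most 2 hours of lag
TABLE = "my-project.analytics.uniswap_swap_events"   # hypothetical events table

def check_freshness() -> None:
    client = bigquery.Client()
    query = f"SELECT MAX(block_timestamp) AS latest FROM `{TABLE}`"
    row = next(iter(client.query(query).result()))
    if row.latest is None:
        raise RuntimeError(f"{TABLE} has no rows yet")
    lag = dt.datetime.now(dt.timezone.utc) - row.latest
    if lag > FRESHNESS_SLA:
        # Replace with a real alert channel (Slack webhook, PagerDuty, etc.)
        raise RuntimeError(f"{TABLE} is stale by {lag}; investigate upstream ingestion")
    print(f"{TABLE} is fresh (lag={lag})")

if __name__ == "__main__":
    check_freshness()
```

In practice a check like this would run on a schedule next to the pipeline and feed the root-cause workflow mentioned above rather than simply printing.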

Who You Are

- Engineering-minded: you treat analytics transformations as production code that is robust, testable and maintainable (see the short test sketch after this list)
- Future-focused: adept with Dune Spellbook today and excited to build self-hosted solutions tomorrow
- Detail-obsessed: you identify edge cases, troubleshoot upstream issues and prevent data drift proactively
- Collaborative: you translate requirements into solutions and work seamlessly across small, cross-functional teams
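To make "analytics transformations as production code" concrete, here is a small, hypothetical pandas transformation with a pytest-style unit test guarding its key invariants; the function, columns and sample data are illustrative assumptions only.

```python
# Sketch: a toy transformation plus a test that enforces its invariants
# (unique pool/day keys, no null aggregates). All names are hypothetical.
import pandas as pd

def build_daily_volume(swaps: pd.DataFrame) -> pd.DataFrame:
    """Aggregate swap amounts to one row per pool per day."""
    return (
        swaps.assign(day=swaps["block_time"].dt.floor("D"))
        .groupby(["pool", "day"], as_index=False)["amount_usd"]
        .sum()
    )

def test_build_daily_volume_has_unique_keys():
    swaps = pd.DataFrame(
        {
            "pool": ["A", "A", "B"],
            "block_time": pd.to_datetime(
                ["2024-01-01 01:00", "2024-01-01 02:00", "2024-01-01 03:00"]
            ),
            "amount_usd": [100.0, 50.0, 10.0],
        }
    )
    result = build_daily_volume(swaps)
    assert not result.duplicated(["pool", "day"]).any()  # one row per pool/day
    assert result["amount_usd"].notna().all()            # no null aggregates
```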

Nice to Haves

- Proficiency with modern cloud platforms (e.g., BigQuery, Snowflake, AWS, GCP, or Azure) and experience with both OLTP and analytical databases such as PostgreSQL or ClickHouse
- Experience building subgraphs or equivalent custom indexers (e.g., The Graph, Ponder)
- Experience building and exposing internal/external Data APIs and deploying containerized workloads using Docker and Kubernetes (a small example follows this list)
- Advanced degree in Computer Science, Data Engineering, or a related technical field
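As a sketch of the "Data APIs" item above, here is a small FastAPI service exposing one pre-computed metric. The endpoint path, metric and in-memory stub are hypothetical; a real service would read from the warehouse and could be packaged with Docker for deployment.

```python
# Sketch: a tiny internal Data API serving a pre-computed metric.
# The metrics store is stubbed with a dict; all names are hypothetical.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Analytics Data API (sketch)")

# Stand-in for a warehouse-backed metrics table.
DAILY_VOLUME_USD = {
    "2024-01-01": 1_234_567.0,
    "2024-01-02": 2_345_678.0,
}

@app.get("/metrics/daily-volume/{day}")
def get_daily_volume(day: str) -> dict:
    """Return daily DEX volume in USD for a given day (YYYY-MM-DD)."""
    if day not in DAILY_VOLUME_USD:
        raise HTTPException(status_code=404, detail=f"no volume recorded for {day}")
    return {"day": day, "volume_usd": DAILY_VOLUME_USD[day]}

# Run locally with: uvicorn data_api:app --reload
```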

About Uniswap Foundation

In pursuit of a more open and fair financial system, the Uniswap Foundation supports the growth, decentralization, and sustainability of the Uniswap community. Through grants, we're driving value in five key focus areas: Protocol and Innovation, Developers, Governance, Research, and Security. Our grant-making approach is designed to maximize positive impact, bringing in new contributors to our community who focus on building new initiatives, committees, products, infrastructure, and more. To learn more about our community impact, visit uniswapfoundation.org/impact.