Senior Engineer - Data Platform

I. Role Title & Core Context

  • Reports To: CTO
  • Employment Type: Full-time

II. Drivepoint & Team Context

  • About Drivepoint: Drivepoint is a Series A AI-powered vertical software startup revolutionizing retail and commerce operations by providing cutting-edge AI solutions for financial forecasting, reporting and scenario planning, along with new products for demand planning and inventory.
  • Why This Role Matters NOW: Data is at the heart of Drivepoint’s value proposition. Our customers depend on fast, accurate, and reliable insights to run their businesses, and our data platform is the backbone that makes it possible. As we expand rapidly, we need a strong engineer to own and evolve this foundation—ensuring fault tolerance, data quality, scalability, and speed—while also extending its capabilities to support new AI-driven features and customer use cases.
  • Team Dynamics: You’ll be joining a tight-knit engineering team that values pragmatism, clarity, and collaboration. We move quickly, but with intention. As part of this team, you’ll partner directly with the CTO, product managers, and customer-facing teams to build the infrastructure that powers every insight Drivepoint delivers.

III. The Mission (The 1-Year Impact)

  • Core Mission: Within 12 months, significantly improve the performance, scalability, and reliability of Drivepoint’s data platform, while enabling new integrations, optimizing pipeline costs, and contributing to product-level features that enhance the customer experience.

IV. Outcomes (Key Results for Success)

  • Platform Reliability & Quality: Enhance fault tolerance, observability, and data quality across the data stack (Source → Airbyte/Airflow → DBT → BigQuery → Applications/Agents); a minimal orchestration sketch follows this list.
  • Scalability & Efficiency: Optimize queries and pipelines for speed and cost efficiency, ensuring customers experience real-time, reliable insights.
  • Integration Expansion: Build and maintain new data connections from external sources to enrich the Drivepoint platform.
  • Product Contribution: Implement new customer-facing features using TypeScript and Python, working across the codebase beyond core data infrastructure.
  • Next-Gen Architecture: Help architect the next generation of our platform to support large-scale growth, AI/LLM-powered features, and evolving customer needs.
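
For context on that stack, the following is a minimal, purely illustrative Airflow sketch of the Source → Airbyte → DBT → BigQuery flow described above. It assumes Airflow 2.x with the Airbyte provider installed; the DAG name, connection IDs, schedule, and paths are hypothetical placeholders, not Drivepoint's actual configuration.

    # Illustrative only: not Drivepoint's code. Names and IDs are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

    with DAG(
        dag_id="example_elt_pipeline",      # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@hourly",                 # assumes Airflow >= 2.4
        catchup=False,
    ) as dag:
        # 1. Ingest: trigger an Airbyte sync from an external source into raw BigQuery tables.
        ingest = AirbyteTriggerSyncOperator(
            task_id="airbyte_sync_source",
            airbyte_conn_id="airbyte_default",
            connection_id="REPLACE_WITH_AIRBYTE_CONNECTION_ID",  # placeholder
        )

        # 2. Transform: run dbt models that build analytics tables in BigQuery.
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt --target prod",
        )

        # 3. Quality gate: dbt tests act as a basic data-quality and observability check.
        test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt --target prod",
        )

        ingest >> transform >> test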

V. Competencies (How Success Will Be Achieved)

  • Data Warehouse Design & Architecture: 4+ years designing and building SQL-based data warehouses (BigQuery or Snowflake).
  • ELT/ETL Pipeline Development: Hands-on experience with Airbyte or Fivetran for ingestion, DBT for transforms, and Airflow (or equivalent) for orchestration.
  • Software Engineering: 4+ years of professional development experience with modern languages (TypeScript and/or Python).
  • SDLC & Feature Delivery: Ability to work within modern SDLC frameworks (Scrum/Agile), shipping customer-facing features while maintaining infrastructure.
  • AI Awareness: Comfortable using modern AI tools to accelerate work; bonus for experience building/deploying AI agents.
  • Ownership & Accountability: Takes responsibility for data platform performance and proactively drives improvements.
  • Pragmatism & Clarity: Designs simple, maintainable solutions while balancing speed and scalability.
  • Collaboration: Works closely with engineers, product managers, and business stakeholders to align technical work with customer needs.
  • Adaptability: Thrives in a fast-paced startup environment, quickly shifting priorities when necessary.
  • Innovation: Eager to explore new technologies and approaches (AI, data infrastructure, observability) to strengthen the platform.

VI. Pre-requisites & Minimum Qualifications

  • 4+ years of experience in data engineering or related roles, with strong expertise in SQL-based data warehouses.
  • Experience building, scaling, and optimizing ELT/ETL pipelines using Airbyte/Fivetran, DBT, and Airflow.
  • 4+ years of software engineering experience in TypeScript and/or Python.
  • Ability to deliver both infrastructure improvements and customer-facing features.
  • Must be legally authorized to work in the U.S. (we cannot provide visa sponsorship at this time).
  • Location: Remote-friendly (U.S. only), with preference for Boston-based candidates who can collaborate in person with the GTM and leadership team as needed.
  • Compensation: Competitive salary + benefits, with meaningful equity opportunities.
  • Benefits: Unlimited PTO, 401(k) with employer match, excellent health/dental insurance.
  • Funding: Series A, $7M raised in July 2025 from top-tier investors.