Docker, Inc

Software Engineer, Data Infrastructure

Docker, Inc, Seattle, Washington, US, 98127

Base pay range: $132,000.00/yr - $181,500.00/yr

This range is provided by Docker, Inc. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.

At Docker, we make app development easier so developers can focus on what matters. Our remote-first team spans the globe, united by a passion for innovation and great developer experiences. With over 20 million monthly users and 20 billion image pulls, Docker is the #1 tool for building, sharing, and running apps—trusted by startups and Fortune 100s alike. We’re growing fast and just getting started.

Software Engineer – Data Infrastructure

Docker is seeking a Software Engineer to join our Data Infrastructure team and drive the technical evolution of the data systems that power analytics across the entire company. This is a hands-on technical role focused on execution, learning, and individual contribution.

Responsibilities

Contribute to the design and implementation of highly scalable data infrastructure leveraging Snowflake, AWS, Airflow, DBT, and Sigma.

Implement and maintain end-to-end data pipelines supporting batch and real-time analytics across Docker's product ecosystem.

Follow and contribute to the technical standards for data quality, testing, monitoring, and operational excellence.

Design, build, and maintain robust data processing systems, focusing on data volume and latency requirements.

Implement data transformations and modeling using DBT for analytics and business intelligence use cases.

Develop and maintain data orchestration workflows using Apache Airflow under the direction of senior engineers.

Assist with optimizing Snowflake performance and cost efficiency.

Contribute to building data APIs and services to enable self‑service analytics.

Work with Product, Engineering, and Business teams to understand data requirements and translate them into technical tasks.

Support Data Scientists and Analysts by providing access to reliable, high‑quality data.

Collaborate with business teams to deliver and maintain accurate reporting and operational dashboards.

Engage with Security and Compliance teams to support data governance implementation.

Assist with monitoring, alerting, and incident response for critical data systems.

Support the implementation of data quality frameworks and automated testing in data pipelines.

Participate in performance optimization and cost management initiatives.

Contribute to troubleshooting and resolution of technical issues affecting data availability and accuracy.

Proactively learn technical skills, system design, and data engineering best practices from senior team members.

Participate in technical design reviews and provide feedback on documentation.

Actively contribute to team knowledge sharing and documentation efforts.

Qualifications

2+ years of software engineering experience, preferably with a focus on data engineering or analytics systems.

Experience with a major cloud platform (AWS, GCP, or Azure) and its core data services (e.g., S3, GCS).

Proficiency with SQL and experience with a cloud data warehouse (e.g., Snowflake, Redshift, BigQuery).

Familiarity with data transformation tools (e.g., DBT) and modern BI platforms (e.g., Sigma).

Familiarity with workflow orchestration tools (e.g., Apache Airflow, Dagster).

Proficiency in Python, Go, Kotlin, or other programming languages used in data engineering.

Familiarity with version control (Git) and modern software development practices (CI/CD).

Basic understanding of data warehousing concepts (dimensional modeling) and analytics architectures.

Strong communication and collaboration skills.

Ability to take direction and work effectively as part of a team.

A proactive attitude toward problem‑solving and self‑improvement.

Preferred

Experience in an internship or junior role at a technology company.

Knowledge of container technologies (Docker, Kubernetes).

Advanced degree in Computer Science, Data Engineering, or a related technical field.

Key Success Metrics

Successful completion of assigned data engineering projects and tasks.

Delivery of high‑quality, reliable code for data pipelines.

Demonstrated technical growth and increasing independence.

Positive working relationships and collaboration with team members and stakeholders.

Impact You'll Make

As a Software Engineer in our Data Platform group, you will contribute directly to the data foundation that powers Docker's product innovation and business intelligence. Your work will support the scaling of Docker's data infrastructure as we continue to expand our product portfolio and serve customers globally.

Perks

Freedom & flexibility; fit your work around your life.

Designated quarterly Whaleness Days.

Home office setup; we want you comfortable while you work.

16 weeks of paid parental leave.

Technology stipend equivalent to $100 net/month.

PTO plan that encourages you to take time to do the things you enjoy.

Quarterly, company‑wide hackathons.

Training stipend for conferences, courses and classes.

Equity; we are a growing start‑up and want all employees to have a share in the success of the company.

Docker Swag.

Medical benefits, retirement and holidays vary by country.

Docker embraces diversity and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better our company will be.

Due to the remote nature of this role, we are unable to provide visa sponsorship.

Compensation Range: $132K - $181.5K
