Boston Dynamics

Data Infrastructure Software Engineer, Central Software

Boston Dynamics, Waltham, Massachusetts, United States, 02254



Boston Dynamics is seeking a data infrastructure software engineer to join the Central Software (CSW) team. The role focuses on developing and maintaining robust cloud-based data pipelines and other big data solutions for use across the company, including integration with robots. The solutions you develop will help expand the reach and capabilities of our advanced mobile robots. Boston Dynamics is at the forefront of mobile robotics, tackling challenging problems in the field and expanding automation solutions for industrial applications and warehouse logistics.

Responsibilities

- Design, develop, and maintain scalable and robust data pipelines using Apache Airflow and other big data technologies.
- Optimize existing data systems for performance, reliability, and cost-effectiveness.
- Collaborate with machine learning engineers and other software engineers to respond to data needs and solve problems with data.
- Troubleshoot and resolve issues related to data availability, performance, and accuracy.
- Monitor data quality and integrity, implementing processes for data validation and error handling.
- Participate in code reviews, contributing to a high standard of code quality and best practices.
- Research and evaluate new technologies and tools to improve our data platform.
- Contribute to the overall architecture and strategy for data infrastructure.
- Participate in our agile development process, coordinating work with others, identifying challenges, and communicating progress regularly.
- Mentor and upskill peers and other contributors across the organization.

Qualifications

- 5+ years of professional experience delivering data infrastructure solutions to end users.
- Proven ability to design, develop, and optimize efficient ETL/ELT pipelines for large-scale data ingestion and transformation (e.g., Apache Airflow).
- In-depth knowledge and hands-on experience with big data technologies such as Apache Spark, Hadoop, Kafka, Flink, or similar distributed systems.
- Expertise in relational databases (e.g., PostgreSQL, MySQL).
- Experience with major cloud providers such as AWS, Google Cloud Platform (GCP), or Microsoft Azure, including services for data storage, processing, and analytics.
- Proficiency in Python.
- Familiarity with Git version control and comfortable working proficiency in a Linux development environment.
- Bachelor's degree in Engineering, Computer Science, or another technical field.

Additional Skills

- Experience with C++ or Rust.
- Familiarity with containerization (Docker, Kubernetes).

Seniority level

Mid-Senior level

Employment type

Full-time

Job function

Engineering and Information Technology

Industries

Automation Machinery Manufacturing
