Liberty Mutual Insurance Group

Senior Data Engineer - Cloud Data Warehousing (Investments Technology)

Liberty Mutual Insurance Group, Boston, Massachusetts, US, 02298


Senior Data Engineer

Are you passionate about data and excited to shape the future of investing through innovative technology? Do you thrive in an inclusive, collaborative environment where your ideas are valued? Are you curious, adaptable, and eager to learn and innovate? If so, come build with us at Liberty Mutual Investments.

Liberty Mutual Investments manages Liberty Mutual Insurance Group's global financial assets across public and private domains to create capital and generate income. With over $100 billion in AUM and 300+ investment, finance, and operations professionals located in Boston, MA, Liberty Mutual Investments offers the best of both worlds: the look and feel of a boutique investment firm, backed by the reputation and financial strength of a Fortune 100 company.

This role has a hybrid work arrangement (2x per week) in our Boston, MA office.

Key Responsibilities

- Work in a Data and Analytics squad to design and build robust data pipelines that deliver high-quality insights to our investment teams.
- Collaborate with diverse teams of data engineers and product partners who believe in continuous learning, respectful collaboration, and inclusive data solutions.
- Build and optimize scalable, reliable, high-performance automated data pipelines and solutions using Snowflake and AWS.
- Drive data quality, governance, and observability practices throughout the development lifecycle.
- Stay current with industry trends and best practices in data engineering and recommend improvements to existing processes.
- Contribute to a culture of learning by mentoring junior engineers, participating in code reviews, and promoting engineering excellence and best practices.

Qualifications

- A bachelor's degree in a technical or business discipline, or equivalent experience.
- 5+ years of experience in data engineering, including designing, building, and optimizing scalable data pipelines and architecture.
- Experience designing and building data provisioning workflows/pipelines, extracts, data transformations, and data integrations using Snowflake cloud-native features in an AWS cloud environment.
- Intermediate experience using Python for data transformation, integration, and automation.
- Hands-on experience with modern public cloud-based data platforms: Snowflake and AWS (e.g., S3, Glue, Lambda).
- Proficiency with ETL/ELT pipelines and patterns for loading data warehouses and data lakes.
- Hands-on experience building and deploying CI/CD pipelines (e.g., GitHub Actions, Bamboo, or similar tools).

Bonus Skills

- Exposure to the Investments or Finance domain is highly desirable.
- Familiarity with building and consuming APIs (e.g., REST, API Gateway).
- Awareness of security best practices, including access control, authentication, and secure processing.