The Opportunity
Lead the Data Engineering Standards Center of Excellence (CoE), driving automation and best practices that standardize and accelerate development. Oversee the design, development, implementation, and maintenance of data solutions across the organization, and support a team of data engineers, collaborating with cross-functional teams, data scientists, and business stakeholders to ensure efficient, reliable data management.
What You'll Do
- Enhance and administer the DBT and CI/CD platform architecture to improve scalability, performance, and automation.
- Design and build reusable, high-performance data pipelines using technologies like Snowflake, Python, and DBT to support both operational and analytical needs.
- Lead the design, development, and automation of CI/CD pipelines for data workflows using Azure DevOps, Git, and orchestration tools.
- Evaluate emerging features in DBT, CI/CD, and related tools; lead POCs, define adoption strategies, and support rollout to development teams.
- Act as an escalation point to troubleshoot and resolve technical challenges.
- Leverage Python and cutting-edge AI technologies, including Large Language Models (LLMs), to automate and streamline the data management lifecycle, from intelligent metadata tagging and automated code validation to advanced data profiling and pipeline generation.
- Lead Data Engineering platform collaboration sessions and actively drive adoption of standards and technical best practices.
- Proactively identify opportunities to innovate and expand platform capabilities in alignment with strategic goals.
- Champion continuous improvement through agile delivery, platform upgrades, and process innovation.
- Mentor and support engineers at all levels, fostering a strong culture of learning and technical excellence.
What You'll Bring
- Bachelor's Degree in Computer Science or a related field required; experience can be considered in lieu of a degree.
- 8-10+ years of professional IT experience, including data warehousing development in large reporting environments.
- 5-7 years of experience working with data integration tools, ETL frameworks, and workflow management systems (e.g., Apache Airflow).
- 2+ years of hands-on experience developing and administering DBT, including CI/CD pipeline development using Azure DevOps and Azure Git; experience with AI/ML technologies, including LLMs, to support automation and innovation.
- Strong attention to detail and a proactive approach to process improvement and standards development.
- Highly Preferred: Proficiency with DBT, including design, development, and administration tasks; strong understanding of CI/CD processes using Azure DevOps or Git.
Skills for Success
- Expertise in data engineering principles, data modeling, ETL development, and data warehousing.
- Hands-on experience with DBT, CI/CD pipelines, and cloud data platforms (Azure, Snowflake).
- Strong ability to build data pipelines using Snowflake features (Snowpipe, SnowSQL, Snowsight, Streams).
- Proficiency in working with relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
Working Model
- Hybrid required
- Onsite 1-3 times weekly; candidates must be local or willing to relocate in order to attend onsite weekly, or at least 4 times per month.
- Monday-Friday Eastern Business Hours required.
- Remote work requires a stable, secure, quiet, and compliant workstation.
Mass General Brigham Incorporated is an Equal Opportunity Employer. By embracing diverse skills, perspectives, and ideas, we lead. All qualified applicants will receive consideration without regard to race, color, religious creed, national origin, sex, age, gender identity, disability, sexual orientation, military service, genetic information, or other protected status. We ensure reasonable accommodations for individuals with disabilities during the application and interview process, as well as in performing essential job functions.