Principal Data Engineer
Posted on Jobright.ai (jobright.com)
Xplor Technologies is a global platform combining SaaS with embedded payments and tools to help businesses grow and succeed. The company is seeking a Principal Data Engineer to lead and contribute to enterprise data warehousing and reporting deliverables, with a focus on designing, creating, and maintaining business intelligence assets and reports. The role involves collaborating with various teams and mentoring other engineers to ensure accurate and scalable data pipelines.

Responsibilities:

This role is responsible for designing, creating, and maintaining business intelligence assets and reports that drive innovation, client value, and business efficiency. The position works primarily in one of our largest business units, and experience with multi-terabyte database systems is required. You will maintain and implement an array of data synchronization efforts across numerous systems, both on-premises and in the cloud, and coordinate with our data architects, product management, and IT management to ensure that our data pipelines for both internal operations and customer reporting are accurate, scalable, and secure. This will occur through several initiatives:

- Leading the maintenance and enhancement of existing reporting and BI solutions as needed for functionality or efficiency.
- Architecting and developing complex reporting stored procedures.
- Overseeing and contributing to ELT development and pipeline automation using tools like FiveTran, Coalesce, Python, and YAML.
- Driving and executing the migration of legacy reporting technologies to cloud platforms such as Snowflake.
- Championing and implementing best practices in data engineering, including GitFlow version control, GitHub Actions for CI/CD, and infrastructure automation.
- Providing technical leadership and mentoring to junior and mid-level data engineers.
- Leading and actively participating in team efforts within Agile/Scrum/Kanban methodologies, aligning with Xplor's Orbit Planning process.
- Collaborating on data architecture for multi-tenanted and single-tenanted systems across diverse database technologies (SQL Server, Postgres, MySQL).
- Analysing customer and marketing data to identify business opportunities and threats, and creating high-quality, insightful reports and dashboards.
- Leading the development and maintenance of BI solutions and ELT development based on agreed requirements.
- Spearheading initiatives for cloud migration (particularly to Snowflake) and ensuring system uptime and performance for data warehouse products.
- Participating in and guiding Agile work planning, testing, and quality assurance, ensuring alignment with strategic roadmaps.
- Evaluating internal and external customer needs and abilities in order to provide strategic and appropriate technical solutions, developing data solutions to a high standard of performance and accuracy.
- Identifying and providing strategic input and leadership on new technology opportunities that could impact Data Engineering and Analytics systems.
- Advising and leading on how processes, practices, and technologies can play a critical role in improving business management and optimization.
- Writing, optimizing, and reviewing complex SQL queries to extract and manipulate data for use by business units.
- Communicating data nuances, insights, and architectural decisions clearly and concisely to all levels of the company.
- Providing expert-level technical expertise across all aspects of solution design and implementation, including requirements definition, data acquisition processes, data modelling (dimensional and semantic), process automation, escalation procedures, construction, and deployment.
- Identifying and driving the implementation of innovative solutions to enhance existing Data Engineering infrastructure, processes, and technology.
- Participating in and leading planning processes, including inception, technical design, development, testing, and delivery of data solutions.
- Participating in and facilitating Agile work planning and estimation processes, including backlog grooming, sprint planning, and retrospectives.
- Acting as a self-starter, proactively identifying areas for improvement and innovation within the data landscape.
- Fostering a culture of technical excellence through team leadership and active mentoring.

Qualifications:

Required:
- Extensive experience working at enterprise scale with multi-terabyte database systems and cloud data warehouses, particularly Snowflake.
- Proven experience in automation using coding/scripting languages (Python strongly preferred; experience with YAML for configurations highly desirable).
- Advanced presentation and communication skills, capable of conveying complex technical concepts to diverse audiences.
- Critical thinking and the ability to provide strategic direction in data analysis, data architecture, and ELT processes.
- Strong prioritisation, time management, and organisational skills, with a proven ability to lead projects and manage competing priorities.
- Demonstrated ability as a self-starter, taking initiative from concept to delivery.
- Strong team leadership and mentoring capabilities with a passion for developing talent.
- Deep expertise in Snowflake is essential.
- Proficiency with modern data stack tools such as FiveTran and Coalesce.
- Experience with SQL Server, Postgres, and MySQL.
- Strong proficiency in Python for data engineering, ELT development, and automation.
- Hands-on experience with CI/CD pipelines using GitHub Actions.
- Solid understanding and practical application of GitFlow branching strategies.
- Experience with YAML for writing configurations and automation scripts.
- BA/BS degree in Computer Science, Computer Engineering, or a related field, or equivalent practical experience.

Preferred:
- Deep expertise in parallel processing data warehouses (Snowflake essential).
- Extensive experience designing and implementing high-frequency, resilient, and scalable ELT processes using tools like FiveTran and Coalesce.
- Mastery of Agile development methodology (Scrum/Kanban) and experience leading agile teams or pods.
- Proven experience with data architecture (multi-tenanted, single-tenanted) across various database systems (e.g., SQL Server, Azure SQL, Azure Hyperscale, MySQL, Postgres).
- Track record of leading Terraform-to-GitHub Actions migration or similar infrastructure automation initiatives.

Company:
Xplor Technologies is a global platform integrating SaaS solutions, embedded payments, and Commerce Accelerating Technologies. Founded in 2021, the company is headquartered in Atlanta, Georgia, USA, with a team of 1,001-5,000 employees. The company is currently Late Stage.
Seniority level: Mid-Senior level
Employment type: Full-time
Industries: Software Development