Chick-fil-A
Sr. Software Engineer – Data and Analytics
Chick-fil-A, Atlanta, Georgia, United States, 30383
Overview
We are seeking a highly skilled and motivated Senior Software Engineer to join our Data & Analytics DevOps team. This team creates and supports big data solutions across Chick-fil-A’s Distribution, Manufacturing, and Production businesses, continuously innovating and optimizing through technology to meet the needs of a growing business. The Data and Analytics DevOps team is a small, agile team where engineers can make impactful contributions in a fast-paced environment, solving large-scale data problems using cloud-based technologies.
In this role, you will work at the intersection of software engineering and data modeling, driving the development, optimization, and scalability of our data systems and processes. You will collaborate closely with our reporting and visualization team to deliver high-performing, optimized data models and ensure data sets are readily available for business intelligence and decision-making.
As a key member of the team, you will contribute to the full lifecycle of our data infrastructure, including ETL/ELT pipelines, data warehouse management in AWS, data modeling, and performance optimization. This role offers the opportunity to shape the future of our data ecosystem and deliver significant business value.
About Chick-fil-A Supply and Bay Center Foods
As wholly owned subsidiaries of Chick-fil-A, Inc., Chick-fil-A Supply and Bay Center Foods are food production and distribution service networks focused entirely on serving the unique needs and growing volume of Chick-fil-A restaurants.
The Chick-fil-A Supply and Bay Center Foods service networks comprise three components:
Distribution Center – State-of-the-art warehouses that house supply for Chick-fil-A restaurants
Production Distribution Center – Offsite facilities that prepare select ingredients and menu items for Chick-fil-A restaurants
Transportation – Growing fleet of delivery vehicles that supply Chick-fil-A restaurants
Our Flexible Future model offers a healthy mix of working in person and remotely, strengthening key elements of the Chick-fil-A Supply culture by fostering collaboration and community.
Responsibilities
Design, develop, and maintain scalable ETL/ELT pipelines for ingesting, transforming, and preparing data from multiple sources
Manage and optimize our data warehouse in AWS, ensuring reliability, scalability, and cost-efficiency
Develop and maintain data models that support robust business intelligence and analytics, with a focus on performance and usability
Partner with the reporting and visualization team to align on data requirements, optimize data sets for reporting tools, and enhance the user experience of dashboards and reports
Drive automation and best practices in data processing, deployment, and monitoring as part of DevOps principles
Collaborate with stakeholders to understand data needs, troubleshoot issues, and deliver high-quality solutions that meet business goals
Ensure data quality, governance, and compliance with organizational and regulatory standards
Evaluate and recommend new tools, technologies, and processes to improve the data platform and workflows
Leverage agile project management tools and methods to impact the team through system enhancements with a focus on small, frequent releases and continuous improvement
Note: Working in a DevOps model, this role includes both building and running solutions, which may require off-hours support. This support is shared among team members to cover weeknights and weekends. The goal is to design for failure and automate responses to possible issues so they can be addressed during normal hours.
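As a rough illustration of the ETL/ELT transformation work described above, here is a minimal sketch in plain Python. The record fields (`order_id`, `restaurant`, `total`) are hypothetical; real pipelines on this team would run within tools such as AWS Glue, Apache Airflow, or dbt rather than standalone scripts.

```python
def transform_orders(raw_rows):
    """Normalize raw order records into an analytics-ready shape.

    Illustrative only: field names are hypothetical. The transform step
    casts IDs to integers, trims and title-cases facility names, and
    stores money as integer cents to avoid float rounding in aggregates.
    """
    cleaned = []
    for row in raw_rows:
        cleaned.append({
            "order_id": int(row["order_id"]),
            "restaurant": row["restaurant"].strip().title(),
            "total_cents": round(float(row["total"]) * 100),
        })
    return cleaned

if __name__ == "__main__":
    # Example of raw ingested records (e.g., parsed from CSV or JSON)
    raw = [
        {"order_id": "101", "restaurant": "  atlanta dc ", "total": "12.50"},
        {"order_id": "102", "restaurant": "bay center", "total": "7.25"},
    ]
    for row in transform_orders(raw):
        print(row)
```

Storing totals as integer cents is a common warehouse convention; the downstream data model can expose a derived dollar column for reporting.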
Minimum Qualifications
Bachelor’s Degree, or an equivalent combination of education, training, and experience from which comparable skills can be acquired
5+ years of experience using Python as your primary language
Proven experience with ETL/ELT development and tools in an AWS environment, such as AWS Glue, Apache Airflow, dbt, or similar
Strong expertise in data warehouse management, specifically in AWS (Redshift, Databricks, or similar)
Proficiency in SQL and query optimization for large-scale data sets
Hands-on experience with data modeling and performance tuning for analytics
Experience working in an agile software development environment
Excellent verbal and written communication skills
Excellent problem-solving and troubleshooting skills, with the ability to bring clarity to complex data challenges
Strong collaboration skills, with a history of working closely with data analysts, engineers, and business stakeholders
Demonstrated ability to value both relationships and results; able to navigate challenging situations while ensuring all parties are treated with honor, dignity, and respect
Preferred Qualifications
Deep understanding of AWS technologies, architecture, and security best practices
Experience working with Apache Airflow, Redshift, dbt, and/or Databricks
Experience with monitoring and alerting technologies
Knowledge of reporting and visualization tools, such as Tableau, with an understanding of their performance optimization needs
Experience implementing or working within a DevOps culture for data engineering teams
Understanding of data governance frameworks, security, and compliance standards
Minimum Years of Experience: 5
Travel Requirements: 10%
Required Level of Education: Bachelor's Degree
Preferred Level of Education: Master's Degree
Major/Concentration: Computer science or related technical field