Capgemini

Senior Software Engineer - GCP Data Engineer

Capgemini, Atlanta, Georgia, United States, 30383


Overview

Senior Software Engineer - GCP Data Engineer at Capgemini. This role focuses on migrating a legacy data warehouse to a Google Cloud-based data warehouse and delivering data solutions in collaboration with Data Product Managers and Data Architects.

Your Role

As a GCP Engineer, you will contribute to the migration of a legacy data warehouse to a Google Cloud-based data warehouse for a major telecom client. You will collaborate with Data Product Managers and Data Architects to design, implement, and deliver successful data solutions, and help architect data pipelines for the underlying data warehouse and data marts.

- Design and develop complex ETL pipelines in Google Cloud data environments; the legacy stack includes Teradata, and the new stack includes GCP data technologies such as BigQuery and Airflow, with SQL and Python as the primary languages
- Maintain detailed documentation to support data quality and governance
- Support QA/UAT testing and deployment activities to higher environments
- Ensure high operational efficiency and quality to meet SLAs
- Be an active participant in and advocate of agile/scrum practices

Qualifications

- 8+ years of data engineering experience developing large data pipelines in very complex environments
- Very strong SQL skills and the ability to build complex transformation pipelines using a custom ETL framework in Google BigQuery
- Exposure to Teradata and the ability to understand complex Teradata BTEQ scripts
- Strong Python programming skills
- Strong skills in building Airflow jobs and debugging issues
- Ability to optimize queries in BigQuery
- Hands-on experience with Google Cloud data technologies (GCS, BigQuery, Dataflow, Pub/Sub, Data Fusion, Cloud Functions)

Preferred Qualifications

- Experience with BigQuery
- Experience with cloud technologies on GCP (GCS, Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions)
- Exposure to Teradata
- Experience with job orchestration tools like Airflow and building complex jobs
- Experience writing and maintaining large data pipelines using a custom ETL framework
- Ability to automate jobs using Python
- Familiarity with data modeling techniques and data warehousing methodologies
- Experience with GitHub or similar version control
- Scripting skills (Bash and Python)
- Familiarity with Scrum and Agile methodologies
- Problem solver with strong attention to detail and strong analytical and communication skills
- Ability to work in an onsite/offshore model and lead a team

Life at Capgemini

- Flexible work
- Healthcare including dental, vision, mental health, and well-being programs
- Financial well-being programs such as 401(k) and Employee Share Ownership Plan
- Paid time off and paid holidays
- Paid parental leave
- Family building benefits like adoption assistance and related programs
- Social well-being benefits like subsidized back-up childcare and tutoring
- Mentoring, coaching and learning programs
- Employee Resource Groups
- Disaster Relief

About Capgemini

Capgemini is a global business and technology transformation partner, helping organizations accelerate their digital and sustainable transformation. With 340,000 team members in more than 50 countries, the Group leverages capabilities in AI, cloud, and data, along with deep industry expertise and strong partner ecosystems. Capgemini reported 2024 global revenues of €22.1 billion.

Disclaimer

Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status, or any other characteristic protected by law. Capgemini will consider reasonable accommodations during the recruitment process as needed.

Job details: Programmer/Analyst (Full-time) | Primary Location: US-GA-Atlanta | Organization: I&D
