Charles Schwab Corporation
Your Opportunity
At Schwab, you’re empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us “challenge the status quo” and transform the finance industry together. We believe in the importance of in-office collaboration and fully intend for the selected candidate to work on site in the specified location.

At Schwab, the Analytical Data Platform Engineering (ADPE) organization governs the strategy and implementation of the enterprise data warehouse and emerging data platforms. Our mission is to drive the activation of data solutions and engagement technology (Sales, Marketing, and Service) to achieve targeted business outcomes, address data risk, and safeguard our competitive edge. We help Digital, Brokerage, HR, Marketing, Finance, Risk, and executive leadership make fact-based decisions by integrating and analyzing data.

As a Sr Specialist, Data Engineering, you will partner with our business stakeholders and the Data Engineering team to design and develop data solutions for data science, analytics, and reporting. We are a team of passionate data engineers and SMEs who bring energy, focus, and fresh ideas to support our mission to see the world “Through Clients' Eyes”. ETL Developers work with large teams, including onshore and offshore developers, using best-in-class technologies such as Informatica IICS and BigQuery. You will design, develop, and implement enterprise data integration solutions, with opportunities to grow in responsibility, work on exciting projects, train on new technologies, and collaborate with other developers to shape the future of the Data Warehouse.
What You Are Good At
- Designing, developing, and implementing new data ingestion workflows using current data engineering techniques
- Developing data ingestion workflows across various data sources and patterns, such as batch, near real-time, and real-time
- Leading projects to ensure successful delivery
- Collaborating with business analysts to understand requirements and use cases
- Crafting and updating ETL specifications and documentation
- Working with technical directors, data modelers, and cross-functional teams to ensure accurate and efficient implementation
- Defining and executing quality assurance and testing scripts
- Guiding the ETL delivery team with technical expertise
- Reviewing ETL work from third-party vendors
- Promoting agile practices to improve delivery efficiency
- Maintaining standards for development, coding, and testing

Required Qualifications
- Proven experience as an ETL Developer with a record of delivering high-quality code
- 5-6 years of hands-on experience with Google Cloud Platform, BigQuery, Informatica PowerCenter, and IICS or similar tools
- 5-6 years of experience with data warehouse platforms such as Teradata, Big Data/Hadoop, or BigQuery
- 3-5 years of experience in data modeling (logical and/or physical)
- 3-5 years of experience with near real-time and real-time data ingestion techniques, including Avro, Kafka, and RabbitMQ
- Expertise in schema design and working with complex data
- Strong SQL skills for developing, tuning, and debugging complex applications
- 1-2 years of experience with scheduling tools such as Control-M or ESP
- Experience working in large environments (RDBMS, EDW, NoSQL, Big Data, etc.) preferred
- Experience interfacing with vendors, offshore teams, and internal groups
- Ability to quickly learn new technologies
- Strong analytical, problem-solving, influencing, and communication skills

In addition to the salary range, this role is eligible for bonus or incentive opportunities.