The Walt Disney Company (Germany) GmbH
Lead Software Engineer
The Walt Disney Company (Germany) GmbH, Seattle, Washington, US, 98127
Technology is at the heart of Disney’s past, present, and future. Disney Entertainment and ESPN Product & Technology (DEEP&T) is a global organization of engineers, product developers, designers, technologists, data scientists, and more – all working to build and advance the technological backbone for Disney’s media business globally.
The team marries technology with creativity to build world-class products, enhance storytelling, and drive velocity, innovation, and scalability for our businesses. We are Storytellers and Innovators. Creators and Builders. Entertainers and Engineers. We work with every part of The Walt Disney Company’s media portfolio to advance the technological foundation and consumer media touch points serving millions of people around the world.
Here are a few reasons why we think you’d love working here:
Building the future of Disney’s media:
Our Technologists are designing and building the products and platforms that will power our media, advertising, and distribution businesses for years to come.
Reach, Scale & Impact:
More than ever, Disney’s technology and products serve as a signature doorway for fans' connections with the company’s brands and stories. Disney+. Hulu. ESPN. ABC. ABC News…and many more. These products and brands – and the unmatched stories, storytellers, and events they carry – matter to millions of people globally.
Innovation:
We develop and implement groundbreaking products and techniques that shape industry norms, and solve complex and distinctive technical problems.
The Big Data Infrastructure team
The Big Data Infrastructure team manages big data services such as Hadoop, Spark, Flink, Presto, and Hive. Our services are distributed across our data centers and the cloud, supporting large data volumes on thousands of physical resources. We focus on the virtualization of big data environments, cost efficiency, resiliency, and performance. The right person for this role has proven experience working with mission-critical infrastructure and enjoys building and maintaining large-scale data systems with the challenge of varied requirements and large storage capabilities. If you are someone who enjoys building large-scale big data infrastructure, then this is a great role for you.
Responsibilities
Develop, scale, and improve in-house, cloud, and open-source big data engines (e.g., Spark, Flink, Presto/Trino).
Investigate new big data technologies and apply them to the Disney Streaming production environment.
Build next-gen cloud-based big data infrastructure for batch and streaming data applications, and continuously improve performance, scalability, and availability.
Handle architectural and design considerations such as performance, scalability, reusability, and flexibility.
Advocate engineering best practices, including the use of design patterns, code review, and automated unit/functional testing.
Work with other engineering teams to influence big data system design and optimization.
Define and lead the adoption of best practices and processes.
Collaborate with senior internal team members and external stakeholders to gather requirements and drive implementation.
Collaborate efficiently with Product Managers and other developers to build datastores as a service.
Basic Qualifications
Experience building in-house big data infrastructure.
Experience developing and optimizing big data components (e.g., Spark, Flink, Presto/Trino).
Experience with modern data formats (Iceberg, Delta, Hudi, etc.).
Experience with CI/CD, fine-tuned metrics, and security and compliance enhancements.
Ability to drive a project end to end, including clarifying requirements, resolving conflicts, handling technical challenges, and delivering results.
Motivation to dive deep and become an expert in one or more big data areas.
Willingness to dig into open-source software to fix bugs or develop new features, and to contribute back to the community.
Preferred Qualifications
Experience building in-house big data infrastructure.
Experience contributing to big data components (e.g., HDFS, Hive, Spark, Flink, Presto/Trino).
Experience with container tech stacks, including Kubernetes, Docker, Volcano, etc.
Experience managing a big data cluster with over 1,000 nodes.
Required Education
7+ years of relevant professional experience and a Bachelor’s degree in Computer Science or a related field, OR 5+ years of relevant professional experience and a Master’s degree in Computer Science or a related field.
Additional Information
#DISNEYTECH #CDI
The hiring range for this position in New York, NY and Seattle, WA is $159,500 to $213,900 per year, in San Francisco, CA is $166,800 to $223,600 per year, and in Santa Monica, CA is $152,200 to $204,100 per year. The base pay offered will reflect internal equity and may vary by geographic region, knowledge, skills, and experience. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to benefits, depending on the level and position offered.