Palo Alto Networks
Sr Principal Engineer Software (Big Data)
Palo Alto Networks, Santa Clara, California, US 95053
Prisma Access (formerly GlobalProtect Cloud Service) delivers protection straight from the cloud to make access to the cloud secure. It combines the connectivity and security you need and delivers it everywhere you need it, using cutting-edge public and private cloud technologies to extend next-generation security protection to all cloud services, customers' on-premises remote networks, and mobile users.

We are seeking an experienced Big Data Software Engineer to design, develop, and deliver next-generation technologies within our Prisma Access team. We want passionate engineers who love to code and build great products, and who bring new ideas to all facets of software development. Collaboration and teamwork are at the foundation of our culture, and we need engineers who can communicate and work well with others toward a common goal. This role is located at our Santa Clara, CA headquarters.

Your Impact
- Design, develop, and implement highly scalable software features on our next-generation security platform as part of Prisma Access
- Work with development and quality assurance groups to achieve the best quality
- Suggest and implement improvements to the development process
- Work with DevOps and Technical Support teams to troubleshoot customer issues

Your Experience
- 6+ years of development experience
- Experience developing services in the cloud/Kubernetes
- Experience building data and analytics pipelines using services such as Dataflow, Pub/Sub, and GKE
- Strong understanding of message queuing, stream processing, and highly scalable "big data" data stores
- Experience with RESTful interfaces and build management tools (Gradle, Maven)
- Experience with continuous integration and design
- Experience with test-driven development
- Experience with distributed computing and object-oriented design and analysis
- Strong understanding of microservices-based deployments, with the ability to design services
- Demonstrated command of the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, peer review, and operations
- Familiarity with Agile (e.g., the Scrum process)
- Familiarity with big data technologies such as Hive, Kafka, Hadoop, and SQL, and with developing APIs
- Familiarity with GCP or other cloud platforms such as AWS and Azure
- High energy and the ability to work in a fast-paced environment with a can-do attitude
- Strong collaboration and communication skills; enjoys working with many different teams
- Fast learner, eager to absorb new and emerging technologies
- M.S./B.S. degree in Computer Science or Electrical Engineering