Strategic Staffing Solutions
Lead Cloud Data Platform Engineer
Strategic Staffing Solutions, Phoenix, Arizona, United States, 85003
STRATEGIC STAFFING SOLUTIONS (S3) HAS AN OPENING!
Strategic Staffing Solutions is currently looking for a Lead Cloud Data Platform Engineer, a W2 contract opportunity with one of our largest clients!
Candidates should be willing to work on our W2 ONLY, NO C2C.
Job Title: Lead Cloud Data Platform Engineer
Role Type: W2 only
Duration: 12 months
Locations: Phoenix (AZ), Dallas, Charlotte, Philadelphia
Schedule: Onsite
Required Skills: 5+ years of data engineering with Hadoop and GCP; data lake architecture and design; NoSQL databases
Job Summary:
We are looking for someone to join our team to help enable solutions for the next generation of data analysis while working in a fast-paced environment that fosters growth and development. If this sounds like you, we want to hear from you.
We are currently seeking a Lead Cloud Data Platform Engineer to apply deep technical skills to create data products, develop AI-based automation tools, and build a world-class cloud analytics capability for the company's Cyber Security Data Ecosystem on a hybrid cloud. The successful candidate will continually innovate, pioneer the use of new technologies, and drive their adoption among a team of talented data engineers. You will be an integral part of our migration from on-premises systems to Google Cloud. You will be a vital member of an agile team helping to lead the design and hands-on implementation of modern data processing capabilities. This is a visible role that allows you to share your knowledge and skills with other developers and product teams in a collaborative environment.
Responsibilities:
In this role you will:
• Implement and operationalize modern self-serve data capabilities on Google Cloud to ingest, transform, and distribute data for a variety of big data applications
• Enable secure data pipelines that protect data in transit and at rest
• Automate data governance capabilities to ensure proper data observability throughout the data flows
• Leverage AI/agentic frameworks to automate data management, governance, and data consumption capabilities
• Create repeatable processes to instantiate the data pipelines that fuel analytics products and business decisions
• Work with principal engineers, product managers, and data engineers to roadmap, plan, and deliver key data capabilities by priority
• Create the future of data: design and implement processes using the full GCP toolset
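To make the ingest-transform-distribute pattern above concrete, here is a minimal sketch of the kind of PySpark job this role would run on Dataproc. It is illustrative only, not the client's codebase: the bucket, field names, dataset, and table are hypothetical placeholders, and the BigQuery write assumes the spark-bigquery connector that Dataproc preinstalls.

```python
# Illustrative sketch only: bucket, fields, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cyber-events-ingest").getOrCreate()

# Ingest: raw JSON security events landed in Cloud Storage.
events = spark.read.json("gs://example-raw-zone/cyber-events/")

# Transform: normalize timestamps and drop malformed records.
cleaned = (
    events
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

# Distribute: write to BigQuery for downstream analytics
# (assumes the spark-bigquery connector available on Dataproc).
(
    cleaned.write
    .format("bigquery")
    .option("table", "example_project.security_lake.cyber_events")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("append")
    .save()
)
```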
Required Skills:
• 5+ years of experience in data engineering, including hands-on experience with Hadoop and Google Cloud data solutions (building and supporting Spark-based processing and Kafka streaming) in a highly collaborative team
• 3+ years of experience with data lakehouse architecture and design, including hands-on experience with Python, PySpark, Apache Kafka, Airflow, SQL, GCP Cloud Storage, BigQuery, Dataproc, and Cloud Composer
• 2+ years working with NoSQL databases, such as columnar, graph, document, and key-value stores, and their associated data formats
• Public cloud certifications such as GCP Professional Data Engineer, Azure Data Engineer Associate, or AWS Certified Data Analytics Specialty
Desired Qualifications:
• Proven skills with data migration from on-prem to a cloud-native environment
• Proven experience with Hadoop ecosystem capabilities such as Hive, HDFS, Parquet, Iceberg, and Delta Lake tables
• Deep understanding of data warehouse and data cloud architecture, building data pipelines, and orchestration (see the sketch after this list)
• Design and implementation of highly scalable, modular data pipelines with built-in controls that automate data governance
• Familiarity with GenAI frameworks such as LangChain and LangGraph for developing agent-based data capabilities
• DevOps and CI/CD deployment experience, including Git, Jenkins, Docker, and Kubernetes
• Web-based UI development using React and Node.js is a plus
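For context on the orchestration qualification above, a minimal sketch of a Cloud Composer (managed Airflow) DAG that schedules the kind of Dataproc job sketched earlier. Purely illustrative: the DAG id, project, region, cluster, and file paths are hypothetical placeholders, not the client's actual setup.

```python
# Illustrative sketch only: DAG id and job parameters are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

with DAG(
    dag_id="cyber_events_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark ingest job to an existing Dataproc cluster.
    ingest = DataprocSubmitJobOperator(
        task_id="ingest_cyber_events",
        project_id="example-project",
        region="us-central1",
        job={
            "placement": {"cluster_name": "example-cluster"},
            "pyspark_job": {
                "main_python_file_uri": "gs://example-code/ingest_cyber_events.py"
            },
        },
    )
```

Because Cloud Composer runs standard Airflow, the same DAG pattern carries over largely unchanged when migrating orchestration from an on-prem Airflow deployment.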
"Beware of scams. S3 never asks for money during its onboarding process."