The Dignify Solutions, LLC
Data Engineer (GCP / Spark / PySpark & Scala Functional)
The Dignify Solutions, LLC, Bentonville, Arkansas, United States, 72712
Overview
Compensation
Pay range provided by The Dignify Solutions, LLC. Your actual pay will be based on your skills and experience; talk with your recruiter to learn more.
Base pay range: $100,000.00/yr - $120,000.00/yr

Responsibilities
Note: The original description does not provide explicit role responsibilities beyond required technical skills. The content below reflects the qualifications and skills listed in the original description.

Qualifications / Technical & Functional Skills
- Tech skills: Scala, Spark, GCP, Dataproc, Hadoop, Airflow, SBT, Maven, Docker, Kubernetes, PySpark, Jenkins, BigQuery
- Experience with workflow management tools such as Jenkins and Airflow
- Experience running Spark/Hadoop workloads using Dataproc, Dataflow, Cloud Composer, EMR, HDInsight, or similar
- Working expertise with big data technologies: Spark, PySpark, Hive, and SQL
- Expertise in writing complex, highly optimized queries across large data sets
- Knowledge of and experience with Kafka, Storm, Druid, and Presto

Seniority level
Mid-Senior level

Employment type
Full-time

Job function
Other

Industries
IT Services and IT Consulting

Note: This listing included additional postings and location data, which have been omitted to keep the description focused on this role. The original content also included referral notes and multiple salary postings not relevant to this specific position.