Global Channel Management

Hadoop Developer

Global Channel Management, Atlanta


Global Channel Management is a technology company specializing in recruiting and staff augmentation. Our account managers and recruiters have over a decade of experience across various verticals. GCM understands the challenges companies face in finding the skills and experience needed to fill day-to-day functions. Organizations aim to reduce training and labor costs while securing the best talent for the job.

Qualifications

The Hadoop Developer role requires 5+ years of experience with Python and Java/Scala. Additional requirements include:

- B.S. and M.S. in mathematics, computer science, or engineering
- 3+ years of demonstrated technical proficiency with Spark, big data projects, and data modeling
- Experience designing and developing data ingestion and processing frameworks using tools such as ADF, NiFi, Sqoop, and Eclipse
- At least 5 years of experience in the Big Data space with Apache Spark, Hive, and MapReduce
- Experience with Big Data ETL tools such as Spark, Hive, Kafka, Sqoop, MapReduce, Scala, and Zookeeper
- Experience with Cloudera or Hortonworks; exposure to Teradata is beneficial

Hadoop Developer Duties

- Translate complex functional and technical requirements into detailed designs
- Develop and implement technical solutions in Spark
- Load disparate datasets leveraging big data technologies such as Kafka