Keylent Inc
***Max rate $60 - 32267-1
***Onsite 5 days a week in Austin, TX
Sr. Developer
Job summary
Experience: 6 to 10 years
Required Skills - Apache Hadoop
Nice-to-have skills - Shell scripting, Python, Hive, Apache Sentry, Impala, Hadoop Administration, Delivery Management, Risk Management, Project Stakeholder Management, Cloudera Data Platform, Hadoop/Big Data Admin
We are seeking a Sr. Developer (Sr. Associate - Projects) with 6-10 years of experience, specializing in the Apache Hadoop ecosystem and Hive, to join our dynamic team. The role involves developing high-quality solutions, optimizing data processing, and ensuring data security. The ideal candidate will contribute to our mission by enhancing our data capabilities, making a positive impact on our global financial services.
Roles & Responsibilities
- Perform Big Data administration and engineering activities on multiple Hadoop, Kafka, HBase, and Spark clusters.
- Work on performance tuning and increase operational efficiency on a continuous basis.
- Monitor the health of the platforms, generate performance reports, and drive continuous improvements.
- Work closely with development, engineering, and operations teams on key deliverables, ensuring production scalability and stability.
- Develop and enhance platform best practices.
- Ensure the Hadoop platform can effectively meet performance SLA requirements.
- Take responsibility for the Big Data production environment, which includes Hadoop (HDFS and YARN), Hive, Spark, Livy, Solr, Oozie, Kafka, Airflow, NiFi, HBase, etc.
- Perform optimization, debugging, and capacity planning of a Big Data cluster.
- Perform security remediation, automation, and self-healing as required.