Jobs Bridge Inc is among the fastest-growing IT staffing and professional services organizations and operates its own job portal. Jobs Bridge works closely with a large number of IT organizations in high-demand technology skill sets.
Job Description
Skills: Big Data Developer, Java, Hadoop, DaaS, Hive, MapReduce, Pig, Sqoop, Flume, SQL, XML, JSON
Location: Glendale, CA
Total Experience: 3 years
Max Salary: Not Mentioned
Employment Type: Full Time (Direct Jobs)
Domain: Any
Description
OPT and EAD candidates can apply. The Big Data Developer will be responsible for delivering Business Intelligence efforts, managing existing data extraction jobs, and building new data pipelines from various sources into Hadoop. The role involves working with heterogeneous data models, data mapping and transformation, and delivering data as a service (DaaS).
Required Skills and Experience
At least 5 years of hands-on experience in the Java Enterprise ecosystem, including design, development, testing, and deployment.
At least 2 years of experience with Hadoop, Hive, MapReduce, Pig, Sqoop, and Flume in a production environment.
Experience with data segmentation, organization, security, and encryption models.
Proficiency with Hadoop/HBase/Hive/MRV1/MRV2.
Experience orchestrating complex data flows; familiarity with Apache Spark, Storm, or Kafka is preferred.
Strong skills in SQL, XML, JSON, and UNIX.
Experience integrating heterogeneous applications and designing RESTful Web Services.
Knowledge of open source tools in the Java Enterprise ecosystem.
Ability to collaborate effectively and resolve infrastructure issues.
Knowledge of Data Governance, IBM CDC, Watson Explorer, or Apache Solr is a plus.
Ability to work independently and as part of a team, with excellent communication skills.
Education
Bachelor's degree or equivalent required.
Additional Information
Multiple openings available for OPT/CPT/H4/L2/EAD/Citizens.