Jobsbridge

Big Data Consultant

Jobsbridge, Santa Ana, California, United States, 92725


Jobsbridge, Inc. is a fast-growing, Silicon Valley-based IT staffing and professional services company specializing in Web, Cloud, and Mobility staffing solutions. Whether you need core Java or full-stack Java developers, Web/UI designers, or Big Data, Cloud, or Mobility developers/architects, we have them all.

Job Description

- Responsible for Big Data (1.3) / data warehouse maintenance, cleanup, and monitoring under heavy loads.
- Acts as a lead in identifying and troubleshooting processing issues that impact the timely availability of data in the data warehouse or the delivery of critical reporting within established SLAs.
- Identifies and recommends improvements to production application solutions or operational processes in support of data warehouse applications and business intelligence reporting (e.g., data quality, performance, static and dynamic reports).
- Focuses on the overall stability and availability of the Big Data applications and their associated interfaces and data transport protocols.
- Researches, manages, and coordinates resolution of complex issues through root cause analysis as appropriate.
- Ensures adherence to established problem/incident management, change management, and other internal IT processes.
- Ensures that third-party vendors engaged in projects deliver on their responsibilities and conform to Gap’s established standards.
- Ensures comprehensive knowledge transition from development teams on new or modified applications moving into ongoing production support.
- Seeks improvement opportunities in design and solution implementation approaches, in partnership with Architects and the Operations team, to ensure the performance and health of the Big Data and other EDW applications.
- Participates in production migrations and upgrades; develops processes for sustaining the Big Data, EDW, and BI environment and ensures their implementation.
- Ensures timely and accurate escalation of issues to management.

Technical Skills

- 7+ years of technical experience in Enterprise Data Warehouse and Business Intelligence environments.
- 3+ years developing and supporting data integration in Hadoop (1.3) with more than 10 clusters is a must.
- 2-3 years developing or supporting applications using Talend, Pig, Hive, Python, Spark, and HBase.
- Advanced experience with real-time data ingestion (Kafka, MQ, etc.) and ingestion using Talend into a Hortonworks or Cloudera environment is a plus.
- Strong background in technology, data, and big data / data warehouse application design.
- Experience with various databases (Teradata, HBase) is preferred.
- Extensive experience with production batch scheduling and monitoring (CAWA, Oozie, etc.).

Qualifications