Big Data Engineer - Charlotte, NC
CapB InfoteK - Charlotte, North Carolina, United States, 28245
Overview
The ETL Hadoop Data Engineer will be responsible for analyzing business requirements and for designing, developing, and implementing highly efficient and scalable ETL processes. Candidates are expected to perform daily project functions with a focus on meeting business objectives on time in a rapidly changing work environment, and to lead and drive a globally distributed team toward those objectives.

Required Skills:
- 5-10 years of hands-on experience with Informatica PowerCenter, Hadoop, and related ecosystem components (Hive, Impala, Spark)
- Strong knowledge of relational databases such as Teradata, DB2, Oracle, and SQL Server
- Experience writing shell scripts on Unix platforms
- Experience with data warehousing, ETL tools, and MPP database systems
- Understanding of data models (Conceptual, Logical, Physical, Dimensional, Relational)
- Ability to analyze functional specifications and assist in designing technical solutions
- Ability to identify data sources and define data extraction methodologies
- Proficiency in writing complex queries in Teradata, DB2, Oracle, and PL/SQL
- Experience maintaining batch processing jobs and responding to critical production issues
- Good communication skills for stakeholder interaction
- Strong experience with data analysis, profiling, and root cause analysis
- Ability to understand banking system processes and data flow
- Ability to work independently, lead, and mentor teams
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering and Information Technology