First Fidelity Bank
Job Details
Level Experienced
Job Location Corporate Office - Oklahoma City, OK
Position Type Full Time
Education Level 4 Year Degree
Travel Percentage None
Job Shift Day
Description
SUMMARY
This position is NOT a remote position. It is on-site at our Corporate office in Oklahoma City.
The Senior Data Engineer is part of the Data Analytics team that supports the entire organization in developing, programming, and maintaining the FFB Enterprise Data Environment. Duties include developing and maintaining enterprise data integration points and pipelines across a hybrid cloud environment. In this role, the Senior Data Engineer will design and maintain the processes required to manage the various enterprise data assets and workflows. They must be self-directed and comfortable supporting the data needs of FFB's business units, core banking systems, customer platforms, and business processes, while ensuring compliance with regulatory requirements and company policies and standards.
PRIMARY DUTIES/RESPONSIBILITIES:
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time with or without notice.
Create and maintain an optimal data architecture and data pipelines designed to scale with current and future enterprise needs.
Organize and assemble large, complex data sets from multiple sources (Jack Henry SilverLake, Salesforce, etc.) into data warehouses, data marts, and data cubes that meet business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, improving data quality controls, and building data infrastructure for scalability and resilience.
Build the infrastructure and tools required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL, Python, AWS Glue, and other technologies.
Work with team members to support data analytics and forecasting that use our data pipelines to provide actionable insights into operational efficiency, customer behavior, and key performance indicators.
Work with stakeholders to resolve data-related technical issues and support data infrastructure needs across departments.
Ensure that data processes are properly segmented, encrypted, and secured across network boundaries (on-prem and cloud) through AWS S3, Aurora PostgreSQL, and AWS Glue workflows.
Collaborate with analytics, engineering, and business subject matter experts to support system enhancements and improved business reporting.
Develop and support deployment operations (DevOps) and data operations (DataOps) principles and workflows.
Ensure compliance with industry regulations and bank policies and procedures.
Qualifications
EXPERIENCE REQUIREMENTS:
4+ years of experience in a Data Engineer, Architect, and/or Analyst role.
4+ years of experience in the banking industry.
2+ years of experience programming in Python.
EDUCATION REQUIREMENTS:
Bachelor's and/or Graduate degree in Computer Science, or another quantitative field.
OTHER REQUIREMENTS (SKILLS, ABILITIES, CHARACTERISTICS):
Advanced working SQL knowledge and experience with relational databases, complex query authoring, and graph databases.
Experience building and optimizing scalable data pipelines, data architectures, and data sets.
Strong analytical skills for working with structured and semi-structured datasets.
Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
Excellent interpersonal skills for working with technical and non-technical colleagues across the organization.
Experience with the following is preferred:
Core Banking Platforms: Jack Henry SilverLake
Data Integration: Salesforce; REST and SOAP APIs
Databases: PostgreSQL, SQL Server, AWS Neptune or Neo4j
Programming/Scripting Languages: Python, PowerShell
Query Languages: SQL & T-SQL, PostgreSQL & PL/pgSQL, Cypher, Gremlin or SPARQL
Cloud Platforms: AWS (PostgreSQL, Redshift, Neptune, Glue, S3, EC2)
BI & Visualization: DOMO, Power BI
ADDITIONAL INFORMATION
SUPERVISORY RESPONSIBILITY:
None
PHYSICAL REQUIREMENTS:
Must be able to work within a routine office environment. Ability to travel from one office location to another.
EOE D/V