Jobs via Dice
Overview
Role: Data Architect
Location: Cincinnati, OH (Onsite)
Experience: 14+ years (Banking/Financial services experience required)
Responsibilities
- Design and implement robust, scalable data pipelines and ETL processes.
- Develop and maintain data models and data products that support business needs.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data products.
- Monitor and optimize the performance of data systems and infrastructure.
- Advocate for data best practices and contribute to the evolution of the data platform.
Required Qualifications
- Proficiency in BI/data engineering, Elevate & DBT, ETL (DataStage), Kafka, and Snowflake.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Proven experience with relational and non-relational databases (e.g., SQL, PostgreSQL).
- Proficiency in programming languages such as Python and Java, and in data pipeline tools (e.g., DBT).
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and modern data warehouses (e.g., Snowflake).
- Hands-on expertise in IBM DataStage for building and managing ETL processes.
- Knowledge of big data technologies (preferably Kafka) is a plus.
Preferred Qualifications
- Experience with CI/CD (Jenkins/MettleCI), GitHub, and scripting (Python, SAS, SQL, Java).
- Familiarity with Agile methodologies and DevOps practices (working as part of an Agile squad; SAFe, Scrum).
- Prior experience working in financial institutions or other highly regulated industries.
Job Function
Engineering and Information Technology
Industry
Software Development