G2O
Overview
We’re passionate about designing and delivering top-notch digital experiences for our clients — and their customers — and helping them create efficiencies using data and technology. We have a diverse team of experts dedicated to getting clients from goals to outcomes. This is a hybrid position requiring three days a week in our Columbus, OH offices.
Role Overview
This role focuses on building robust and scalable data pipelines to extract data from a variety of sources, integrate it with our data lake/warehouse, and prepare it for analysis by our Data Analysts and for training custom AI models. This position is critical to our near-term focus on vendor-provided capabilities and to eventually building custom solutions.
Key Responsibilities
- Design, build, and maintain scalable and efficient ETL/ELT data pipelines that ingest data from internal and external sources (e.g., APIs from EPIC and Workday, relational databases, flat files) into our data lake/warehouse, ensuring data is clean, accessible, and ready for analysis and model training.
- Collaborate with the Data Analyst(s) and other stakeholders to understand their data requirements and provide them with clean, well-structured datasets.
- Implement data governance, security, and quality controls to ensure data integrity and compliance.
- Automate data ingestion, transformation, and validation processes.
- Work with our broader IT team to ensure seamless integration of data infrastructure with existing systems.
- Contribute to the evaluation and implementation of new data technologies and tools.
Required Skills & Qualifications
- Informatica: Hands-on experience with IICS/IDMC is required.
- ETL/ELT Development: Strong experience in designing and building data pipelines using ETL/ELT tools and frameworks.
- SQL: Advanced proficiency in SQL for data manipulation, transformation, and optimization.
- Programming: Strong programming skills in Python (or a similar language) for scripting, automation, and data processing.
- Data Warehousing: Experience with data warehousing concepts and technologies.
- EPIC EMR: Experience with EPIC Health System’s databases (Chronicles, Clarity, or Caboodle) and the ability to extract data and build ETL pipelines.
- Problem-Solving: Proven ability to troubleshoot and resolve data pipeline issues.
- Data Modeling: Experience with various data modeling techniques (e.g., dimensional modeling).
- Real-time Processing: Familiarity with real-time data streaming technologies (e.g., Kafka, Azure Event Hubs).
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Nice-to-Have Skills
- API Integration: Experience building data connectors and integrating with APIs from major enterprise systems (e.g., EPIC, Workday).
- Cloud Computing: Hands-on experience with at least one major cloud platform's data services (e.g., Microsoft Azure Data Factory, Azure Fabric, IICS, IDMC).
- Version Control: Proficiency with Git for code management and collaboration.
- CI/CD: Knowledge of Continuous Integration/Continuous Deployment practices for data pipelines.
- AI/ML & MLOps: A basic understanding of the machine learning lifecycle and how to build data pipelines to support model training and deployment.
- Microsoft Fabric: Direct experience with Microsoft Fabric's integrated data platform (OneLake, Data Factory, Synapse Data Engineering).
About G2O
We blend research and design, technology, and data expertise to deliver the solutions our clients crave — and we do all of this as one in-house team, from vision to execution. We’re also the largest company of our kind based in Ohio — and we’ve been evolving how we do it for 40 years. Individually, we bring a wealth of experience from diverse backgrounds, both personal and professional. We’re a diverse and passionate team of leaders and experts in technology, data, analytics, design, content, and more. Each person brings something distinct to our team, which strengthens collaboration and elevates client outcomes. Are you ready to collaborate to greatness with us?
Company Details
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Consulting, Engineering, and Information Technology
Industries: IT Services and IT Consulting, Data Infrastructure and Analytics, and IT System Custom Software Development