Tier4 Group
Epic Data Engineer, Cogito & Clarity 4603
Tier4 Group, Atlanta, Georgia, United States, 30383
Overview
Epic Data Engineer role focused on the Epic Clarity & Caboodle data stack within a healthcare environment. The Senior Data Engineer will design, develop, and implement data architecture solutions, mentor junior engineers, collaborate with cross-functional teams, and contribute to the ongoing evolution of the enterprise data architecture. The ideal candidate has deep experience with the Epic data ecosystem, SQL Server technologies, data modeling, governance, and reporting tools such as Power BI and Tableau.

Responsibilities
- Design, build, and maintain robust data pipelines and data processing frameworks across hybrid environments (on-premises and cloud).
- Ensure seamless integration of data services with other platform technologies and applications.
- Develop well-governed, high-quality data products that meet business and analytical requirements.
- Optimize and enhance ETL/ELT processes for performance, scalability, and reliability.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver actionable solutions.
- Serve as a subject matter expert for Epic Clarity and Caboodle data models.
- Evaluate and integrate new data sources and modern data acquisition models (e.g., APIs, pub-sub, event-driven architectures).
- Stay current with emerging data engineering tools and technologies; drive adoption of industry best practices.
- Propose architectural enhancements and improvements to data transformation processes.

Support & Governance
- Participate in the on-call support rotation for critical data systems and pipelines.
- Troubleshoot and resolve issues in both new and existing data solutions.
- Ensure compliance with internal policies, data governance standards, and the applicable organizational code of conduct.

Required Qualifications
Bachelor’s degree in Computer Science or a related field, or 4+ years of professional experience in lieu of a formal degree.

Certifications
- Epic Caboodle and Clarity Data Model certification
- Additional Epic certifications are a plus

Experience
Minimum 6 years of experience with:
- SQL and NoSQL database systems
- ETL/ELT tools (including SSIS and SSAS)
- Data warehousing and data lake architecture
- Data modeling and governance in on-prem and cloud environments
- Epic Rev Cycle, Clinical, and Access data models, specifically Clarity and Caboodle
- Cloud platforms such as Azure and Snowflake (preferred)
- APIs and event-driven data integration patterns

Technical Skills
- Proficiency in SSIS/SSAS and Microsoft Excel
- Power BI and Tableau
- Solid understanding of distributed systems
- Software development life cycle and agile methodologies
- Excellent problem-solving, communication, and collaboration skills

Location: Prefer local to Atlanta, but open to remote in the following states: AL, AR, FL, GA, IL, LA, MI, NH, NC, OH, PA, SC, TN, TX, VA, or WI