Engage Partners Inc.
Job Title: Epic Caboodle Data Warehouse Engineer
Location: Southeast USA
Position Type: Full-Time, Remote with minimal travel. Must reside in Florida or be willing to relocate to Florida at own expense.
Base Pay Range: $80,000.00/yr - $120,000.00/yr

About the Role
A hospital in Florida is seeking a highly technical Epic Caboodle Data Warehouse Engineer to lead the build and implementation of its enterprise data warehouse environment. This role focuses on data engineering, ETL development, and system optimization to enable reliable reporting and analytics across the hospital. The engineer will work closely with the Epic Caboodle and Clarity data models, ensuring efficient data extraction, transformation, and integration into the Caboodle warehouse.

Key Responsibilities
- Implement, configure, and optimize the Epic Caboodle Data Warehouse in alignment with enterprise IT and data architecture standards.
- Develop and maintain ETL workflows (Epic ETL, SSIS, or equivalent) to populate Caboodle data models with high-quality, validated data.
- Write and optimize complex SQL queries to support downstream reporting, analytics, and research initiatives.
- Partner with Epic Clarity developers to ensure data pipelines from Clarity into Caboodle are efficient, reliable, and scalable.
- Monitor performance, troubleshoot issues, and tune queries and processes for efficiency.
- Collaborate with BI/Analytics teams to ensure Caboodle provides accurate, timely, and well-structured data sources.
- Document technical workflows, schema mappings, and metadata for long-term maintainability.
- Ensure data security and compliance with HIPAA and organizational standards.
- Provide technical leadership during Epic upgrades, patching, and integration of new modules into Caboodle.

Qualifications

Required:
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field (or equivalent experience).
- Epic Caboodle Data Warehouse Certification (or the ability to obtain it within the required timeframe).
- Minimum 3–5 years of technical experience with the Epic Caboodle and/or Epic Clarity data models.
- Proficiency in SQL Server and strong relational database design skills.
- Hands-on experience with ETL tools (Epic ETL, SSIS, Informatica, or similar).
- Experience with data performance tuning, query optimization, and troubleshooting.

Preferred:
- Epic Clarity Data Model certification(s).
- Experience with cloud data platforms (Azure, AWS, or GCP).
- Familiarity with BI/analytics tools (Power BI, Tableau, Qlik) for validating data outputs.
- Experience with DevOps/CI/CD pipelines, source control (Git), and automated testing in a data engineering environment.
- Healthcare IT experience, especially with EHR data.
- Strong technical problem-solving ability and attention to detail.
- Ability to work independently on complex engineering tasks and to collaborate across multidisciplinary teams.
- Clear and concise technical documentation skills.
- Ability to balance performance, scalability, and compliance in data architecture.