iO Associates
Data Engineer - Job Opportunity
Job Description:
A leading healthcare organization is seeking a skilled and motivated Data Engineer to join its dynamic data and analytics team. Whether you bring years of experience or are early in your professional journey, this opportunity offers a rewarding and impactful career path.
The Data Engineer will be responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. This role plays a vital part in building and managing data pipelines that support efficient and reliable data integration, transformation, and delivery across the organization. The position also involves creating BI solutions to derive insights, monitor key metrics, and enhance system visibility across operational and customer-facing areas.
Key Responsibilities:
- Design and develop data pipelines for extraction, transformation, and loading (ETL) from various data sources.
- Integrate data from diverse systems, including databases, APIs, and external platforms.
- Analyze, design, and document BI solutions in line with established standards and best practices.
- Collaborate with team members to build knowledge and ensure consistency in information delivery.
- Troubleshoot and resolve issues related to reporting, ETL processes, and data quality.
- Ensure data consistency and integrity during integration, applying data validation and cleaning processes.
- Apply data cleansing, aggregation, filtering, and enrichment techniques to transform raw data.
- Optimize workflows for performance, scalability, and efficiency.
- Monitor and fine-tune systems, resolving performance issues and implementing enhancements.
- Incorporate data quality checks and validations to maintain accuracy and completeness.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Mathematics, or a related field (or equivalent experience).
- Minimum of 6 years of experience in data management, including integration, modeling, optimization, and quality assurance.
- Proven experience in developing and maintaining data warehouses, particularly within big data platforms (e.g., Snowflake).
- Expertise in designing and deploying data solutions to support AI, ML, and BI.
- Proficiency with tools/languages such as SQL, Python, R, SAS, and Excel.
- Strong understanding of modern data architectures, including cloud services (e.g., AWS, Azure, GCP) and tools like Databricks.
- Experience with various database technologies (SQL, NoSQL, Oracle, Hadoop, Teradata).
- Ability to collaborate across technical and non-technical teams.
- Excellent problem-solving and debugging skills.
- Strong business acumen and communication skills to influence decision-making across departments.
- Ability to articulate technical concepts and business use cases to varied audiences.

Preferred Qualifications:
- Master's degree in Information Systems, Business Intelligence, or a related field.
- Familiarity with Apache technologies such as Kafka, Airflow, and Spark.
- Programming experience with languages such as Java, Python, or C/C++.
- Relevant industry certifications (e.g., Epic Caboodle Developer Certification).
Equal Opportunity Employer:
This organization is an equal opportunity employer and values diversity at every level of the workforce. All qualified applicants will receive consideration without regard to race, color, religion, gender, sexual orientation, gender identity, national origin, veteran status, or disability status.