Data Integration Engineer
Cotiviti US & Canada, Myrtle Point, Oregon, United States, 97458
Overview
Edifecs/Cotiviti is seeking a Data Integration Engineer to join our innovative software teams. In this position, you will be responsible for onboarding customers to the Risk Adjustment workflow applications. As part of this role, you will work with platform engineering, product, and implementation teams. The ideal candidate has experience building scalable data pipelines that power workflow applications and reporting and earn customer confidence. You must have strong, hands-on technical expertise in big data technologies and the ability to communicate effectively with both technical and non-technical audiences.
Responsibilities
Design and develop data flows and extraction processes
Integrate clients’ data into the Edifecs/Cotiviti Risk Adjustment workflow applications
Work with relational databases and Python
Implement open-source data standards
Develop a new framework for data ELT jobs to scale implementation and monitoring
Build and scale automation that orchestrates complex data workflows
Support existing processes while leading efforts to redefine the data pipelines
Qualifications
3+ years of hands‑on experience with relational database systems (PostgreSQL).
2+ years of hands‑on experience developing ELT jobs/data pipelines using technologies such as Argo Workflows, cron, Airflow, dbt, and Great Expectations.
2+ years of experience with Linux systems.
Proficiency in shell scripting and automation.
Experience building reusable data methods and working with version control systems (git).
3+ years of hands‑on experience with programming languages such as Python.
Plus - Experience with container deployment platforms and tools, such as Kubernetes, Docker, Helm, and Terraform.
Plus - AWS Cloud experience: EC2, RDS, SQS, IAM.