JUARA IT SOLUTIONS
Job Title: Integration Lead – EDI & Databricks
Location: North America
Notice Period: Immediate
Experience: 8+ years
Role Summary
We are seeking an Integration Lead with strong hands‑on experience in EDI integrations, Databricks development, Control‑M scheduling, and SQL to own end‑to‑end integration monitoring, testing, and production support. This role is primarily responsible for ensuring stability, reliability, and continuous enhancement of data flows involving Databricks, with accountability for both support and development activities.
The ideal candidate will act as the single point of ownership for integration failures and enhancements, working across upstream and downstream systems, and will actively fix issues as a developer rather than only coordinating.
Key Responsibilities
1. Integration Monitoring & Production Support (Primary)
Monitor EDI and Databricks‑based integrations to ensure seamless data flow across systems.
Perform proactive monitoring, health checks, and daily status reporting to stakeholders.
Execute and support testing for new and enhanced EDI and Databricks data flows.
Troubleshoot and resolve production incidents related to data failures, latency, or inconsistencies.
2. Databricks Development & Bug Fixing (Primary)
Perform hands‑on bug fixes in Databricks when failures occur in upstream or downstream integrations (an illustrative sketch follows this section).
Analyze root causes of integration issues and implement permanent fixes in Databricks notebooks, jobs, or pipelines.
Optimize existing Databricks data flows for performance, reliability, and scalability.
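To illustrate the kind of hands‑on fix this responsibility implies, the following is a minimal PySpark sketch, assuming a Databricks notebook where upstream rows with a null key were breaking a downstream join; all table and column names (staging.edi_orders_raw, curated.edi_orders, order_id) are hypothetical placeholders, not an actual client schema.

```python
# Minimal sketch of a defensive Databricks fix (PySpark). Table and column
# names are hypothetical placeholders, not a real client schema.
from pyspark.sql import functions as F
from delta.tables import DeltaTable

# `spark` is provided by the Databricks notebook/job runtime.
raw = spark.read.table("staging.edi_orders_raw")

# Assumed root cause: upstream occasionally sends rows with a null order key,
# which broke a downstream join. Quarantine those rows instead of failing the run.
valid = raw.filter(F.col("order_id").isNotNull())
rejected = raw.filter(F.col("order_id").isNull())
rejected.write.mode("append").saveAsTable("staging.edi_orders_rejected")

# Idempotent upsert into the curated Delta table so reruns are safe.
target = DeltaTable.forName(spark, "curated.edi_orders")
(target.alias("t")
    .merge(valid.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```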
3. Data Flow & Job Creation (Primary)
Design and develop new data flows using Databricks, SQL queries, and SQL stored procedures based on business requirements (see the data‑flow sketch after this section).
Create, modify, and maintain batch jobs and workflows using Control‑M.
Ensure proper dependency handling between upstream source systems, Databricks, and downstream consumers.
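As a sketch of this kind of data‑flow work, the script below shows a parameterized Databricks job step that a Control‑M job could invoke on a schedule; the table names, the curated.daily_shipments target, and the --run_date parameter are illustrative assumptions rather than a defined design.

```python
# Minimal sketch of a new Databricks data flow implemented as a parameterized
# Python job step; a Control-M job could invoke it and manage dependencies.
# All table names and the --run_date parameter are illustrative assumptions.
import argparse
from pyspark.sql import SparkSession

parser = argparse.ArgumentParser()
parser.add_argument("--run_date", required=True)  # e.g. 2024-01-31, passed by the scheduler
args = parser.parse_args()

spark = SparkSession.builder.getOrCreate()

# Delete-then-insert keeps the daily load idempotent if the scheduler retries the job.
spark.sql(f"DELETE FROM curated.daily_shipments WHERE ship_date = DATE '{args.run_date}'")

spark.sql(f"""
    INSERT INTO curated.daily_shipments
    SELECT s.shipment_id,
           s.order_id,
           s.ship_date,
           SUM(l.quantity) AS total_units
    FROM   staging.shipments s
    JOIN   staging.shipment_lines l ON l.shipment_id = s.shipment_id
    WHERE  s.ship_date = DATE '{args.run_date}'
    GROUP  BY s.shipment_id, s.order_id, s.ship_date
""")
```

In a setup like this, upstream and downstream dependency ordering would be handled by the Control‑M job definitions rather than by the script itself.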
4. EDI Integration Responsibilities (Primary)
Support and manage EDI integrations, including coordination with third‑party EDI VAN providers.
Handle EDI testing, monitoring, and resolution of data discrepancies (a simple reconciliation sketch follows this section).
Ensure data accuracy, reconciliation, and compliance with integration standards.
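As a minimal sketch of the reconciliation work this implies, the Python function below counts X12 transaction sets and checks each SE segment's declared segment count; it assumes the common "~" segment terminator and "*" element separator, and the file name is a placeholder. Production code would read the actual delimiters from the ISA header.

```python
# Minimal sketch of an X12 reconciliation check. Assumes the common "~" segment
# terminator and "*" element separator; production code should take the actual
# delimiters from the ISA header. The file name is a placeholder.
def reconcile_x12(path: str) -> None:
    with open(path, "r", encoding="utf-8") as f:
        segments = [s.strip() for s in f.read().split("~") if s.strip()]

    tx_count = 0
    current = []  # segments of the transaction set currently being scanned
    for seg in segments:
        elements = seg.split("*")
        tag = elements[0]
        if tag == "ST":                      # start of a transaction set
            current = [seg]
            tx_count += 1
        elif tag == "SE":                    # end of a transaction set
            current.append(seg)
            declared = int(elements[1])      # SE01: segment count incl. ST and SE
            if declared != len(current):
                print(f"Set {elements[2]}: declared {declared} segments, found {len(current)}")
            current = []
        elif current:
            current.append(seg)

    print(f"Transaction sets found: {tx_count}")


if __name__ == "__main__":
    reconcile_x12("sample_850.edi")  # placeholder path
```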
5. Environment & Release Support
Support environment upgrades and changes across upstream, Databricks, and downstream systems.
Participate in release activities, including validation and post‑release monitoring.
6. Documentation & Compliance (Secondary)
Maintain technical and operational documentation for integrations and data flows.
Support audit and compliance documentation as required.
Mandatory / Must-Have Skills
Strong hands‑on experience with EDI integrations (monitoring, testing, troubleshooting).
Databricks development experience, including notebooks, jobs, and data pipelines.
Ability to perform bug fixing and enhancements in Databricks as a developer.
Experience with Control‑M for batch job scheduling and workflow management.
Strong SQL skills, including complex queries and stored procedures.
Proven experience in production support and integration monitoring.
Strong analytical, troubleshooting, and communication skills.
Nice-to-Have / Preferred Skills
Exposure to Model N or other pharma / life sciences platforms.
Experience with ETL tools such as IBM DataStage or equivalent.
Prior experience working in regulated environments (pharma, healthcare, life sciences).
Experience with audit support and compliance documentation.
Experience Requirements
Overall experience: 8+ years in integration, data engineering, or production support roles.
Relevant experience: 3–5+ years hands‑on with Databricks, EDI, batch scheduling, and SQL.
Role Characteristics
Hands‑on lead‑level role with direct responsibility for fixing issues, not just coordination.
Combination of L2/L3 production support and active development.
Requires strong ownership mindset and ability to work directly with business and technical stakeholders.