Canal Insurance Company

ETL Team Lead

Canal Insurance Company, Greenville, South Carolina, US, 29610



Overview

Canal Insurance Company specializes in insurance for commercial trucking and specialty transportation operations. Founded in 1939 and located in Greenville, South Carolina, Canal recognizes that its success depends on the hard work and dedication of its employees. The company cultivates a culture that enables the recruitment and retention of the best talent, balancing happiness and productivity.

Culture

Located in beautiful downtown Greenville, SC

Career growth & advancement opportunities

Comprehensive benefits package

Employee referral program

Casual dress code

Innovation-focused & customer-centric

80+ years of industry expertise

Committed to giving back to our community

Unquestioned integrity and commitment

Benefits

Basic & voluntary life insurance plans

Medical, dental, & vision coverage

Short-term & long-term disability

401(k) plan with company match up to 6%

Flexible spending accounts

Employee assistance programs

Generous PTO plan

Responsibilities

Production Support, Operations & Reliability

The ETL Team Lead owns end-to-end operational support for Canal’s existing data stack.

Monitor daily ETL loads across SQL jobs, Guidewire DataHub and InfoCenter (DHIC), and legacy SSIS packages (a minimal monitoring sketch follows the technology list below).

Collaborate with the AMS team to troubleshoot pipeline failures, performance issues, schema mismatches, permissions issues, and cloud resource failures.

Perform root-cause analysis and implement permanent fixes.

Ensure SLA adherence and on-time delivery of critical reporting datasets for scheduled ETL jobs.

Provide direction for both AMS and ETL developers for legacy and current ETL maintenance.

Refactor or retire outdated or redundant ETL processes.

Maintain and improve existing pipelines that utilize the following technologies:

Microsoft SQL Server database programming

T‑SQL scripting

SQL Server Integration Services

Microsoft PowerShell

Guidewire DataHub and InfoCenter

Oracle database programming

Oracle PL/SQL scripting

SAP BusinessObjects Data Services (BODS)

PostgreSQL scripting
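
For illustration only, the sketch below shows one way the daily load monitoring described above could be automated against SQL Server Agent job history, assuming Python with pyodbc; the server name, connection details, and alerting behavior are placeholders rather than Canal's actual configuration.

```python
# Illustrative only: a minimal daily load check against SQL Server Agent job
# history, assuming Python + pyodbc; server/connection details are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-prod;DATABASE=msdb;Trusted_Connection=yes;"  # placeholder server
)

# msdb.dbo.sysjobs / sysjobhistory are standard SQL Server Agent catalog tables;
# step_id = 0 is the overall job outcome row and run_status = 0 means "failed".
QUERY = """
SELECT j.name, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE h.step_id = 0
  AND h.run_status = 0
  AND h.run_date >= CONVERT(int, CONVERT(char(8), GETDATE(), 112))  -- today, yyyymmdd
"""

def failed_etl_jobs_today():
    """Return (job_name, run_date, run_time, message) rows for today's failed jobs."""
    with pyodbc.connect(CONN_STR) as conn:
        return conn.cursor().execute(QUERY).fetchall()

if __name__ == "__main__":
    for job_name, run_date, run_time, message in failed_etl_jobs_today():
        print(f"FAILED: {job_name} ({run_date} {run_time}) - {message}")
```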

Operational Excellence

Assist with the creation and enhancement of operational runbooks, SOPs, monitoring dashboards, and incident response workflows.

Partner with other IT operational segments, business SMEs, and AMS to minimize downtime and meet business SLAs.

Improve existing processes and implement new proactive solutions for daily processing.

Business Continuity

Ensure development support coverage for critical data pipelines (rotation-based).

Support month‑end and quarter‑end financial reporting cycles.

Coordinate production releases and validate deployments.

Become the steady-state technical owner of the entire data operations layer throughout Canal's modernization journey.

Technical Leadership & Collaboration

Serve as technical lead guiding onshore/offshore developers.

Review code, enforce best practices, and mentor junior engineers.

Partner with Scrum Masters, Project Managers, Enterprise Architecture, QA Automation, Change Management, and AMS support teams.

Data Ingestion, ETL/ELT Development & Optimization

Develop reusable ingestion patterns for Guidewire DataHub and InfoCenter, HubSpot, telematics, and other business data sources.

Modernize existing ETL workloads using Delta Lake, Medallion Architecture, and Fabric Lakehouse.

Build scalable data ingestion pipelines using emerging technologies such as Azure Data Factory, Microsoft Fabric, Databricks, and Synapse Pipelines (a minimal ingestion sketch follows this list).

Integrate internal and external data into the platform.
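
As a hedged illustration of the bronze-layer ingestion pattern referenced above, the sketch below assumes a Spark runtime with Delta Lake available (for example Databricks or a Fabric Lakehouse); the landing path and table names are hypothetical.

```python
# Illustrative bronze-layer ingestion sketch; paths and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_ingest").getOrCreate()

# Land raw source extracts as-is into a bronze Delta table, adding lineage
# columns so every row can be traced back to its source file and load time.
raw = (
    spark.read
    .option("header", "true")
    .csv("/landing/datahub/policy/")              # placeholder landing path
    .withColumn("_ingested_at", F.current_timestamp())
    .withColumn("_source_file", F.input_file_name())
)

(
    raw.write
    .format("delta")
    .mode("append")
    .saveAsTable("bronze.policy_raw")             # placeholder table name
)
```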

Real‑Time, Streaming & Event‑Driven Engineering

Design and implement real‑time data pipelines using Event Hub, Fabric Real‑Time Analytics, Databricks Structured Streaming, and KQL‑based event processing.

Enable real-time operational insights and automation, including telematics alerting, first notice of loss (FNOL) automation, and fraud/VNOS/VNOP detection.
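
For context, here is a minimal sketch of the Event Hub-to-Delta streaming pattern described above, using Spark Structured Streaming over Event Hubs' Kafka-compatible endpoint; the namespace, hub name, table, and checkpoint path are placeholders, and the connection string would normally come from a secret store rather than appearing in code.

```python
# Illustrative real-time sketch: read telematics events from an Event Hub via its
# Kafka-compatible endpoint and land them in a Delta table with Structured Streaming.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("telematics_stream").getOrCreate()

EH_NAMESPACE = "canal-eh-namespace"               # placeholder namespace
EH_NAME = "telematics-events"                     # placeholder event hub
EH_CONN_STR = "<event-hub-connection-string>"     # fetch from Key Vault / secret scope

# Event Hubs' Kafka endpoint authenticates with SASL PLAIN using the literal
# username "$ConnectionString" and the connection string as the password.
jaas = (
    "org.apache.kafka.common.security.plain.PlainLoginModule required "
    f'username="$ConnectionString" password="{EH_CONN_STR}";'
)

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", f"{EH_NAMESPACE}.servicebus.windows.net:9093")
    .option("subscribe", EH_NAME)
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config", jaas)
    .load()
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

# Append each micro-batch to a bronze Delta table; the checkpoint makes the
# stream restartable with exactly-once table writes.
(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/telematics_bronze")  # placeholder
    .outputMode("append")
    .toTable("bronze.telematics_events")          # placeholder table
)
```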

Modern Azure Data Stack Leadership

Lead the strategy, design, and engineering of Canal’s modern Azure data ecosystem using next‑generation tools and Medallion Architecture.

Implement Medallion Architecture (Bronze/Silver/Gold) across Fabric Lakehouse, Warehouse, Eventhouse, and KQL Database.

Leverage Delta tables with schema enforcement, ACID compliance, and versioning.
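
To make the Bronze/Silver/Gold and Delta points above concrete, the following hedged sketch promotes cleansed bronze rows into a silver Delta table with an ACID merge and then reads a prior table version; the table and column names are illustrative assumptions, not Canal's actual model.

```python
# Illustrative silver-layer promotion: upsert cleansed bronze rows into a curated
# Delta table and read a prior version; all table/column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("silver_promote").getOrCreate()

# Cleanse bronze rows: drop records without a business key, normalize types,
# and de-duplicate on the key before merging.
updates = (
    spark.table("bronze.policy_raw")
    .filter(F.col("policy_number").isNotNull())
    .withColumn("effective_date", F.to_date("effective_date"))
    .dropDuplicates(["policy_number"])
)

# Delta enforces the target schema on write and runs the MERGE as an ACID
# transaction; mismatched columns fail the job instead of silently corrupting data.
silver = DeltaTable.forName(spark, "silver.policy")
(
    silver.alias("t")
    .merge(updates.alias("s"), "t.policy_number = s.policy_number")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Every write produces a new table version, so prior states remain queryable
# for audits and month-end reconciliation.
previous = spark.read.option("versionAsOf", 0).table("silver.policy")
```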

Data Modeling, Curation & Governance

Develop curated, analytics‑ready datasets to support Power BI, operational reporting, and advanced analytics use cases.

Assist the Canal architect with the implementation of data governance tools.

Establish robust data quality, validation, alerting, and observability frameworks.
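
As one possible shape for the data quality and alerting framework mentioned above, the sketch below runs simple rule checks on a curated table and fails the job when a rule is violated; the table, columns, and rules are illustrative, and real alerting would hang off the scheduler's failure event.

```python
# Illustrative data-quality gate: simple rule checks on a curated table; the job
# fails loudly so orchestration/alerting can react. Names and rules are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.table("silver.policy")                 # placeholder table

# Each check returns a violation count; zero means the rule passed.
checks = {
    "null_policy_numbers": df.filter(F.col("policy_number").isNull()).count(),
    "duplicate_policy_numbers": df.count() - df.dropDuplicates(["policy_number"]).count(),
    "future_effective_dates": df.filter(F.col("effective_date") > F.current_date()).count(),
}

failures = {name: n for name, n in checks.items() if n > 0}
if failures:
    # Raising here fails the scheduled run, which downstream alerting
    # (email, Teams, paging) can pick up from the job status.
    raise ValueError(f"Data quality checks failed: {failures}")
```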

AI/ML Data Enablement (Optional)

Prepare ML‑ready datasets for pricing, risk, fraud detection, underwriting, claims leakage, and predictive insights.
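
Purely as an illustration of what an ML-ready dataset might look like in this context, the sketch below assembles a simple policy-level feature table from hypothetical curated tables; every table and column name is an assumption.

```python
# Illustrative ML-enablement sketch: build a policy-level feature table for a
# risk/claims model from placeholder curated tables.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ml_features").getOrCreate()

policies = spark.table("silver.policy")           # placeholder tables
claims = spark.table("silver.claims")

# Aggregate claims per policy, join back to policy attributes, and fill gaps
# so the result is directly consumable by a training pipeline.
features = (
    claims.groupBy("policy_number")
    .agg(
        F.count("*").alias("claim_count"),
        F.sum("paid_amount").alias("total_paid"),
    )
    .join(
        policies.select("policy_number", "vehicle_count", "effective_date"),
        on="policy_number",
        how="right",
    )
    .fillna({"claim_count": 0, "total_paid": 0.0})
)

features.write.format("delta").mode("overwrite").saveAsTable("gold.risk_features")
```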

Seniority level: Mid-Senior level

Employment type: Full-time

Job function: Engineering and Information Technology
