AVP AI and Data Engineering - Employee Benefits
The Hartford
Remote type: Hybrid | Locations: Hartford, CT; Chicago, IL; Charlotte, NC | Time type: Full time | Posted: 30+ days ago | Job requisition ID: R2521055
We’re committed to making a difference and are proud to be an insurance company that goes beyond coverages and policies. Working here offers opportunities to achieve your goals and help others do the same. Join us as we shape the future.
Join The Hartford Insurance as we lead the AI transformation in our industry!
We seek a skilled leader, Assistant Vice President (AVP) of AI and Data Engineering, to join our Employee Benefits data team. The ideal candidate has experience in developing scalable data environments, launching data products, enabling AI/ML solutions, and shaping data analytics strategies. The AVP will lead the development and implementation of advanced AI and data engineering solutions to drive business transformation and improve customer experiences.
This role requires strategic thinking, expertise in data engineering, knowledge of AI technologies, and leadership in cross-functional teams. Candidates should be familiar with real-time data streaming, agentic frameworks, Data APIs, vector stores, and RAG architectures, with a track record of enabling self-serve analytics and AI use cases.
The AVP will promote a data-driven culture and AI adoption to enhance our data platform and deliver value to members. They will lead transformational initiatives and guide the adoption of innovative technologies to achieve better business outcomes through automation and AI. This is a strategic leadership opportunity to shape The Hartford’s future.
This role offers Hybrid or Remote work arrangements. Candidates near our office locations are expected to work in-office three days a week (Tuesday through Thursday). Remote candidates must be able to visit an office as needed. Candidates must be eligible to work in the US without requiring sponsorship now or in the future.
Key Responsibilities
- Leadership & Strategy: Lead enterprise-wide data strategy aligned with AI-driven transformation. Oversee data architecture, develop scalable data consumption patterns, and promote innovation in AI solutions. Communicate strategy and progress to stakeholders and showcase data capabilities through thought leadership.
- Data Modernization: Develop a roadmap to modernize legacy data platforms to strategic cloud platforms. Address current data complexities and enable data products for diverse stakeholders.
- Real-Time Data Streaming: Design and manage scalable real-time data pipelines for ingestion, processing, and delivery. Extract insights from unstructured data and integrate with structured sources for comprehensive analytics.
- Third-Party Data Exchange: Collaborate with Technology teams to implement data exchange strategies with Third-Party Administrators (TPAs).
- Team Leadership & Development: Build, mentor, and lead a high-performing team of data engineers and architects.
- Drive Efficiency & Productivity: Identify and implement solutions to enhance developer productivity, including AI-driven auto-generation of ETL pipelines, advanced DevOps practices, and automated data quality frameworks.
- Data Infrastructure: Oversee design, development, and maintenance of data pipelines, warehouses, and lakes.
- AI/ML Enablement: Architect data solutions for AI and machine learning models to optimize operations and enhance customer experience.
- Collaboration & Communication: Partner with stakeholders to translate data needs into technical requirements.
- Technology Evaluation & Adoption: Stay current with trends in data engineering and AI/ML, recommending innovative tools and technologies.
- Data Governance & Compliance: Implement data management frameworks ensuring data governance and quality, enabling effective data access and use.
- Budget Management: Manage the AI Data Engineering budget.
- Performance Monitoring & Optimization: Monitor and improve data pipeline and infrastructure performance.
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- 15+ years in data engineering, designing large-scale ecosystems.
- At least 3 years in leadership roles managing teams and initiatives.
- Excellent communication skills for technical and non-technical audiences.
- Deep expertise in data architecture, warehouses, lakes, and cloud tech.
- Experience with ML integration on cloud platforms like AWS SageMaker, GCP Vertex AI.
- Technical skills in LLMs, Generative AI, Prompt engineering, RAG architectures, Vector stores, AI agents, APIs.
- Experience with model grounding and hallucination mitigation.
- Hands-on with GCP, Cloud AI, Vertex AI, BigQuery, LangChain, AI agents, and hybrid data lake-houses.
- Strong communication skills to explain AI/ML concepts to leaders.
- Knowledge of machine learning algorithms, computer vision, NLP, multimodal models, and content summarization.
- Experience with cloud tech stacks across AWS and GCP, preferably in financial services or insurance.
Compensation
The annual base pay range is $182,000 - $273,000, based on market analysis. Actual pay varies based on performance, skills, and competencies. Compensation includes bonuses, incentives, and recognition.
Equal Opportunity Employer/Sex/Race/Color/Veterans/Disability/Sexual Orientation/Gender Identity/Religion/Age
Learn more about us, our culture, diversity, and benefits.