GEICO
Staff Engineer – Data Lakehouse Platform
Base Pay Range $115,000 – $230,000 per year
Company Culture At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities.
Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose.
When you join our company, we want you to feel valued, supported and proud to work here. That’s why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.
Position Summary GEICO is seeking an experienced engineer with a passion for building high-performance, low-maintenance, zero-downtime platforms and core data infrastructure. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission, while co-creating a culture of psychological safety and continuous improvement.
Position Description Our Staff Engineer is a key member of the engineering staff, working across the organization to innovate and bring the best open-source data infrastructure and practices into GEICO as we embark on a greenfield project to implement a core Data Lakehouse serving data use cases across each of the company’s business verticals.
Position Responsibilities
Scope, design, and build scalable, resilient Data Lakehouse components
Lead architecture sessions and reviews with peers and leadership
Be accountable for the quality, usability, and performance of the solutions
Spearhead new software evaluations and innovate with new tooling
Determine and support resource requirements, evaluate operational processes, measure outcomes to ensure desired results, and demonstrate adaptability while sponsoring continuous learning
Collaborate with customers, team members, and other engineering teams to solve our toughest problems
Be a role model and mentor, helping to coach and strengthen the technical expertise and know‑how of our engineering community
Consistently share best practices and improve processes within and across teams
Share your passion for staying on top of the latest open-source projects, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
Qualifications
Exemplary ability to design, develop, and run experiments
Experience developing new and enhancing existing open-source-based Data Lakehouse platform components
Experience cultivating relationships with and contributing to open-source software projects
Experience with:
Apache Superset for data visualization and business intelligence
Jupyter Notebook for data science and machine learning development
Cloud computing (AWS, Microsoft Azure, Google Cloud, Hybrid Cloud, or equivalent)
Expertise in developing large‑scale distributed systems that are scalable, resilient, and highly available, with a focus on:
Designing and implementing systems that can handle high traffic and large data volumes
Ensuring system reliability, uptime, and performance in complex environments
Expertise in container technologies such as Docker, and in Kubernetes platform development
Experience with continuous delivery and infrastructure as code
In‑depth knowledge of DevOps concepts and cloud architecture
Experience with Azure networking (subscriptions, security zoning, etc.) or equivalent
Desirable:
Experience with ML Ops pipeline development and management, including:
Designing and implementing data pipelines for machine learning workflows
Ensuring data quality, integrity, and security in ML pipelines
Monitoring and optimizing ML pipeline performance and efficiency
Experience working with Large Language Models (LLMs) to create agentic systems, including:
Integrating LLMs with data lakehouse platforms and other systems
Developing and deploying agentic models and workflows
Ensuring model performance, reliability, and security in production environments
Ability to excel in a fast‑paced, startup‑like environment
Experience
6+ years of professional experience in data software development, programming languages, and big data technologies
4+ years of experience in open-source development or large-scale distributed systems
3+ years of experience with architecture and design
3+ years of experience with AWS, GCP, Azure, or another cloud service
Education
Bachelor’s degree in Computer Science, Information Systems, or Mathematics, or equivalent education or work experience
Seniority Level Mid‑Senior level
Employment Type Full‑time
Job Function Engineering and Information Technology
Industries Insurance
Equal Employment Opportunity Statement The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled.
GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.