Overland AI
About Overland AI
Founded in 2022 and headquartered in Seattle, Washington, Overland AI is transforming land operations for modern defense. The company draws on more than a decade of advanced research in robotics and machine learning, along with a field-test-forward ethos, to deliver combined capabilities for unit commanders. Our OverDrive autonomy stack enables ground vehicles to navigate and operate off-road in any terrain without GPS or direct operator control, and our intuitive OverWatch C2 interface provides commanders with the precise coordination capabilities essential for mission success.
Overland AI has secured funding from prominent defense tech investors, including 8VC and Point72, and has built trusted partnerships with DARPA, the U.S. Army, the Marine Corps, and Special Operations Command. Backed by eight-figure contracts across the Department of Defense, we are strengthening national security by iterating closely with end users engaged in tactical operations.
Role Summary
Overland AI is hiring a Test Data Analyst to develop and maintain a first-order, data-driven understanding of how our autonomous vehicles behave in real-world testing. This role sits within the Systems, Safety, and Test (SST) organization and partners closely with software, hardware, and test teams to turn daily field test outputs into reliable insight that improves autonomy performance, safety, and system maturity.
This role centers on deep, hands-on analysis of field test data. You will spend your time immersed in autonomy runs, synchronized logs, ROS MCAPs, sensor outputs, and recorded test video, building intuition for system behavior by repeatedly reviewing the same routes and scenarios over time. This sustained exposure and consistent analysis enable you not only to annotate and tag data but also to identify subtle patterns, regressions, and improvements that are not visible through metrics alone.
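For illustration only, the sketch below shows one way a reviewer might pull per-topic message timestamps out of a ROS MCAP log as a starting point for annotation. It assumes the open-source `mcap` Python reader package and uses a hypothetical file name and topic; the posting does not prescribe any particular tooling.

```python
# Minimal sketch, assuming the open-source `mcap` package (pip install mcap).
# The file name and topic below are hypothetical placeholders.
from mcap.reader import make_reader

def message_times(path: str, topic: str):
    """Yield (log_time_ns, topic) for each message on one topic in an MCAP log."""
    with open(path, "rb") as f:
        reader = make_reader(f)
        for schema, channel, message in reader.iter_messages(topics=[topic]):
            yield message.log_time, channel.topic

if __name__ == "__main__":
    # Print nanosecond timestamps for odometry messages in one recorded run.
    for t_ns, topic in message_times("example_run.mcap", "/odom"):
        print(t_ns, topic)
```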
You will be embedded in the test workflow, translating observed behavior into structured datasets, high-quality issue reports, and clear test summaries. Your work forms the factual record of system behavior that engineering, leadership, and customers rely on to assess readiness and risk in demanding defense environments.
This role sits at the intersection of autonomy testing, data analysis, and systems thinking, with a strong emphasis on accuracy, traceability, and clarity over speed.
Key Responsibilities
Primary Responsibility: Field Test Data Review & Behavior Analysis
Perform deep review of autonomy field test data, including synchronized video, ROS MCAPs, telemetry, and sensor outputs
Build strong familiarity with system behavior by analyzing repeated routes and scenarios across changing software and hardware configurations
Annotate autonomy behavior, anomalies, and decision-making moments with precise timestamps and contextual notes
Identify subtle deviations, trends, and regressions that emerge through longitudinal analysis rather than single test runs
Issue Identification, Trends & Root Cause Insight
Identify, classify, and document hardware, software, and system-level behaviors observed during autonomy testing
Own the quality of issue reporting by producing, reviewing, and enriching bug reports with clear context, timestamps, and supporting evidence
Track and trend system behavior across repeated routes, environments, and software/hardware releases to identify regressions and improvements (see the sketch after this list)
Analyze recurring anomalies (e.g., odometry stability, localization consistency, planner decisions) using longitudinal test data
Perform structured analysis to identify contributing factors across autonomy software, vehicle systems, sensing, and operations
Support issue prioritization by providing data-backed context that distinguishes isolated events from systemic risk
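As a hedged illustration of the kind of longitudinal trending described above, the sketch below aggregates a per-run anomaly count by route and software release using pandas. The column names, routes, and values are hypothetical; the posting does not define any data schema.

```python
import pandas as pd

# Hypothetical per-run records; each row is one field-test run.
runs = pd.DataFrame({
    "route": ["ridge_loop", "ridge_loop", "ridge_loop", "creek_crossing", "creek_crossing"],
    "software_release": ["v1.2", "v1.3", "v1.4", "v1.3", "v1.4"],
    "localization_dropouts": [1, 1, 4, 0, 2],
})

# Average anomaly count per route and release; a jump between consecutive
# releases on the same route is a candidate regression worth a closer look.
trend = (
    runs.groupby(["route", "software_release"])["localization_dropouts"]
        .mean()
        .unstack("software_release")
)
print(trend)
```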
Test Reporting & Evidence Development
Generate clear, structured test summaries that synthesize large volumes of data into conclusions and recommendations
Contribute traceable evidence to support hazard analysis, validation activities, and future certification efforts
Help define repeatable standards and formats for test reporting as the organization scales
Data Visibility & Communication
Transform raw test data and analysis into visual, consumable artifacts for engineers, operators, and leadership
Create clear plots, summaries, timelines, and annotated media that communicate system behavior and test outcomes (a minimal timeline example follows this list)
Support shared understanding of system performance, risk, and maturity across technical and non‑technical audiences
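Purely as an illustration of turning annotated events into a consumable artifact, the matplotlib sketch below draws a simple run timeline with labeled events. The events and their times are invented placeholders, not data from any actual test.

```python
import matplotlib.pyplot as plt

# Hypothetical annotated events from one run (seconds since run start).
events = [
    (35.2, "planner replan"),
    (121.8, "localization jump"),
    (304.5, "manual intervention"),
]

fig, ax = plt.subplots(figsize=(8, 2))
for t, label in events:
    ax.axvline(t, color="tab:red", linestyle="--")
    ax.text(t, 0.5, label, rotation=90, va="center", ha="right", fontsize=8)
ax.set_xlim(0, 360)
ax.set_yticks([])
ax.set_xlabel("Run time (s)")
ax.set_title("Annotated events along one field-test run")
fig.tight_layout()
fig.savefig("run_timeline.png")
```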
What You’ll Need to Succeed
Bachelor’s degree in a technical field (Engineering, Computer Science, Applied Math, Physics, Data Science, or similar) or equivalent practical experience
2–5 years of experience analyzing data from complex, real-world systems
Experience working with sensor-rich or operational data from domains such as autonomy, robotics, automotive, aerospace, defense, or similar environments
Comfort working with autonomy and robotics data artifacts, including logs, telemetry, ROS recordings (e.g., MCAPs), and sensor outputs
Strong analytical skills and the discipline to methodically work through large volumes of real‑world test data
Ability to reason about autonomous system behavior across perception, planning, control, and vehicle interfaces
High attention to detail with a bias toward accuracy, traceability, and completeness
Clear written communication skills for producing bug reports, analyses, and test summaries
Working proficiency with Python or similar tools for data analysis and lightweight automation
Comfort operating in fast‑paced, field‑forward development environments
What Will Set You Apart
Experience working directly with autonomous vehicle sensor data, including LiDAR, radar, and camera streams
Demonstrated ability to synthesize ambiguous or incomplete datasets into clear, defensible conclusions
Experience analyzing long-duration test data or reviewing extensive test video to identify subtle system behaviors
Familiarity with systems engineering concepts such as Operational Design Domains (ODDs), duty cycles, or performance requirements
Exposure to safety analysis, certification activities, or formal verification and validation workflows
Experience producing structured test reports or evidence packages for external or regulated stakeholders
Strong data visualization skills and the ability to communicate technical results clearly to both technical and non-technical audiences
Location
The preferred location for this position is onsite in Seattle, WA.
Compensation
Annual Base Pay: $95,000 – $120,000 USD
Benefits
Equity compensation
Best‑in‑class healthcare, dental, and vision plans
Unlimited PTO
401(k) with company match
Parental leave