Blue Water Autonomy
We are an early-stage, maritime defense technology startup backed by top tier investors. Our team is designing and building autonomous and highly producible ships that can serve multiple missions on the open ocean.
The Role
We are seeking a Lead Perception Engineer based out of our Lexington, MA headquarters, with hybrid/field-based options, to architect, implement, and operationalize the end-to-end perception, sensor ingestion, and sensor fusion pipeline for autonomous maritime navigation. This is a foundational, hands-on technical leadership role. You will personally build the initial perception and data ingestion stack while establishing the technical direction and standards for a future perception team. The ideal candidate is equally comfortable designing system architecture, writing performance-critical code, and working in the field to ensure sensors and data pipelines operate reliably in real conditions. You will work closely with hardware, autonomy, and systems engineering teams to ensure sensor choices, data ingestion, and perception outputs scale with the company's roadmap.
Autonomy at sea introduces unique perception challenges: long-range sensing, dense and ambiguous scenes, dynamic backgrounds, adverse weather, and limited infrastructure. Solving these problems requires engineers who can build robust, production-ready perception systems, not just prototypes.
U.S. Person status is required, as this position requires access to export-controlled data.
What You'll Do
Design and implement the full perception stack for autonomous maritime navigation, including:
Sensor fusion across vision, navigation sensors, and future modalities
Interfaces to planning, control, and autonomy systems
Reliable, time-synchronized ingestion of high-bandwidth sensor streams
Scalable abstractions that support evolving sensor configurations
Data formats and APIs that enable downstream perception, logging, and replay (see the sketch after this list)
Define system-level requirements, performance metrics, and validation strategies for perception and sensor ingestion in maritime environments.
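For a concrete flavor of the time-synchronized ingestion and alignment work described above, here is a minimal Python sketch of a hypothetical timestamped record type and a per-stream buffer that matches samples by nearest timestamp. All names, structures, and tolerances are illustrative assumptions, not the actual stack.

```python
# Illustrative only: a hypothetical, simplified view of time-synchronized
# sensor ingestion and nearest-timestamp alignment for one sensor stream.
from dataclasses import dataclass, field
from bisect import bisect_left, insort
from typing import Any, Optional


@dataclass(order=True)
class SensorRecord:
    timestamp_ns: int                        # monotonic capture time in nanoseconds
    sensor_id: str = field(compare=False)    # e.g. "cam_bow", "gnss"
    payload: Any = field(compare=False)      # e.g. image buffer, nav solution


class StreamBuffer:
    """Keeps a time-ordered buffer of records for one sensor stream."""

    def __init__(self, sensor_id: str, max_records: int = 1000):
        self.sensor_id = sensor_id
        self.max_records = max_records
        self._records: list[SensorRecord] = []

    def push(self, record: SensorRecord) -> None:
        # insort keeps the buffer sorted even if a stream delivers
        # slightly out-of-order samples.
        insort(self._records, record)
        if len(self._records) > self.max_records:
            self._records.pop(0)

    def nearest(self, timestamp_ns: int, tolerance_ns: int) -> Optional[SensorRecord]:
        """Return the record closest to timestamp_ns, if within tolerance."""
        if not self._records:
            return None
        probe = SensorRecord(timestamp_ns, self.sensor_id, None)
        idx = bisect_left(self._records, probe)
        candidates = self._records[max(0, idx - 1): idx + 1]
        best = min(candidates, key=lambda r: abs(r.timestamp_ns - timestamp_ns))
        return best if abs(best.timestamp_ns - timestamp_ns) <= tolerance_ns else None
```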
Hands-On Development
Build production-quality perception and ingestion pipelines in modern C++ and Python, from raw sensor input through onboard runtime deployment.
Develop and deploy machine learning–based perception models, including optimization for onboard and embedded compute.
Where appropriate, implement GPU-accelerated components (e.g., CUDA) to meet latency, throughput, and power constraints.
Own dataset curation, model evaluation, regression testing, and continuous performance improvement.
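As a hypothetical illustration of the evaluation and regression-testing loop above, the sketch below compares a candidate model's metrics against a stored baseline and reports any metric that regresses beyond a tolerance. The JSON layout, metric names, and tolerance are assumptions for illustration only.

```python
# Illustrative regression gate: compare a candidate model's evaluation
# metrics against a stored baseline and flag regressions beyond a tolerance.
import json
from pathlib import Path


def check_regression(baseline_path: Path, candidate_path: Path,
                     tolerance: float = 0.01) -> list[str]:
    """Return descriptions of metrics that regressed by more than `tolerance`."""
    baseline = json.loads(baseline_path.read_text())   # e.g. {"vessel_ap": 0.82, ...}
    candidate = json.loads(candidate_path.read_text())

    regressions = []
    for metric, base_value in baseline.items():
        new_value = candidate.get(metric)
        if new_value is None or new_value < base_value - tolerance:
            regressions.append(
                f"{metric}: baseline={base_value:.3f}, candidate={new_value}"
            )
    return regressions


if __name__ == "__main__":
    failed = check_regression(Path("baseline_metrics.json"),
                              Path("candidate_metrics.json"))
    if failed:
        raise SystemExit("Regressions detected:\n" + "\n".join(failed))
    print("No regressions beyond tolerance.")
```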
Data Collection & Field Operations
Lead and participate directly in field testing, sensor bring-up, and data collection on vessels.
Design and execute sensor calibration, synchronization, and validation procedures.
Build robust logging, replay, and annotation workflows to ensure high-quality training and evaluation data (see the replay sketch after this list).
Work hands-on in real operating conditions to validate both sensors and ingestion pipelines.
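To illustrate the logging and replay workflows above, here is a small hypothetical sketch that replays timestamped records from a JSON-lines log in capture order, optionally paced by the recorded intervals. The log format and field names are assumptions; a production system would likely use a binary, schema-versioned format.

```python
# Illustrative replay loop over a JSON-lines log of timestamped sensor records.
import json
import time
from pathlib import Path
from typing import Callable


def replay_log(log_path: Path, handler: Callable[[dict], None],
               realtime: bool = True) -> None:
    """Feed logged records to `handler` in capture order, optionally paced."""
    prev_ts = None
    with log_path.open() as f:
        for line in f:
            record = json.loads(line)   # e.g. {"timestamp_ns": ..., "sensor_id": ..., ...}
            ts = record["timestamp_ns"]
            if realtime and prev_ts is not None:
                time.sleep(max(0.0, (ts - prev_ts) / 1e9))
            prev_ts = ts
            handler(record)


if __name__ == "__main__":
    replay_log(Path("field_test.log"), handler=print, realtime=False)
```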
Hardware & Cross-Functional Integration
Partner closely with Mechanical, Electrical, and Systems Engineering teams to:
Define camera and sensor requirements
Influence sensor selection, placement, and mounting
Ensure ingestion pipelines reflect real hardware constraints and failure modes
Collaborate with autonomy and controls engineers to define perception outputs that enable safe and reliable navigation decisions.
Technical Leadership
Serve as the technical owner of perception and sensor ingestion within the autonomy organization.
Establish engineering best practices, coding standards, and architectural patterns.
Mentor and help hire future perception engineers as the team grows.
Who You Are
7+ years of experience building perception systems for robotics, autonomous vehicles, or complex real-world systems.
Deep expertise in computer vision and perception, including detection, tracking, and sensor fusion.
Strong proficiency in modern C++ and Python for production systems.
Experience designing sensor data ingestion pipelines for high-throughput, time-sensitive systems.
Demonstrated ability to work across software, hardware, and systems boundaries.
Comfortable working hands-on with sensors, prototypes, and field-deployed systems.
Clear technical communication and strong ownership from concept through deployment.
Nice To Haves
Experience with GPU programming and acceleration (e.g., CUDA).
Experience optimizing perception or ingestion pipelines for real-time or embedded constraints.
Prior technical leadership of perception or robotics teams.
Experience designing perception systems from a blank slate through operational deployment.
Background in robotics or autonomy companies with strong hardware–software integration.
Experience operating in unstructured outdoor environments (marine, off-road, industrial).
What We Offer
Incredibly high-caliber teammates. You’ll work directly with our co-founders Rylan, Scott, and Austin.
A fast-paced, creative working environment that offers a lot of room for ownership and growth.
Opportunity to join a meaningful mission that protects America and our democracy.
Expected Salary Range: $215,000-$250,000 annual base salary. Final compensation will depend on experience and skill level.
Startup equity options
Generous PTO, medical, dental, and vision coverage
We are an equal opportunity employer. All hiring is contingent on eligibility to work in the United States. We are unable to sponsor or transfer visas for applicants.