Software Engineer, Perception & Sensors
About Amidon Heavy Industries
Today, most subsea operations depend on large crewed vessels. They are expensive and slow to mobilize, which is why infrastructure is rarely inspected until something fails.
Amidon Heavy Industries builds autonomous systems for offshore work. Our first systems use uncrewed surface vessels paired with ROVs to inspect pipelines, subsea cables, and offshore assets without ship mobilization. Inspection is the starting point. The long-term goal is a ship‑free operating model for offshore work that expands into monitoring, survey, and intervention over time.
The Role
We are hiring a Software Engineer, Perception & Sensors to own the sensing stack end to end, from sensor selection through delivered data product. Our vessels exist to gather high‑fidelity real‑world data from surface to seafloor. This role is responsible for choosing the right sensors, understanding how they behave in real maritime environments, and ensuring the data they produce is meaningful, interpretable, and fit for customer use.
Responsibilities
Own the sensing stack end to end, from sensor evaluation and selection through perception outputs and delivered data products
Evaluate, select, and validate sonar, camera and video, radar, and navigation sensors based on real‑world performance rather than datasheets
Characterize sensor behavior in operational maritime conditions, including noise, bias, resolution limits, latency, and failure modes
Design and implement calibration, synchronization, and validation workflows to ensure data is accurate, repeatable, and comparable over time
Develop perception and sensor fusion approaches that transform raw sensor signals into usable representations of the environment
Define and enforce clear standards for acceptable data quality across sensing modalities
Build tooling to visualize, replay, inspect, and audit raw and processed sensor data
Diagnose sensing and perception failures during field operations and drive fixes at the sensor, software, or system level
Partner with autonomy, systems, and downstream users to ensure perception outputs translate into a usable, trustworthy data product
Qualifications
Experience owning sensing or perception systems where the output data itself was the product
Strong C++ and Python skills in a Linux environment
Hands‑on experience working with real sensors, including sonar, vision systems, radar, GNSS, and IMUs
Deep understanding of sensor physics, noise characteristics, calibration challenges, and real‑world failure modes
Experience turning raw sensor data into usable representations for downstream systems or customers
Comfort making judgment calls about data trustworthiness under imperfect, noisy, or ambiguous conditions
Bonus Points
Experience with sonar and acoustic sensing in subsea or maritime environments
Background in field robotics, survey systems, inspection platforms, or remote sensing
Experience delivering sensor‑derived data products used for operational or commercial decision‑making
Familiarity with edge compute constraints and high‑bandwidth sensing systems
Experience taking sensing and perception systems from prototype to commercial deployment