Wardstone (YC F25)
Sensing & Perception Engineer
Wardstone (YC F25), San Francisco, California, United States, 94199
This range is provided by Wardstone (YC F25). Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.
Base pay range: $100,000.00/yr - $160,000.00/yr
What You’ll Do
IR & EO Sensor Integration
Integrate sensor modules into avionics architecture.
Own sensor selection, field‑of‑view (FOV) analysis, lensing, non‑uniformity correction (NUC), calibration workflows, and driver integration.
Develop high-reliability camera interfaces: SPI/I2C, MIPI, USB3, Ethernet, GMSL, or custom buses.
Real‑Time Computer Vision & Tracking
Build perception pipelines capable of tracking high-speed objects with extremely low latency.
Implement target detection, centroid extraction, filtering, gating, and motion‑compensation algorithms.
Develop robust tracking under noise, flare, thermal clutter, high dynamic range, and glint effects.
Sensor Fusion & Estimation
Fuse IR/EO data with IMU, GPS, and onboard state estimators.
Work with controls engineers to build perception‑to‑guidance interfaces for closed‑loop control.
Build EKF/UKF filters, temporal alignment, and prediction modules.
Verification & Validation
Design sensor test rigs: hot/cold targets, blackbody references, collimators, high‑G mounts.
Conduct hardware‑in‑the‑loop tests with simulated targets, scene generators, and flight dynamics models.
Collect, label, and analyze high‑speed footage from ground and flight experiments, high‑altitude balloon (HAB) flights, and tracking tests.
Stress‑test perception under temperature, vibration, and shock conditions.
Required Qualifications
B.S. in EE, CS, Physics, Robotics, or similar.
Strong experience with IR/EO camera systems, radar, imaging physics, calibration, and noise modeling.
Deep proficiency in computer vision (OpenCV, NumPy, SciPy, C++/Python).
Experience implementing real‑time detection and tracking algorithms.
Understanding of signal processing, filtering, sensor fusion, and coordinate transforms.
Hands‑on testing experience using thermal cameras, radar, calibration targets, or optical equipment.
U.S. citizenship required.
Nice to Haves
Experience working with LWIR sensors (FLIR Boson/Tau, ULIS, Teledyne, etc.) or radar units.
Background in missile seekers, UAV tracking, air‑to‑air systems, or high‑speed autonomous robotics.
Experience with GPU acceleration (CUDA, TensorRT), embedded systems, or FPGA‑based vision pipelines.
Familiarity with optical modeling, lens design, radiometry, and atmospheric attenuation.
Knowledge of IR scene modeling, signature tracking, or synthetic target generation.
Experience with high‑dynamic‑range thermal environments or high‑G mount design.
Exposure to defense/aerospace sensor standards or MIL‑STD environments.
Seniority Level: Entry level
Employment Type: Full‑time
Job Function: Engineering and Information Technology