Wardstone (YC F25)

Sensing & Perception Engineer

Wardstone (YC F25), Los Angeles, California, United States, 90079


Overview

We are looking for a Sensing and Perception Engineer to own Wardstone’s entire sensing system, from sensor selection and integration through low‑level image pipelines to real‑time target tracking. You will work at the intersection of sensor hardware, computer vision algorithms, embedded systems, optics, and vehicle guidance. This role is central to Wardstone’s mission: your system determines whether an interceptor knows where to go. You will work directly with the founders, build test rigs, run high‑speed experiments, and ship perception systems into hardware that performs real intercepts.

Responsibilities

Integrate sensor modules into avionics architecture.

Own sensor selection, field‑of‑view analysis, lens selection, non‑uniformity correction (NUC), calibration workflows, and driver integration.

Develop high‑reliability camera interfaces: SPI/I2C, MIPI, USB3, Ethernet, GMSL, or custom buses.

Build perception pipelines capable of tracking high‑speed objects with extremely low latency.

Implement target detection, centroid extraction, filtering, gating, and motion‑compensation algorithms.

Develop robust tracking under noise, flare, thermal clutter, high dynamic range, and glint effects.

Fuse IR/EO data with IMU, GPS, and onboard state estimators.

Work with controls engineers to build perception‑to‑guidance interfaces for closed‑loop control.

Build EKF/UKF filters, temporal alignment, and prediction modules.

Design sensor test rigs: hot/cold targets, blackbody references, collimators, high‑G mounts.

Conduct hardware‑in‑the‑loop tests with simulated targets, scene generators, and flight dynamics models.

Collect, label, and analyze high‑speed footage from ground and flight experiments, high‑altitude balloon (HAB) flights, and tracking tests.

Stress‑test perception under temperature, vibration, and shock conditions.

Required Qualifications

B.S. in EE, CS, Physics, Robotics, or similar.

Strong experience with IR/EO camera systems, radar, imaging physics, calibration, and noise modeling.

Deep proficiency in computer vision (OpenCV, NumPy, SciPy, C++/Python).

Experience implementing real‑time detection and tracking algorithms.

Understanding of signal processing, filtering, sensor fusion, and coordinate transforms.

Hands‑on testing experience using thermal cameras, radar, calibration targets, or optical equipment.

US citizenship required.

Nice to Haves

Experience working with LWIR sensors (FLIR Boson/Tau, ULIS, Teledyne, etc.) or radar units.

Background in missile seekers, UAV tracking, air‑to‑air systems, or high‑speed autonomous robotics.

Experience with GPU acceleration (CUDA, TensorRT), embedded systems, or FPGA‑based vision pipelines.

Familiarity with optical modeling, lens design, radiometry, and atmospheric attenuation.

Knowledge of IR scene modeling, signature tracking, or synthetic target generation.

Experience with high‑dynamic‑range thermal environments or high‑G mount design.

Exposure to defense/aerospace sensor standards or MIL‑STD environments.

Seniority Level: Entry level

Employment Type: Full-time

Job Function: Engineering and Information Technology

Industries: Defense and Space Manufacturing
