Skylark Labs
Location: Bay Area (preferred) • US remote considered • Occasional travel to test sites
Team: Robotics & Sensors
Type: Full-time
About the Role
We're looking for a Systems Engineer who lives at the intersection of robotics, sensors, and VLA (Vision-Language-Action) models. You'll architect, integrate, and harden multi-sensor robotic systems, then wire them to VLA policies so robots can understand instructions, plan, and act safely in the real world.
What You’ll Do
Own system integration across sensors (EO/IR cameras, LiDAR, radar, audio, tactile), compute (Jetson/RTX), and platforms (UAV/UGV/static towers).
Build reliable robotics stacks: ROS 2 nodes, real-time pipelines, health monitoring, logging/telemetry, OTA updates.
Interface with autonomy: connect perception + state estimation to planners/controllers; ensure timing, synchronization, and fail-safes.
VLA enablement: integrate and evaluate VLA/LLM-based policies for instruction following, task decomposition, and grounding to robot actions.
Simulation to field: prototype in Isaac Sim/Gazebo; drive flight/ground tests; close the sim-to-real gap with calibration and data feedback loops.
Sensor fusion & calibration: time sync (PTP), extrinsics/intrinsics, multi-modal fusion, target tracking.
Quality & safety: bring-up checklists, watchdogs, E-stop and geofencing, regression tests, and performance benchmarking.
Cross-functional: collaborate with ML, controls, and hardware teams; document designs and handoffs.
What You’ll Bring
Robotics systems experience shipping real hardware (UAVs, UGVs, or fixed installations).
Sensors: hands-on with at least two of RGB/thermal cameras, LiDAR, radar, microphone arrays, and IMU/GNSS; calibration & synchronization expertise.
Software: ROS 2, C++ and/or Python; Linux; git; containerization.
Perception/ML familiarity: PyTorch or similar; comfortable deploying models at the edge (Jetson); basic CUDA awareness.
Controls/autonomy basics: planners, controllers, safety states, telemetry.
VLA exposure: grounding language to action, policy evaluation on robots, prompt/skill graphs, or task planners that translate intents to robot APIs.
Excellent debugging chops with oscilloscopes/loggers, frame timing, and field testing under constraints.
Nice to Have
PX4/ArduPilot, MAVLink, DJI OSDK; Unitree/AgileX; MoveIt/Navigation2.
Multi‑sensor fusion (Kalman/UKF), radar processing, thermal pipelines.
Real‑time Linux, RT scheduling, low‑latency networking (DDS, ZeroMQ).
Simulation: NVIDIA Isaac Sim, Gazebo, CARLA; domain randomization.
Data engines & ops: on‑device logging, ROS bags, label/feedback loops.
Safety cases, checklists, and regulatory/test documentation.
Experience with VLA/VLM stacks (e.g., OpenVLA‑style policies, LLM planners, behavior trees, imitation/RL for skill learning).
Our Tech Stack
ROS 2 • C++/Python • PyTorch • Jetson/RTX • CUDA • PX4/MAVLink • Isaac Sim/Gazebo • LiDAR/Radar/EO-IR • DDS