Job Description
Principal AI/ML Perception Engineer
Build the Intelligence Behind Autonomous Defense Systems!
Austin, TX | Onsite
Opportunity Summary
A well-funded, venture-backed startup is developing compact, intelligent robotic systems to defend against high-speed aerial threats. As one of the first engineering hires, you'll take full ownership of the autonomy and perception pipeline, working side by side with the technical leadership team to bring a real-world, real-time system to life. This is a zero-to-one role: designing, implementing, and validating machine learning and perception capabilities from the ground up. Your work will go beyond simulations and academic models; you'll deploy your code onto rugged field hardware and see it operate under real operational conditions. This is a rare opportunity to help define the brain of an autonomous defense platform at its earliest stage.
About Us
We are building cutting-edge robotic defense systems that operate autonomously in chaotic, high-threat environments. Our team combines experience in defense systems, autonomy, and machine learning. We're tackling the toughest challenges in perception, control, and systems integration to deliver hardware that works in any conditions. Our pace is fast, our expectations are high, and our mission is urgent.
Job Duties
Design and implement visual and multi-sensor perception pipelines optimized for speed, accuracy, and reliability
Build and deploy computer vision and object detection models capable of running in compute-constrained embedded environments
Develop audio-enhanced sensing systems for drone detection and tracking using microphone arrays or acoustic sensors
Lead data collection, labeling, and training workflows to support model validation across diverse field scenarios
Architect the onboard fusion of camera, IMU, and audio data for robust state estimation and environmental awareness
Bring up and integrate sensor hardware including cameras, IMUs, and audio inputs with custom drivers and embedded code
Prototype and test your work in both controlled and high-noise outdoor environments, iterating quickly with field results
Collaborate with controls, embedded, and hardware engineers to ensure full system-level functionality and reliability
Create internal tools and scripts for logging, replay, visualization, and real-time diagnostics
Help define the perception roadmap and influence future product direction
Qualifications
3+ years of experience building ML/CV systems for robotics, autonomy, or embedded platforms
Proven track record deploying models into latency- and resource-constrained environments
Fluency in Python and C++ for both prototyping and embedded development
Strong background in computer vision, including object detection, classification, and tracking
Experience with real-time sensor fusion using camera and IMU data (EKF/UKF or equivalent)
Exposure to audio-based sensing or signal processing in robotics or autonomy applications
Experience with embedded inference platforms such as NVIDIA Jetson, Coral, or similar
Hands-on mindset and comfort bringing up sensors, debugging drivers, and tuning systems on hardware
Preferred Experience
Prior work on perception systems in unmanned vehicles, cUAS, robotics, or defense technology
Familiarity with edge inference optimization (e.g., model pruning, quantization, TensorRT)
Knowledge of acoustic localization, audio triangulation, or noise filtering for detection tasks
Experience working on fast iteration teams in early-stage or startup environments
Background in safety-critical or real-time embedded systems
Why Join Us
Founding engineer position with full-stack ownership of perception and autonomy
Real-world product: your work moves from code to field in days, not months
High-trust, engineering-led environment focused on solving hard problems fast
Ability to shape the long-term vision, roadmap, and technical culture
Build a system that directly protects people and infrastructure in the real world
Early equity
Healthcare coverage (medical, dental, vision)
Relocation support if moving to Austin
Compensation Details
$150,000 - $220,000 plus equity