Shield AI

Senior Manager, Software - Perception (R3770)

Shield AI, Washington, District Of Columbia, United States, 20599

This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments. A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in both simulation and real-world conditions. We are seeking a skilled and motivated manager to lead technical teams and direct projects that integrate perception solutions into defense platforms. Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.

What You'll Do:

Multidisciplinary Team Leadership

Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution. Balance hands-on technical oversight with performance optimization, innovation, and clear stakeholder communication.

Develop Advanced Perception Algorithms

Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.
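
One canonical building block behind multi-target tracking is associating each new frame's detections with existing tracks. As a rough illustration of the kind of algorithmic work involved (a sketch, not Shield AI's actual code), here is a minimal greedy IoU-based association step; representing each track by its last box and gating matches at IoU 0.3 are simplifying assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def associate(tracks, detections, min_iou=0.3):
    """Greedy best-first matching of detections to tracks, gated by IoU.

    Tracks are represented by their most recent box, a simplification;
    a production tracker would match against a motion-predicted box.
    """
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True,
    )
    used_t, used_d, matches = set(), set(), []
    for score, ti, di in pairs:
        if score < min_iou:
            break  # remaining pairs overlap even less
        if ti not in used_t and di not in used_d:
            matches.append((ti, di))
            used_t.add(ti)
            used_d.add(di)
    unmatched = [di for di in range(len(detections)) if di not in used_d]
    return matches, unmatched  # unmatched detections typically seed new tracks
```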

Implement Sensor Fusion Frameworks

Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.
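
As a concrete illustration of probabilistic fusion (a sketch under assumed numbers, not a description of Shield AI's pipeline), the standard Kalman measurement update can combine position fixes from two sensors of different quality, weighting each by its noise covariance:

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman filter measurement update."""
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Fuse two 2D position fixes sequentially; all numbers below are made up
# for illustration (a coarse radar fix, then a finer camera fix).
x, P = np.zeros(2), np.eye(2) * 10.0
H = np.eye(2)
x, P = kf_update(x, P, np.array([4.8, 2.1]), H, np.diag([2.0, 2.0]))  # "radar"
x, P = kf_update(x, P, np.array([5.1, 2.0]), H, np.diag([0.5, 0.5]))  # "camera"
print(x, np.diag(P))  # estimate pulled toward the lower-noise sensor
```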

Develop State Estimation Capabilities

Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.
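
The prediction half of the same estimation loop propagates the state with inertial measurements between GPS or vision fixes. A minimal sketch, assuming a 1D position/velocity state and a made-up accelerometer noise value (a real system would estimate full 6-DoF pose with sensor biases):

```python
import numpy as np

def imu_predict(x, P, a_meas, dt, sigma_a=0.5):
    """Propagate a [position, velocity] state with one accelerometer sample.

    sigma_a is an illustrative accelerometer noise level, not a real spec.
    """
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])          # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])     # how acceleration enters the state
    x = F @ x + B * a_meas
    G = B.reshape(-1, 1)
    P = F @ P @ F.T + (sigma_a ** 2) * (G @ G.T)  # uncertainty grows between fixes
    return x, P

# Dead-reckon at 100 Hz; pair with a measurement update (as in the fusion
# sketch above) whenever a GPS or vision fix arrives.
x, P = np.zeros(2), np.eye(2) * 0.1
for a in [0.2] * 100:  # fabricated accelerometer samples
    x, P = imu_predict(x, P, a, dt=0.01)
```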

Analyze and Utilize Sensor ICDs

Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.
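
To make the ICD work concrete: unpacking a binary sensor message against a hypothetical field layout. The format below (little-endian uint32 timestamp in microseconds, three float32 body rates, uint16 status word) is invented for illustration; a real ICD specifies every field, unit, byte order, and framing rule.

```python
import struct

# Hypothetical ICD layout: <I3fH = little-endian uint32, 3x float32, uint16.
IMU_MSG = struct.Struct("<I3fH")  # 18 bytes

def parse_imu(payload: bytes) -> dict:
    """Decode one message; raises struct.error if the payload is too short."""
    t_us, gx, gy, gz, status = IMU_MSG.unpack(payload[:IMU_MSG.size])
    return {"t_us": t_us, "gyro_rad_s": (gx, gy, gz), "status": status}
```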

Optimize Perception Performance

Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.
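
One simple way to ground "real-time efficiency" in numbers (a sketch; the 33 ms budget corresponds to a 30 Hz frame rate and is an illustrative assumption, not a program requirement): measure per-frame latency and check the tail, since the worst frames are what break real-time behavior.

```python
import time
import statistics

def profile_pipeline(pipeline, frames, budget_ms=33.0):
    """Time a perception callable per frame and compare p99 latency to a budget."""
    latencies_ms = []
    for frame in frames:
        t0 = time.perf_counter()
        pipeline(frame)
        latencies_ms.append((time.perf_counter() - t0) * 1e3)
    p99 = statistics.quantiles(latencies_ms, n=100)[98]  # 99th percentile
    print(f"mean {statistics.mean(latencies_ms):.2f} ms, p99 {p99:.2f} ms, "
          f"budget {'met' if p99 <= budget_ms else 'MISSED'}")
```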

Support Autonomy Integration

Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.

Validate in Simulated and Operational Settings

Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.
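
As an illustration of scoring against synthetic ground truth (a sketch; the 0.5 IoU threshold is a common but here arbitrary choice): count a detection as a true positive when it sufficiently overlaps an unmatched ground-truth box, then report precision and recall.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def precision_recall(gt_boxes, det_boxes, min_iou=0.5):
    """Greedy one-to-one matching of detections to ground truth by IoU."""
    matched, tp = set(), 0
    for d in det_boxes:
        best = max(
            (gi for gi in range(len(gt_boxes)) if gi not in matched),
            key=lambda gi: iou(gt_boxes[gi], d),
            default=None,
        )
        if best is not None and iou(gt_boxes[best], d) >= min_iou:
            matched.add(best)
            tp += 1
    precision = tp / len(det_boxes) if det_boxes else 0.0
    recall = tp / len(gt_boxes) if gt_boxes else 0.0
    return precision, recall
```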

Collaborate with Hardware and Sensor Teams

Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.

Drive Innovation in Airborne Sensing

Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.

Travel Requirement

Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).

Required Qualifications:

- BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, or a similar degree, or equivalent practical experience
- Typically requires a minimum of 10 years of related experience with a Bachelor's degree; 9 years with a Master's degree; 7 years with a PhD; or equivalent work experience
- 7+ years of experience on unmanned systems programs in the DoD or in applied R&D
- 2+ years of people leadership experience
- Background implementing algorithms such as Kalman filters, multi-target tracking, or deep learning-based detection models
- Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches
- Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications
- Ability to interpret and work with interface control documents (ICDs) and hardware integration specifications
- Proficiency with version control, debugging, and test-driven development in cross-functional teams
- Ability to obtain a SECRET clearance

Preferred Qualifications:

- Hands-on integration or algorithm development experience with airborne sensing systems
- Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks (a minimal sketch follows this list)
- Experience deploying perception software on SWaP-constrained platforms
- Familiarity with validating perception systems during flight test events or in operational environments
- Understanding of sensing challenges in denied or degraded conditions
- Exposure to perception applications across air, maritime, and ground platforms
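
For the vision-detection item above, a minimal inference sketch with torchvision's pretrained Faster R-CNN. The weights argument shown works on recent torchvision releases (older ones use pretrained=True instead), and the 0.5 score threshold is an illustrative choice:

```python
import torch
import torchvision

# Pretrained COCO detector; weights="DEFAULT" requires torchvision >= 0.13.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)  # stand-in for a real EO frame scaled to [0, 1]
with torch.no_grad():
    out = model([image])[0]      # dict with 'boxes', 'labels', 'scores'

keep = out["scores"] > 0.5       # illustrative confidence threshold
print(out["boxes"][keep], out["labels"][keep])
```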