General Atomics
General Atomics Aeronautical Systems, Inc. (GA-ASI), an affiliate of General Atomics, is a world leader in proven, reliable remotely piloted aircraft and tactical reconnaissance radars, as well as advanced high-resolution surveillance systems.
We are looking for an experienced Machine Learning Systems Engineer to join our team in Poway, CA. This person will lead a team in system development and bring prior expertise in data and machine learning automation to help scale data-driven airborne sensing systems. The ideal candidate will have a broad base of engineering knowledge to support writing target detection algorithms for programs such as Air-to-Ground and Air-to-Air Radar, Broadband RF, Electronic Intelligence, Electronic Attack, and Anti-Submarine Warfare. Join our Perception & Sensor Fusion group to help build and refine the Dynamic Environment Model (DEM) that powers multi-sensor understanding and autonomous decision-making. You will develop ML models that extract features, infer dynamics, support multi-sensor association, and enhance our probabilistic world model. Your work bridges raw sensing and high-confidence fused tracks, advancing perception, prediction, and environment representation for real-world autonomous systems.
DUTIES AND RESPONSIBILITIES
With limited direction, this position exercises considerable latitude in determining technical objectives for the review, research, design, development, and/or solution of advanced technical engineering problems.
Modeling & Algorithm Development
Develop and train ML models for:
Multi-sensor perception (radar / EO-IR / ESM inputs)
Detection, segmentation, and spatiotemporal occupancy inference
Feature extraction and learned embeddings to support tracking + association
Uncertainty-aware prediction and motion estimation
Explore and integrate modern world-modeling techniques:
Learned occupancy networks / BEV encoders
Neural fields, flow-based motion models, volumetric prediction
ML-aided fusion (e.g., learned association probabilities)
Support tracking engineers by improving sensor feature fusion, classification confidence, and track priors
Infrastructure & Deployment
Build robust training + replay datasets from real sensor missions
Implement evaluation frameworks for multi-modal fusion performance
Deploy trained models into real-time C++/GPU perception pipelines (with support from systems team)
Validate models in simulation and flight test environments
Cross-functional Collaboration
Work closely with:
Perception Systems engineers (DEM & fusion infrastructure)
Tracking & state estimation engineers
Autonomy team using DEM outputs
Sensor teams to understand real signal behavior, noise, and uncertainty
Contribute to our sim-to-real transfer strategy and rapid iteration loop
We recognize and appreciate the value and contributions of individuals with diverse backgrounds and experiences and welcome all qualified individuals to apply.
Job Qualifications
Typically requires a bachelor's degree, master's degree, or PhD in computer science, engineering, mathematics, or a related technical discipline from an accredited institution and progressive machine learning experience as follows: fourteen or more years of experience with a bachelor's degree, twelve or more years of experience with a master's degree, or nine or more years with a PhD. May substitute equivalent machine learning experience in lieu of education.
3+ years building ML systems for perception or spatiotemporal data
Proficiency in Python + PyTorch or TensorFlow
Experience in one or more of the following:
3D deep learning, BEV networks, voxel/occupancy models
Multi-sensor perception (camera, radar, IR; RF preferred but not required)
Sequence / motion models (RNN/Transformers/flow fields)
Uncertainty modeling & calibration
Experience building datasets + training pipelines
Preferred
Familiarity with:
Tracking algorithms (JPDA, IMM-EKF/UKF, PHD/RFS)
Radar signal characteristics or EO/IR processing
CUDA / Triton / TensorRT (bonus, not required)
Spatiotemporal world models (NeRF, occupancy grids, neural mapping)
Experience in autonomous systems, robotics, or defense perception domains
Ability to obtain and maintain a DoD security clearance is required.
Salary:
$140,940 - $252,293