Intuitive Machines
Perception Software Lead - Lunar Terrain Vehicle
Houston, Texas
About Intuitive Machines
Intuitive Machines is an innovative and cutting‑edge space company making cislunar space accessible to both public and private customers. Our mission is to further science, exploration, communications, and economic progress from the Earth to the Moon and beyond. With the first commercial lunar landing in history, multiple NASA lunar missions in development, and additional private missions on our manifest, we pride ourselves in supporting our customers and the nation in paving the way to return humans to the surface of the Moon.
NASA LTVS Award Contingency
Employment in this role is contingent upon NASA selecting Intuitive Machines as the winner of the LTVS program, scheduled for announcement later in 2025.
About The Role
Lead a small team in the development and certification of the LTV perception system, fusing data from LiDARs, cameras, and IMUs into a single coherent map of the rover’s surroundings.
Responsibilities
Proactively identify and document requirements for the vehicle perception system
Architect and decompose the software solution for vehicle perception
Coordinate the testing and certification of the system
Develop software in a high‑reliability environment, in compliance with NASA standards NPR 7150.2D and NASA-STD-8739.8B and with NASA CBCS requirements (e.g., SSP 50038)
Represent perception software at NASA safety panels
Supervise several internal employees and contractors
Qualifications
Bachelor's degree in computer science, computer engineering, or a related field
Expertise in machine vision, sensor fusion, and Simultaneous Localization and Mapping (SLAM), including LiDAR‑inertial and visual‑inertial odometry
Experience ingesting, processing, and fusing data from monocular and stereoscopic imaging systems, LiDARs, IMUs, Star Trackers, and GPS
Sensor fusion techniques, including classical Kalman filters, multiplicative Kalman filters, and pose graph optimization
Failure Detection, Isolation, and Response (FDIR) logic for the sensors and estimation techniques listed above
Experience in verification by simulation for complex perception systems
Development and maintenance of high‑quality design and testing documentation
Demonstrated relevant project experience (planetary rovers, autonomous vehicles, robotics, etc.)
Language / Tool Experience
C or C++ (required)
Python or other modeling/analysis languages (nice to have)
Familiarity with FPGA, GPU, and NPU hardware acceleration techniques, and the ability to coordinate with hardware acceleration engineers to implement and validate algorithms
Git, Jira, Jama
EEOC
Intuitive Machines is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.