Embedded Firmware Engineer
Efference – Robust Robotic Perception

Location: North Beach, San Francisco, CA
Employment Type: Full Time
Location Type: On-site
Department: Engineering

Company Overview

Efference builds robust, high-performance robotic perception systems that make robots easier to develop and faster to deploy. Our technology integrates proprietary optical designs, advanced sensor fusion, depth estimation, image sensor processing, and tightly optimized hardware-software stacks to enable reliable perception across general and specialized robotic systems. We are currently deploying the first generation of our vision systems with state-of-the-art hardware and software while actively prototyping generations two and three of our core perception system. Efference was founded by Gianluca Bencomo, a PhD student at Princeton University whose research spans visual neuroscience, Bayesian filtering for fast and robust sensor fusion, neural network inductive biases, and robotic perception. The company is backed by Y Combinator and Anti Fund.

Position Overview

As an Embedded Firmware Engineer at Efference, you will lead the design, implementation, and integration of the low-level software that powers our next-generation robotic perception systems. You will work at the boundary between custom hardware and advanced perception software, enabling reliable, high-performance sensing across cameras, IMUs, and other robotic sensors. This role requires deep expertise in embedded Linux, real-time systems, and hardware bring-up, as well as the ability to collaborate closely with PCB designers, perception researchers, and full-stack engineers. Your work will directly influence the performance, reliability, and deployability of Efference's custom camera and sensing platforms across multiple product generations.

Key Responsibilities

Lead firmware architecture and development for ARM-based SoCs and MCUs
Read, modify, and maintain device trees and board support packages (BSPs)
Review and approve schematics and PCB layouts, with a focus on signal integrity, power distribution, and manufacturability
Develop and maintain embedded Linux systems using Buildroot and Yocto
Implement and optimize RTOS-based firmware for real-time sensing and control
Write and maintain kernel drivers and low-level device drivers for sensors and peripherals
Integrate and optimize MIPI-CSI camera pipelines, including basic camera bring-up and tuning
Implement and debug Wi-Fi and Bluetooth connectivity on embedded platforms
Work with a wide range of communication interfaces, including CAN, USB, RS-485, UART, I²C, SPI, GMSL, etc.
Integrate and calibrate sensors such as IMUs, cameras, LiDAR, and other robotic sensing modalities
Perform hands-on hardware debugging using oscilloscopes, logic analyzers, and other lab tools
Collaborate with perception, robotics, and full-stack teams to support teleoperation, data collection, and on-device inference
Contribute to product roadmaps, technical documentation, and platform evolution across multiple hardware generations
Qualifications

Bachelor's degree or higher in Electrical Engineering or a related field
5+ years of professional experience in embedded systems or firmware engineering
Strong proficiency in C/C++ for embedded and systems programming
Experience working with embedded Linux on ARM-based SoCs
Deep understanding of hardware–software co-design, including power, clock, and memory subsystems
Proven experience bringing up custom hardware from early prototypes through production
Familiarity with common MCUs and peripheral architectures
Strong debugging and root-cause analysis skills across hardware and software layers
Preferred Qualifications

Experience with Rust for embedded or systems programming
Background in robotics, perception, or sensor fusion
Familiarity with robotic control systems and real-time constraints
Personal interest or hands-on experience in robot learning or autonomous systems
Experience supporting high-throughput image or sensor data pipelines
Benefits

Competitive salary and meaningful equity in a seed-stage, venture-backed startup
Opportunity to work at the cutting edge of robotic perception and applied research
Close collaboration with world-class researchers and engineers
High-impact role with significant technical ownership and growth opportunities