Blue Signal
Autonomous Systems Perception Engineer
Blue Signal, Vancouver, Washington, United States, 98662
Overview
Autonomous Systems Perception Engineer
Location: Vancouver, WA (On-site or Hybrid)

A pioneering robotics and autonomous systems innovator is seeking a skilled Autonomous Systems Perception Engineer to push the boundaries of real-time environmental awareness for intelligent machines. This is a chance to influence how robots perceive and navigate the world, contributing to meaningful advancements in autonomy and AI integration. The role offers the flexibility of a hybrid schedule and hands-on collaboration with some of the most advanced tools in robotics. The successful candidate will be instrumental in integrating advanced sensor inputs, enabling systems to perform safely and intelligently in complex, real-world environments.

Responsibilities
- Architect and implement multi-sensor data fusion pipelines using LiDAR, cameras, and IMUs for robust environmental perception.
- Develop and optimize algorithms for real-time obstacle detection, mapping, and scene understanding.
- Integrate perception capabilities with navigation and planning stacks for seamless robotic decision-making.
- Run simulations and conduct field testing to validate system accuracy and performance under varied operational conditions.
- Collaborate cross-functionally with hardware, controls, and software teams to refine sensing systems.
- Continuously improve sensor calibration, synchronization, and data integrity processes.

Qualifications
- Bachelor’s or Master’s degree in Robotics, Computer Science, Electrical Engineering, or a closely related field.
- Minimum of 2 years of experience in robotic perception, sensor fusion, or related real-time embedded applications.
- Proficiency in C++ and Python, with experience in ROS or ROS 2 environments.
- Familiarity with 3D perception tools such as OpenCV, PCL, and RViz.
- Hands-on experience with sensors such as LiDAR, stereo/depth cameras, and IMUs.
- Working knowledge of SLAM techniques, point cloud processing, and visual-inertial odometry.

Preferred Experience
- Background in multi-modal sensor fusion methods (EKF, UKF, or graph-based systems).
- Exposure to embedded deployment platforms (Jetson, Xavier, or similar).
- Previous work on autonomous vehicle, UAV, or advanced robotic platforms.
- Understanding of probabilistic robotics and large-scale environment modeling.

What’s in It for You
- Highly competitive compensation and benefits.
- Hybrid flexibility and an onsite lab for advanced testing and experimentation.
- A mission-driven organization committed to reshaping how intelligent machines perceive their surroundings.
- An innovation-rich culture where your technical insights directly influence autonomous functionality.

Take the next step in your robotics career and apply today to join a team building the next generation of perception systems.