BotCrew
Senior Software Engineer, Perception
Who We Are
Founded in 2022, BotCrew has emerged as a leader in the solar robotics space, solving real-world problems that deliver value to our end customers. Our robotics platform, Gravion, is trusted by 80% of the top Engineering, Procurement, and Construction companies in North America, and we aim to expand worldwide in the near future. For additional information about BotCrew and Gravion, please visit our website: https://robots.botcrew.com.
About The Role
We are seeking a skilled perception engineer to lead the design, implementation, and deployment of the perception stack powering our autonomous robotic systems. In this role, you will own the end-to-end perception pipeline, from sensors and calibration through real-time inference and tracking, delivering reliable scene understanding that enables safe, robust robot behavior in unstructured environments. You will work closely with the autonomy, robotics software, and hardware teams to integrate and optimize computer vision and sensor fusion capabilities that run on embedded compute at the edge.
Responsibilities
Architect, implement, and maintain BotCrew’s on‑robot perception stack, including detection, segmentation, depth/3D understanding, tracking, and state estimation inputs needed by autonomy.
Develop and deploy computer vision and machine learning models for real‑time operation on embedded or edge compute (e.g., NVIDIA Jetson/Orin‑class platforms), including optimization and profiling.
Build robust sensor pipelines (e.g., RGB cameras, stereo/depth, LiDAR, IMU), including time synchronization, calibration, and data validation.
Implement and productionize sensor fusion approaches (e.g., camera + depth/LiDAR + IMU) to improve reliability across lighting, weather, motion, and environmental variability.
Design evaluation methodologies and metrics; create tooling for offline analysis, dataset curation, model regression testing, and performance monitoring.
Partner with hardware and systems engineering to select sensors, define compute requirements, and ensure thermal/power/performance constraints are met.
Improve runtime resilience: fault detection, graceful degradation, and recovery behaviors when sensors or models underperform.
Lead technical decision‑making across perception; contribute to roadmap planning, technical reviews, and mentoring other engineers.
Document system architecture, interfaces, and operational playbooks to support testing, deployment, and field operations.
Qualifications
3+ years of professional software engineering experience, with significant ownership of production systems.
Strong proficiency in modern C++, including performance‑aware design for real‑time systems.
Demonstrated experience shipping perception or robotics capabilities to production (on‑robot, on‑vehicle, or edge deployment).
Solid understanding of computer vision fundamentals (multi‑view geometry, tracking, camera models) and practical ML deployment.
Experience with common perception tooling and frameworks (e.g., OpenCV, PyTorch/TensorFlow, ROS/ROS2 or equivalent middleware).
Experience integrating and validating sensors, including calibration, synchronization, and handling noisy/partial data.
Ability to debug complex systems using logs, traces, profiling tools, and structured experimentation.
Strong communication skills and ability to collaborate across autonomy, hardware, and operations teams.
Proven ability to leverage AI‑assisted tools (for coding, debugging, and technical research) as part of the development workflow.
Preferred/Bonus Qualifications
Experience deploying optimized inference (TensorRT, ONNX Runtime, CUDA) and accelerating models on NVIDIA GPUs/edge platforms.
Prior work with 3D perception: point clouds, voxel/BEV representations, LiDAR‑camera fusion, SLAM inputs, or depth estimation.
Experience with dataset and training pipelines: labeling strategies, active learning, data versioning, and ML experimentation platforms.
Familiarity with real‑time constraints and systems engineering (latency budgets, throughput, determinism, resource scheduling).
Experience designing safety‑and‑reliability‑oriented systems: monitoring, redundancy, fallback modes, and field diagnostics.
Exposure to simulation and synthetic data generation workflows for robotics validation.
Leadership experience mentoring engineers and driving cross‑functional technical initiatives from concept through deployment.
Seniority Level: Mid-Senior level
Employment Type: Full-time
Job Function: Engineering and Information Technology
Industries: Robot Manufacturing