Sangha Partners

Machine Learning Engineer – Robotics Perception & Control

Sangha Partners, Houston, Texas, United States, 77246


Our client is building advanced robotic systems that operate in real industrial environments, where reliability, perception accuracy, and precise control matter every second. As a Machine Learning Engineer focused on robotics perception and manipulation, you’ll develop the ML‑driven vision and control systems that enable robots to understand their surroundings, identify weld seams or work surfaces, and execute complex tasks with precision.

This is not a generic ML role. This is not NLP or LLM work. This is real robotics—object detection, depth, segmentation, and closed‑loop control running on physical hardware in production environments. If you’ve trained and deployed perception models onto robots, we want to talk to you.

Base pay range: $150,000–$200,000/yr

What You’ll Do

Build real‑time perception pipelines for object detection, segmentation, depth estimation, and geometric understanding in challenging industrial settings

Develop manipulation and control policies that integrate perception signals for trajectory generation, tool control, and servoing

Own end‑to‑end ML workflows, from data collection to model training, optimization, deployment, validation, and continuous improvement

Integrate ML models with physical robot hardware (robot arms, sensors, depth cameras, calibration pipelines)

Collaborate closely with robotics, controls, and field engineering teams to achieve scalable, stable performance in the real world

Contribute to on‑site testing, iteration, and validation in customer environments

What You Bring

3–7+ years of experience building ML systems for robotics, autonomous systems, or real‑world computer vision

Hands‑on experience with object detection, segmentation, depth estimation, or tracking (e.g., YOLO, Detectron2, segmentation networks, 3D CV)

Experience deploying ML/CV models on real robots or hardware, not just simulation

Strong coding skills in Python and C++ for real‑time robotics applications

Experience working with RGB‑D, stereo, or structured‑light cameras, including calibration and debugging

Experience optimizing models for real‑time inference

Strongly Preferred

Experience with robot arms (ABB, Fanuc, UR, KUKA, Yaskawa, etc.)

Familiarity with ROS/ROS2, hardware bring‑up, and sensor integration

Background in industrial robotics, automation, or manufacturing environments

Experience with SLAM, 3D reconstruction, point clouds, or visual servoing

Experience deploying systems in harsh, cluttered, or dynamic environments

Why This Role Is Unique

Your models are deployed on real robots, not in notebooks

Rapid iteration → what you build ships to production quickly

Huge ownership over perception, control, and deployment pipelines

Work directly with field teams to solve real industrial challenges

Opportunity to shape the core ML stack of a frontier robotics platform

Seniority level: Mid‑Senior level

Employment type: Full‑time

Industries: Robotics Engineering and Robot Manufacturing
