Ebots
We’re hiring a Robotics Engineer (Vision-Language-Action) to help build and deploy robot foundation models that translate visual and language input into robust, real-world robotic behaviors.
In this role, you’ll fine-tune and evaluate multimodal transformer/VLA models on simulated and physical robots, design data collection and sim-to-real pipelines, implement training and inference systems, and integrate models into ROS or production robot stacks.
Strong candidates will have demonstrated experience applying VLM/VLA models on real hardware, a background in imitation learning or reinforcement learning for control, and solid software engineering skills for model serving and on-device optimization.
Key Responsibilities
Fine-tune, deploy, and evaluate VLM/VLA models on real robots and in high-fidelity simulation.
Design and implement data collection, training, and sim-to-real pipelines for robot learning.
Integrate learned policies into ROS-based or production robot software stacks.
Develop and maintain robotic arm control software, including motion planning, control algorithms, and user interfaces.
Implement and optimize algorithms for kinematics, dynamics, and sensor integration.
Conduct software testing, debugging, and troubleshooting to ensure reliability and performance.
Participate in field testing and validation of robotic systems in real-world environments.
Stay up to date with the latest advancements in VLA models, robotics software, and related technologies to drive innovation.
Collaborate with cross-functional teams to translate requirements into robust software solutions.
Qualifications
Master’s or Ph.D. in Computer Science, Robotics, Electrical Engineering, or a related field.
3+ years of robotics software development experience, with a focus on robotic arms.
Proficiency in C++, Python, and ROS, with familiarity in Git and Docker.
Experience with ROS-based integration workflows and the NVIDIA Isaac ecosystem (Sim, Lab, GR00T).
Familiarity with modern VLA methods, large-scale training infrastructure (e.g., Google Cloud, NVIDIA Brev), and synthetic data generation (e.g., GR00T-Dreams, EgoMimic).
Strong understanding of robotic arm kinematics, dynamics, control, and sensor integration.
Excellent problem-solving, attention to detail, and clear communication skills.
Ability to thrive in a fast-paced, dynamic environment and manage multiple projects.
Preferred Qualifications
Hands-on deployment and evaluation of VLA policies in closed-loop control setups.
Experience with ROS 2 and real-time systems.
Familiarity with agile development methodologies.
Relevant certifications in robotics or software engineering.
We are an equal opportunity employer and offer:
Competitive Salary & Equity
Comprehensive healthcare benefits for you and your family
Flexible PTO
The chance to work with robots!
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering and Information Technology
Industries: Automation Machinery Manufacturing