Arctic Research and Development
Arctic Research and Development builds high-latitude technology to unlock the last great frontier on Earth. An expanse 1.5 times the size of the US, the Arctic is the triple point of national security, climatic shifts, and economic opportunity. We cut through this complexity while preserving the integrity of the planet’s most fragile wilderness.
Built by pioneers with 30+ years of polar experience, ARD knows that the High North is a brutally harsh environment. We have the frostbite to prove it. Working at the fringes of technological capability, with an eye toward autonomy in a place utterly inhospitable for humans, this is a company that takes big bets on hard problems.
If you want to build and deploy ground-up technology with real-world impact, at an inflection point in history that won’t reappear, we want you at ARD.
The role
We're looking for a Perception Engineer to build the eyes for autonomous systems operating in the world's most sensor-hostile environment. You'll develop computer vision and sensor fusion systems that cut through Arctic whiteout, polar night darkness, and ice fog, where traditional perception stacks fail. This isn't lab work: you'll deploy ML models on edge hardware that operates at -40°C, where every watt counts and failure means mission abort.
Core responsibilities
Design and deploy perception pipelines for Arctic autonomous platforms (surface, subsurface, aerial)
Fuse multi-modal sensor data (vision, LiDAR, radar, IMU, GNSS) for situational awareness and autonomy
Build ML models for ice classification, terrain mapping, and obstacle detection in low-visibility conditions
Train, evaluate, and harden ML models against snow cover, ice glare, low sun angles, and polar night conditions
Create synthetic training data and simulation environments for Arctic-specific scenarios
Work closely with mechanical, electrical, RF, and autonomy teams to co-design sensor placement, calibration, and integration
Support field deployments in the Arctic and iterate rapidly based on real-world data
What we value
5+ years of industry experience in robotics, autonomy, or real-world ML perception systems
Strong foundation in computer vision, SLAM, and multi-sensor fusion
Proficiency in Python and C++ with production‑grade engineering practices
Experience with perception frameworks (ROS / ROS2, OpenCV, PyTorch, TensorFlow, etc.)
Comfort working with real sensors, calibration workflows, and noisy data
A track record of shipping perception systems that work outside controlled environments
Willingness to operate in safe but extreme cold environments during R&D field testing
What we offer
Competitive compensation
Real field exposure – travel to Arctic sites and outposts when needed
Mission‑driven culture – focus on impact, not hours logged