Gecko Robotics is hiring an

Autonomy Computer Vision Engineer

Pittsburgh, United States

Who We Are

The mission of Gecko Robotics is to protect today’s critical infrastructure, and give form to tomorrow’s: refineries, power plants, heavy manufacturing facilities, vessels, water storage, and many more. We accomplish this through our robotics platforms in tandem with our enterprise software solutions, creating a virtuous cycle of data acquisition, processing, analysis, and decision-making. Our robots operate in some of the most dangerous industrial environments, collecting data of unprecedented value and magnifying the contributions of human experts, while keeping them out of harm’s way. Gecko is uniquely poised to maximize both the production and the useful life of the assets we rely on to meet the world's energy needs.


Role at a Glance 

Gecko’s operations and services are expanding into new industries and locations. Each quarter, we enter an industry that requires us to scan assets of greater topological complexity and scale, so the need to automate parts of data collection keeps growing. The autonomy team aims to improve our operational process by enhancing the robot's position estimation with respect to the asset and by automating the robot's motion.

As part of the autonomy team, you will contribute to solving the localization problem. The localization stack consists of multiple sensor subsystems, including IMUs, encoders, cameras, and range-based sensors. These sensors feed into our sensor fusion and graph-based localization algorithms.
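To illustrate the kind of sensor fusion described above (this is a minimal, self-contained sketch for illustration only, not Gecko's actual stack): a one-dimensional Kalman filter that fuses noisy encoder odometry (predict step) with an absolute position reading from a range-based sensor (update step). All names and noise values here are hypothetical.

```python
def kf_predict(x, p, u, q):
    """Propagate state x by encoder increment u; q is process noise variance."""
    return x + u, p + q

def kf_update(x, p, z, r):
    """Correct with range-sensor reading z; r is measurement noise variance."""
    k = p / (p + r)                       # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# Robot starts at position 0 with high uncertainty and moves ~1.0 per step.
x, p = 0.0, 1.0
for step in range(5):
    x, p = kf_predict(x, p, u=1.0, q=0.05)   # encoder odometry
    z = (step + 1) * 1.0                     # range-sensor reading (ideal here)
    x, p = kf_update(x, p, z, r=0.2)         # fuse the measurement

print(x, p)  # uncertainty p shrinks as measurements are fused
```

In the real stack, the same predict/update structure generalizes to full 6-DOF poses, and graph-based formulations (SLAM) replace the recursive filter with batch optimization over a pose graph.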

What you will do

As an experienced member of the autonomy team, you will be responsible for developing the perception and machine learning pipeline, graph-based localization techniques, and associated sensor subsystems. 

  • Research and develop the vision/perception pipeline for camera and range-based sensors.
  • Research and develop graph-based optimization techniques for localization (SLAM).
  • Maintain the deployment pipeline for the vision repositories, emphasizing CI/CD practices.
  • Deploy and demonstrate your system in the field.

Technologies We Use

The localization stack runs primarily on Linux, with some sensors requiring other extended platforms. We currently use ROS as our interprocess communication layer. Exposure to the following technologies and skills is preferred:

  • Experience with Linux and command-line tools.
  • Familiarity with DevOps or CI/CD – GitHub automation and Docker or Jenkins.
  • 4+ years of production-focused C++ software development experience. Strong background in modern C++ (C++11 and beyond).
  • Experience with ROS/2 and perception libraries, including but not limited to OpenCV, PCL, and Eigen.
  • Strong background in at least two of the following:
    • 3D and projective geometry.
    • Deep learning/machine learning for computer vision.
    • Practical applications of classical computer vision and machine learning techniques.

About You

We would like to hear from you if you have experience building software architectures comprising multiple subsystems designed for iterative development, and if you enjoy characterizing the performance of your work and the impact it creates on deployment.

  • Understand the theory, live the practice: how it should work and how to get it to work.
  • Software engineer by trade, a problem solver at the core. 
  • Real-world experience in building and maintaining software production environments.
  • Desire to have a high impact at a fast-moving startup as a key contributor on a new project and fast-growing team.
  • Strong passion for learning and growth. Open to new ideas and technologies, for instance DevOps practices and continuous feedback.
  • Master’s degree in Computer Science, Robotics, or closely related field (or equivalent experience).
  • High self-motivation and love of self-directed learning.

Benefits

  • Open/unlimited PTO
  • Stock options/equity
  • 401k with company match
  • Medical, dental, and vision coverage
  • Parental Leave
