Data Engineer
TLDR
Build and operate the critical data pipelines and analytics infrastructure that power real-time space domain awareness and customer insights.
Why LeoLabs?
At LeoLabs, we’re building the living map of activity in space. Through our proprietary global radar network and AI-enabled analytics platform, we collect millions of measurements daily on more than 25,000 objects in low Earth orbit (LEO). Our radar-powered intelligence protects billions in assets, monitors adversarial behavior, and ensures safe operations for commercial and government missions.
We’re not just building technology; we’re redefining global security, safety, and transparency in space. As orbital activity accelerates and threats grow more complex, LeoLabs is a trusted partner for Space Domain Awareness, Space Traffic Management, and Satellite Operations for top-tier space operators and allied defense organizations.
If you're looking to work on mission-critical challenges at the forefront of aerospace, national security, and AI, your impact starts here.
The Opportunity
We are seeking a motivated and hands-on Data Engineer to join LeoLabs’ growing Insights team. You will play a key role in building and operating the data pipelines and analytics infrastructure that power customer insights, internal decision-making, and real-time space domain awareness capabilities.
You will work closely with software engineers, radar and catalog teams, and data scientists to ensure reliable extraction, transformation, and loading (ETL) of mission-critical datasets. This includes developing scalable batch and streaming data workflows, enabling advanced analytics, and supporting machine learning initiatives.
Your contributions will help transform large volumes of sensor and orbital data into actionable intelligence that enables users to safely operate and manage assets in low Earth orbit. This role is primarily hands-on development, with opportunities to grow into increased ownership of data platform design and optimization.
Qualifications
- B.S. or M.S. in Computer Science, Data Science, Engineering, Mathematics, Physics, or equivalent experience
- 0-2 years of experience in data engineering, software engineering, analytics engineering, or related technical roles
- Experience designing and building data pipelines or ETL/ELT workflows
- Hands-on experience with Databricks, Apache Spark, or distributed data processing frameworks
- Proficiency in Python and SQL for data transformation and analysis
- Familiarity with data modeling concepts and modern data lake or warehouse architectures
- Experience working in cloud-native environments (AWS preferred)
- Understanding of software development best practices including version control, testing, and CI/CD
- Strong analytical mindset and ability to troubleshoot complex data issues
- Effective communication skills and ability to collaborate across distributed engineering teams
- Ability to participate in operational support rotations during critical incidents
Preferred Qualifications
- Experience supporting data science or machine learning workflows, including feature engineering pipelines
- Familiarity with Delta Lake, Lakehouse architectures, or large-scale telemetry data processing
- Exposure to streaming data systems such as Kafka or Spark Structured Streaming
- Experience with workflow orchestration tools such as Airflow or Databricks Workflows
- Background in orbital mechanics, aerospace, physics, or applied mathematics
- Experience building analytics datasets or semantic models for BI tools
- Active U.S. security clearance or ability to obtain one
Within 1 Month, You’ll
- Complete onboarding to gain familiarity with LeoLabs’ mission, products, and data platform
- Successfully configure development environments, tooling, and data access
- Participate in team ceremonies and begin ramping up on existing data pipelines and platform services
- Deliver initial low-risk improvements such as pipeline enhancements, monitoring updates, or documentation contributions
Within 3 Months, You’ll
- Develop working knowledge of LeoLabs’ data architecture, ingestion systems, and Databricks workflows
- Contribute to building and maintaining small-to-medium data pipeline features
- Collaborate with data scientists and product stakeholders to support analytics and modeling initiatives
- Improve pipeline reliability, performance, and observability
Within 6 Months, You’ll
- Demonstrate solid understanding of platform architecture and operational practices
- Independently deliver well-tested and scalable data workflows
- Take ownership of defined datasets or pipeline components
Within 12 Months, You’ll
- Demonstrate proficiency across LeoLabs’ data engineering stack and cloud data platform
- Deliver pipeline enhancements end-to-end with minimal oversight
- Contribute to architecture discussions and platform roadmap planning
- Identify opportunities to improve data quality, scalability, and mission impact
Perks and Benefits
- Global workforce: flexible remote/hybrid opportunities
- Work on complex, meaningful missions with real-world impact
- Unlimited paid time off for most roles
- Competitive salary and equity packages
- Comprehensive health, dental, and vision coverage
- Access to the forefront of commercial space operations and defense innovation
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.