Forward Deployed Engineer

TLDR

As a Forward Deployed Engineer, you'll manage client implementations end-to-end, building scalable data connectors and ensuring robust integration between diverse data sources.

About Arkham:

Arkham is a Data & AI platform that helps large enterprises:

  • Unify fragmented systems and data
  • Build a single source of trusted operational metrics
  • Solve complex challenges with AI tailored to their operations

Teams at Circle K and Kimberly-Clark partner with us to deploy AI-powered solutions for sell-out forecasting, pricing and promo analysis, and automated order assignment. With Arkham, they achieve high-impact results fast, creating a strong foundation for long-term AI transformation.

About the Role

Our implementation model is built around two core roles:

  • Forward Deployed Engineer
  • Forward Deployed Data Scientist

As a Forward Deployed Engineer, you'll be the technical bridge between Arkham's Data Sync platform and our clients. Your primary focus is building the connectors that make data flow — REST APIs and JDBC/ODBC — using frameworks like dlt, Airbyte, and Meltano (desired but not mandatory). But unlike a purely internal engineering role, you'll own the full implementation cycle: working directly with client teams to understand their data landscape, then building and deploying the connectors that unlock it.

You'll manage 3–4 client implementations at a time, moving fast from discovery to production while keeping your connector work reusable and scalable across engagements.

What You’ll Work On

Build and Own Data Connectors

  • Design and implement connectors for diverse data sources, including:
    • REST APIs
    • JDBC / ODBC databases
    • Flat files and object storage
    • Custom ingestion scripts

Every connector should prioritize reusability, configurability, and scalability.
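To give a flavor of what "reusable and configurable" means here, the sketch below separates generic pagination logic from the source-specific fetch step. It is purely illustrative — the function names and page-response shape are assumptions, not Arkham's actual framework:

```python
from typing import Callable, Iterator


def paginate(fetch_page: Callable[[int], dict], page_size: int = 100) -> Iterator[dict]:
    """Yield records from a paginated REST-style source until a short page
    signals the end. `fetch_page` is an injected, source-specific callable
    (hypothetical) returning {"records": [...]} for a given page number,
    so the pagination logic stays reusable across concrete APIs."""
    page = 0
    while True:
        body = fetch_page(page)
        records = body.get("records", [])
        yield from records
        if len(records) < page_size:  # short page => no more data
            return
        page += 1


# Usage with a fake in-memory source standing in for a real API:
def fake_page(page: int) -> dict:
    data = [{"id": i} for i in range(250)]
    return {"records": data[page * 100:(page + 1) * 100]}


rows = list(paginate(fake_page))  # 250 records across 3 pages
```

Because the source-specific call is injected, the same pagination loop can back a REST connector, a flat-file reader, or a test double without modification.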

Lead Client Implementations End-to-End

Own the data integration process from initial discovery to production deployment.

This includes:

  • Understanding the client’s data architecture
  • Defining integration requirements
  • Building connectors
  • Deploying them into Arkham’s platform

Most integrations are expected to reach production within 2–4 weeks.

Design Robust Integration Architecture

Create adapter and abstraction layers that standardize how connectors handle:

  • Authentication mechanisms
  • Pagination and incremental syncs
  • Rate limits
  • Error handling and retries
  • Schema normalization

Your goal is to ensure connectors behave consistently across systems.
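One common way to achieve that consistency is a base-class adapter in which retry and error-handling behavior lives in one place and each concrete connector implements only the source-specific call. The class and method names below are hypothetical, shown only to illustrate the pattern:

```python
import time
from abc import ABC, abstractmethod


class TransientError(Exception):
    """Recoverable source failure (e.g. rate limit or timeout) -- illustrative."""


class BaseConnector(ABC):
    """Sketch of an abstraction layer: concrete connectors implement only the
    source-specific `fetch_once`; retries with exponential backoff are
    standardized here so every connector behaves consistently."""

    max_retries = 3
    backoff_seconds = 0.05

    @abstractmethod
    def fetch_once(self) -> list:
        """One attempt against the underlying source."""

    def fetch(self) -> list:
        for attempt in range(self.max_retries):
            try:
                return self.fetch_once()
            except TransientError:
                if attempt == self.max_retries - 1:
                    raise  # out of retries: surface the error to the caller
                time.sleep(self.backoff_seconds * 2 ** attempt)  # back off, retry
```

Authentication, pagination, and schema normalization can be layered in the same way, keeping per-source code thin.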

Work Directly With Client Technical Teams

Collaborate with engineering and data teams to:

  • Understand source systems
  • Map and validate data flows
  • Ensure outputs align with business requirements

You’ll translate real-world operational systems into clean, reliable data pipelines.

Build Analytics-Ready Data Pipelines

Write clean SQL and design transformations that produce high-quality datasets ready for analytics and AI workflows.

Your pipelines should be:

  • Correct
  • Observable
  • Maintainable
  • Scalable
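As a toy example of the shape such a transformation might take — raw event rows rolled up into an analytics-ready daily table; the table and column names are hypothetical, not a real client schema:

```python
import sqlite3

# Illustrative transformation: raw sell-out events -> daily totals per store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_sales (store_id TEXT, sold_at TEXT, amount REAL);
    INSERT INTO raw_sales VALUES
        ('s1', '2024-01-01', 10.0),
        ('s1', '2024-01-01', 5.0),
        ('s2', '2024-01-02', 7.5);
""")

# Clean, deterministic aggregation: one row per (store, day).
rows = conn.execute("""
    SELECT store_id, sold_at AS day, ROUND(SUM(amount), 2) AS total
    FROM raw_sales
    GROUP BY store_id, day
    ORDER BY store_id, day
""").fetchall()
```

Correctness here is testable (totals are deterministic), and the query's explicit grouping and ordering keep the output stable as data volume grows.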

Improve the Connector Library

Each integration should strengthen Arkham’s platform.

You’ll contribute reusable patterns and improvements back to the shared connector framework so that every engagement accelerates the next one.

What We Require

Experience

  • 2+ years of backend or data engineering experience
  • Strong Python programming skills
  • Hands-on experience building REST API integrations
  • Solid SQL and relational database fundamentals

Communication & Ownership

You should be comfortable:

  • Explaining technical work to technical and non-technical stakeholders
  • Working in client-facing environments
  • Owning integrations from problem definition to production

This role requires engineers who take responsibility for outcomes, not just code.

Technical Environment

Experience with cloud environments is expected.

Preferred:

  • AWS

Bonus Skills

Nice-to-have experience includes:

  • CI/CD applied to data workflows
  • Data observability and testing frameworks
  • Familiarity with AI-driven analytics and Generative AI use cases
  • Experience with data integration frameworks such as:
    • dlt
    • Airbyte
    • Meltano
