Senior Data / Platform Engineer (Embedded - Data & Analytics Engineering)

Welcome to Decision Foundry - Data Analytics Division!

We are proud to introduce ourselves as a certified "Great Place to Work," where we prioritize creating an exceptional work environment. As a global company, we embrace a diverse culture, fostering inclusivity across all levels.

Originating from a well-established 19-year web analytics company, we remain dedicated to our employee-centric approach. By valuing our team members, we aim to enhance engagement and drive collective success.

We are passionate about harnessing the power of data analytics to transform decision-making processes. Our mission is to empower data-driven decisions that contribute to a better world. In our workplace, you will enjoy the freedom to experiment and explore innovative ideas, leading to outstanding client service and value creation.

We win as an organization through our core tenets. They include:

• One Team. One Theme.

• We sign it. We deliver it.

• Be Accountable and Expect Accountability.

• Raise Your Hand or Be Willing to Extend It.

About the Role

We’re looking for a Senior Data / Platform Engineer to embed directly into our Data & Analytics Engineering team and help accelerate delivery across a highly customized, API-driven data platform. This role is focused on augmenting and hardening the existing platform, building and expanding pipelines, and developing reusable infrastructure and library components to support scalable ingestion and transformation workflows.

This is a hands-on engineering role best suited for someone who thrives in software-engineering-style data work: building modular Python libraries, deploying pipeline infrastructure, and improving reliability, observability, and test coverage across a production data ecosystem.

Location: Remote – EST Hours
Type: Contract
Team: Data Platform / Analytics Engineering

Key Responsibilities

What You’ll Work On

You will integrate into our team to accelerate well-scoped execution work, including:

  • Data pipeline and ingestion expansion across multiple sources and delivery patterns
  • Platform hardening and refactoring initiatives to improve scalability and maintainability
  • Observability, testing, and reliability improvements across orchestration and batch workloads
  • Deployment and modularization of pipeline components to support repeatable onboarding of net-new data capabilities
  • Supporting dbt model and mart development (big plus) and maintaining analytics transformations in Snowflake


Core Responsibilities

  • Build and maintain serverless, containerized batch pipelines orchestrated via Prefect (similar to Airflow)
  • Expand ingestion and connectivity patterns across:
    • APIs
    • S3-based sources
    • SFTP infrastructure
    • Email scraping
    • Web scraping
  • Develop and enhance internal Python libraries used to standardize ingestion, transformation, and pipeline deployment patterns
  • Implement and improve data observability practices including monitoring, alerting, and failure diagnostics
  • Contribute to infrastructure-as-code using Terraform to support repeatable deployments and environment consistency
  • Support and improve the data warehouse ecosystem:
    • Snowflake as the primary data warehouse
    • dbt on Snowflake for modeling and analytics transformations
  • Collaborate closely with internal engineers through PR reviews, sprint workflows, and team standards
  • Operate within existing repos, processes, and CI/CD workflows to increase throughput while maintaining quality


Technical Environment

  • Python (expert level required)
  • Prefect (workflow orchestration)
  • AWS (cloud-native compute, containerized/serverless batch workloads)
  • Terraform (IaC)
  • Snowflake (data warehouse)
  • dbt (transformations and marts)
  • Highly integrated and customized platform with heavy API-based data flows


What Success Looks Like

  • Net-new ingestion capabilities are delivered faster without sacrificing reliability
  • Pipelines are more modular, reusable, and deployable through standardized patterns
  • Failures are easier to detect and debug through improved observability and testing
  • The platform becomes easier to maintain as codebases are refactored and hardened
  • Internal senior engineers retain architectural ownership while execution throughput increases

Requirements

Required Qualifications

  • 6+ years of experience in Data Engineering, Platform Engineering, or Software Engineering with strong data systems exposure
  • Expert-level Python skills with a track record of building production-grade libraries and services
  • Strong experience building and operating batch pipeline infrastructure in cloud environments (AWS preferred)
  • Experience with workflow orchestration tools such as Prefect, Airflow, Dagster, etc.
  • Strong understanding of data pipeline design: modularity, idempotency, retries, deployment patterns, and maintainability
  • Experience implementing data observability, monitoring, logging, alerting, and testing frameworks
  • Hands-on experience with Terraform or similar infrastructure-as-code tooling
  • Comfortable working in an embedded model: collaborating inside existing repos, PR workflows, and delivery processes


Preferred / Nice-to-Have

  • Strong experience with dbt (models, marts, testing, documentation)
  • Experience with Snowflake performance optimization and warehouse best practices
  • Experience with web scraping and/or email scraping pipelines
  • Familiarity with containerized workloads and serverless compute patterns
  • Strong instincts for platform refactoring, system hardening, and reliability engineering


Working Model / Team Approach

Our internal team retains ownership of architecture, modeling standards, and technical direction. This role operates as an embedded senior engineer within our workflows to accelerate delivery, increase throughput, and protect senior internal capacity—without compromising quality.

Equal Opportunity Statement

We are committed to building a diverse and inclusive team. We welcome applications from candidates of all backgrounds and are an equal opportunity employer. We provide equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, gender identity, sexual orientation, marital status, or veteran status.

Decision Foundry is a certified Salesforce integration partner, offering global consulting services to bridge data access, platform adoption, and business impact.
