Senior Data Engineer, AWS Lakehouse


We're building a modern data platform on AWS to consolidate data from Salesforce, monday.com, and operational databases into a unified lakehouse architecture. You'll own the design, build, and operation of this platform from ingestion through BI delivery. This is a hands-on role — you'll write the code, not just draw the diagrams.


What You'll Do

  • Build and maintain a medallion architecture (landing/integrated/reporting) on S3 using Apache Iceberg
  • Develop and operate data pipelines using dbt-core running against Athena
  • Implement ingestion patterns using AWS AppFlow, DMS, and custom Lambda functions
  • Design incremental load and merge logic for CDC data streams
  • Optimize query performance for BI consumption (partitioning, aggregation, file compaction)
  • Build orchestration workflows using Step Functions
  • Establish data quality testing, monitoring, and alerting
  • Support Power BI integration and troubleshoot refresh issues
  • Document data models and maintain lineage


What You Bring

  • 5+ years working with cloud data platforms (AWS strongly preferred)
  • Production experience with dbt (dbt-core or dbt Cloud)
  • Strong SQL skills including window functions, CTEs, and incremental/merge patterns
  • Hands-on experience with at least one modern table format (Iceberg, Delta, or Hudi)
  • Familiarity with Athena, Glue, or similar serverless query engines
  • Experience with CDC pipelines and handling SaaS data sources
  • Working knowledge of Python for custom ingestion and tooling
  • Understanding of BI tool requirements and how to optimize serving layers for reporting
  • Upper-Intermediate/B2 English across all four skills


Nice to Have

  • Experience with Salesforce and/or monday.com data extraction
  • Infrastructure-as-code (Terraform, CloudFormation)
  • Cost optimization on AWS
  • Data governance and cataloging practices


What You Won't Do

  • Manage Spark clusters or heavy distributed compute infrastructure
  • Build real-time streaming pipelines (this is batch/incremental)
  • Own the BI layer — you'll support it, not build dashboards


Location: Remote (Europe / CET Time Zone preferred), with the ability to overlap with EST working hours

Type: Hourly Contractor, full-time
