Data Engineer

We’re hiring a Data Engineer to design, build, and operate reliable end-to-end ETL/ELT pipelines and data models on Snowflake and Airflow. You’ll own how data is structured, stored, and transformed, consolidating data from databases, REST APIs, and files into trusted, well-modeled datasets with clear SLAs and ownership.

Responsibilities:

  • Design scalable data architectures and data models in Snowflake (staging, integration, marts).
  • Build robust ETL/ELT processes and pipelines: ingest from RDBMS/APIs/files (batch/incremental; CDC when applicable).
  • Develop modular SQL and Python transformations; handle semi-structured JSON; create consumer-ready tables/views.
  • Orchestrate pipelines with Airflow: DAG dependencies, retries, backfills, SLAs, monitoring, and alerting (see the sketch after this list).
  • Ensure idempotent re-runs/backfills; maintain runbooks and perform root-cause analysis (RCA) for incidents.
  • Optimize performance and cost in Snowflake (warehouse sizing, pruning; clustering when needed).
  • Collaborate with BI/Analytics to refine data definitions, contracts, and SLAs.
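
To make the orchestration and idempotency expectations concrete, here is a minimal illustrative sketch, not project code: an Airflow DAG with retries, an SLA, and backfill support, whose single task runs an idempotent Snowflake MERGE. The table names, schedule, and the snowflake_default connection are assumptions for the example.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

default_args = {
    "retries": 3,                           # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),              # flag runs that miss the SLA
}

with DAG(
    dag_id="orders_daily_load",             # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                      # Airflow 2.4+; schedule_interval on older versions
    catchup=True,                           # enables historical backfills
    default_args=default_args,
) as dag:
    # A MERGE keyed on the natural key makes re-runs and backfills idempotent:
    # replaying a day's partition updates rows in place instead of duplicating them.
    merge_orders = SnowflakeOperator(
        task_id="merge_orders",
        snowflake_conn_id="snowflake_default",
        sql="""
            MERGE INTO marts.orders AS t
            USING (
                SELECT order_id, amount, status, load_date
                FROM staging.raw_orders
                WHERE load_date = '{{ ds }}'
            ) AS s
            ON t.order_id = s.order_id
            WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.status = s.status
            WHEN NOT MATCHED THEN INSERT (order_id, amount, status, load_date)
                VALUES (s.order_id, s.amount, s.status, s.load_date)
        """,
    )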

Requirements:

  • Strong SQL as the core skill: designing and implementing production ETL/ELT processes and data models.
  • Python proficiency (must-have) for building data tooling, transformations, and integrations.
  • Experience as an ETL developer and data modeler: dimensional modeling, schema evolution, best practices for data storage.
  • Snowflake hands-on experience (must-have; expert level preferred): Streams/Tasks/Time Travel, performance tuning, JSON handling.
  • API integrations (auth, pagination, rate limits, idempotency); see the sketch after this list.
  • Advanced level of English.
  • Git-based CI/CD; privacy/GDPR basics.
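
As an illustration of the API-integration points above, here is a hedged Python sketch; the endpoint, field names, and pagination scheme are assumptions, not a real partner API. It shows token auth, cursor pagination, backing off on 429 rate limits via Retry-After, and an idempotency key for safely retried writes.

import time
import uuid

import requests

BASE_URL = "https://api.example.com/v1"        # hypothetical partner API

def fetch_events(session: requests.Session) -> list[dict]:
    """Walk a cursor-paginated endpoint, backing off on 429 rate limits."""
    rows, cursor = [], None
    while True:
        params = {"limit": 500, **({"cursor": cursor} if cursor else {})}
        resp = session.get(f"{BASE_URL}/events", params=params, timeout=30)
        if resp.status_code == 429:            # rate limited: honor Retry-After
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload["data"])
        cursor = payload.get("next_cursor")
        if cursor is None:                     # last page reached
            return rows

session = requests.Session()
session.headers["Authorization"] = "Bearer <token>"   # token auth (placeholder)
events = fetch_events(session)

# For mutating calls, a per-request Idempotency-Key makes retries safe:
resp = session.post(
    f"{BASE_URL}/exports",                     # hypothetical write endpoint
    json={"event_count": len(events)},
    headers={"Idempotency-Key": str(uuid.uuid4())},
    timeout=30,
)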

Will be a plus:

  • iGaming familiarity: stakes, wins, GGR/NGR, RTP, retention/ARPDAU, funnels; responsible gaming/regulatory awareness.
  • Interest or experience in AI/automation: Snowflake Cortex for auto-documentation, semantic search over logs/runbooks, parsing partner PDFs (with guardrails).
  • Exposure to cloud storage (GCS/S3/ADLS), Terraform/Docker, and BI consumption patterns (Tableau/Looker/Power BI).
  • Airflow proficiency: reliable DAGs, retries/backfills, monitoring, alert routing.

What we offer:

  • Direct cooperation on an already successful, long-term, and growing project.
  • Flexible work arrangements.
  • 20 days of vacation.
  • Truly competitive salary.
  • Help and support from our caring HR team.

Globaldev Group is a team of professionals specializing in building engineering teams for technology businesses in Western Europe, Israel, and the USA. We work only with long-term product teams for the EU, backed by investment and a modern technology stack. We founded the company in 2010 in Kharkiv and opened a Berlin office in 2017. Today there are 400+ of us, and we are constantly growing: as of 2024 we have development hubs in Israel, Ukraine, Portugal, Armenia, and Poland. If you’re interested in joining the team and taking part in decision-making that affects the product and business, let us know!
