Data Engineer

TL;DR

Build and optimize scalable data pipelines and Lakehouse solutions using AWS and Databricks for a leading banking organization, ensuring strong data security and governance.

Role Summary

We are hiring a Data Engineer to build and optimize scalable data pipelines and Lakehouse solutions using AWS & Databricks for a leading organization in the banking/financial services domain.
This is an excellent opportunity to work on enterprise-grade data platforms with strong requirements in security, governance, and performance.


Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines on Databricks using PySpark, Spark SQL, and Delta Lake
  • Build reliable ingestion frameworks using AWS services:
      ◦ S3, Glue, Lambda, Step Functions
      ◦ Kafka/MSK or Kinesis (streaming ingestion)
      ◦ Integration with on-prem databases / RDS / Redshift
  • Automate workflows using Databricks Workflows, Airflow, or similar tools
  • Optimize Lakehouse performance (partitioning, Delta optimization, cost/compute tuning)
  • Implement data quality checks, monitoring, and incident troubleshooting for production pipelines
  • Apply governance and security controls (PII protection, access control, audit readiness)
  • Collaborate with data analysts/scientists and business stakeholders to deliver trusted datasets


Requirements

Must-have

  • 2-5+ years of Data Engineering experience
  • Strong hands-on experience with:
      ◦ AWS (S3, Glue, Lambda, IAM, Step Functions)
      ◦ Databricks (PySpark, Delta Lake, Workflows)
      ◦ Python & SQL
  • Good understanding of data modeling (relational/dimensional)
  • Experience working in large-scale distributed data environments

Nice-to-have

  • Banking/financial domain experience (core banking, payments, lending, reporting)
  • Streaming experience (Kafka/MSK, Kinesis, Spark Structured Streaming)
  • Governance/catalog tools (Unity Catalog preferred)
  • IaC experience (Terraform / AWS CDK)
  • Familiarity with compliance/security requirements in regulated environments

👉 Our Benefit Packages:
  • Attractive salary range; we are open to negotiation if you're a strong fit
  • Work equipment support
  • Allowance for Certification & Skill Development
  • Year-end bonus & performance-based rewards
  • 22 paid leave days from your 5th year - take a full month off
  • Career growth with personal coaching sessions
  • Open, collaborative team culture - no micromanagement, only trust
  • Tools & AI-powered workflows that make remote work easier

Qode is a technology-driven platform that transforms how recruiters and candidates connect by leveraging data and automation. Our solutions streamline the hiring process through machine learning, creating private talent pools and automating workflows, ultimately enhancing the quality of candidate evaluation and decision-making. With our no-code tools, we empower organizations to develop tailored recruitment strategies without needing extensive technical skills.
