Senior Data Engineer

TL;DR

Design and build scalable data pipelines and streaming architectures using Apache Kafka to support real-time analytics and data-driven applications.

Job Summary
We are looking for a Senior Data Engineer with strong experience in real-time data streaming using Apache Kafka. The ideal candidate will design and build scalable data pipelines, streaming architectures, and data platforms to support real-time analytics and data-driven applications.

Key Responsibilities
·      Design, develop, and maintain real-time data pipelines using Apache Kafka.
·      Build scalable stream processing solutions using Kafka Streams, Spark Streaming, or Flink.
·      Develop and optimize ETL/ELT pipelines for large-scale data processing.
·      Integrate streaming pipelines with data lakes, data warehouses, and analytics platforms.
·      Work with cross-functional teams including Data Scientists, Product Managers, and Analysts.
·      Ensure data reliability, scalability, and performance optimization.
·      Implement data governance, monitoring, and alerting for streaming systems.
·      Troubleshoot production issues and optimize streaming architecture.

Required Skills
·      8+ years of experience in Data Engineering.
·      Strong hands-on experience with Apache Kafka and Kafka Streams.
·      Experience with Python, Java, or Scala.
·      Strong knowledge of real-time streaming architectures.
·      Experience with Spark, Flink, or Kafka Streams.
·      Strong SQL and data modeling skills.
·      Experience working with cloud platforms (AWS, GCP, or Azure).
·      Familiarity with data lake and warehouse technologies (S3, Delta Lake, Snowflake, etc.).

Preferred Skills
·      Experience with Databricks or Snowflake.
·      Experience with containerization (Docker, Kubernetes).
·      Knowledge of CI/CD pipelines and DevOps practices.
·      Experience with monitoring tools like Prometheus, Grafana, or Splunk.

Nice to Have
·      Experience with event-driven architecture.
·      Experience building high-throughput, low-latency data systems.


Qode is a technology-driven platform that transforms how recruiters and candidates connect by leveraging data and automation. Our solutions streamline the hiring process through machine learning, creating private talent pools and automating workflows, ultimately enhancing the quality of candidate evaluation and decision-making. With our no-code tools, we empower organizations to develop tailored recruitment strategies without needing extensive technical skills.
