Kafka / Flink Engineer

Job Title: Kafka / Flink Engineer
Location: New York, NY
Job Description:
We are looking for a Kafka / Flink Engineer to design, build, and support real-time streaming data platforms. The ideal candidate will have strong hands-on experience with Apache Kafka and Apache Flink, and a solid understanding of distributed systems, stream processing, and data pipelines in production environments.
Key Responsibilities:
  • Design, develop, and maintain real-time data streaming solutions using Apache Kafka and Apache Flink
  • Build and optimize Kafka producers, consumers, topics, and stream processing applications
  • Develop stateful and stateless Flink jobs for real-time analytics and event processing
  • Ensure high availability, fault tolerance, and scalability of streaming platforms
  • Monitor, troubleshoot, and resolve performance and reliability issues in production
  • Collaborate with data engineers, platform teams, and application teams to integrate streaming solutions
  • Implement data quality, security, and governance best practices
  • Participate in code reviews, design discussions, and technical documentation
Required Skills & Qualifications:
  • Strong hands-on experience with Apache Kafka (Kafka Streams, Connect, Schema Registry)
  • Solid experience with Apache Flink for real-time stream processing
  • Strong programming skills in Java or Scala (Python is a plus)
  • Experience working with distributed systems and event-driven architectures
  • Knowledge of message serialization formats (Avro, Protobuf, JSON)
  • Experience with monitoring and logging tools for streaming systems
  • Good understanding of data modeling and streaming data patterns
Preferred / Nice to Have:
  • Experience with cloud platforms (AWS, Azure, or GCP)
  • Knowledge of containerization and orchestration (Docker, Kubernetes)
  • Experience with CI/CD pipelines for data platforms
  • Exposure to other streaming tools such as Spark Streaming or Pulsar
  • Understanding of security, authentication, and authorization in Kafka ecosystems


About Qode:
Qode is dedicated to helping technical talent around the world find meaningful careers that match their skills and interests. Our platform provides a range of resources and tools that empower job seekers to take control of their careers and connect with top employers across a variety of industries. We believe that every individual deserves to find work that they're passionate about, and we are committed to making that vision a reality.

Qode's team of experienced professionals is passionate about creating a better world of work by providing innovative solutions that improve the job search process for both job seekers and employers. We believe in transparency, trust, and collaboration, and we strive to build strong relationships with our customers and partners. Through our platform, we aim to create a more engaged and fulfilled global workforce that drives innovation and growth.
