Your Tasks
At Hypatos, we build vertical AI Agents that automate document-heavy back-office workflows across enterprise functions like finance, compliance, procurement, and customer service. To scale our impact, we are looking for a Senior Data Engineer who will architect and build robust, scalable data infrastructure powering our AI-driven automation systems.
This role is deeply technical and cross-functional: you’ll work closely with AI Engineers, Product, and Delivery teams to design distributed data pipelines, optimize real-time data flows, and ensure our systems are reliable, performant, and secure at scale.
Key Responsibilities
- Design and implement distributed data pipelines using Spark (PySpark), Kafka, and Kubernetes (an illustrative sketch follows this list).
- Build and maintain scalable data infrastructure for real-time and batch processing.
- Develop and optimize data models and storage solutions using ClickHouse and vector databases.
- Collaborate with AI engineers to support Retrieval-Augmented Generation (RAG) pipelines and other LLM-powered workflows.
- Ensure data reliability, performance, and security across all environments.
- Contribute to CI/CD workflows, testing frameworks, and monitoring solutions for data systems.
- Translate business requirements into scalable data solutions in collaboration with Product and Delivery teams.
- Mentor junior engineers and promote best practices in data engineering and distributed systems.
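For illustration only, here is a minimal sketch of the kind of streaming pipeline described in the first responsibility above: a PySpark Structured Streaming job that reads document events from a Kafka topic and writes parsed records to a sink. The topic name, event schema, broker address, and console sink are hypothetical placeholders, not a description of Hypatos' actual stack.

```python
# Minimal, illustrative sketch: consume document events from Kafka with PySpark
# Structured Streaming. Requires the spark-sql-kafka connector package at submit time.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (
    SparkSession.builder
    .appName("document-events-pipeline")  # hypothetical app name
    .getOrCreate()
)

# Assumed schema for incoming document-processing events (placeholder fields).
event_schema = StructType([
    StructField("document_id", StringType()),
    StructField("status", StringType()),
    StructField("processed_at", TimestampType()),
])

# Read the raw Kafka stream; broker and topic are placeholders.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "document-events")
    .load()
)

# Parse the JSON payload into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("event"))
       .select("event.*")
)

# A console sink keeps the sketch self-contained; a real pipeline would instead
# write to a warehouse such as ClickHouse and run on Kubernetes.
query = (
    parsed.writeStream
    .format("console")
    .outputMode("append")
    .trigger(processingTime="10 seconds")
    .start()
)
query.awaitTermination()
```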
Your Profile
- Attitude: Team player.
- Background: Degree in Computer Science, Software Engineering, or a related discipline (or equivalent professional experience).
- Experience: 5+ years in data engineering or backend systems, with a strong track record of building production-grade distributed systems.
- Distributed Systems: Deep understanding of distributed computing patterns and technologies (e.g., Spark, Kafka, Kubernetes).
- Data Infrastructure: Hands-on experience with ClickHouse, vector databases, and data modeling for high-performance applications.
- DevOps: Familiarity with CI/CD pipelines, GitHub workflows, Docker, and cloud-native environments.
- AI Integration: Exposure to RAG pipelines and LLM-powered systems is a strong plus.
- Problem-Solving: Strong analytical mindset with attention to detail, debugging skills, and performance optimization experience.
- Communication Skills: Ability to clearly articulate technical concepts and collaborate effectively across teams.
- Growth Mindset: Openness to learning new technologies and adapting to evolving problem spaces.
Nice to Have
- Experience with Python in production environments.
- Familiarity with AI/ML-powered products or enterprise automation platforms.
- Knowledge of cloud platforms (AWS, GCP, or Azure).
- Experience mentoring or leading engineering teams.
- Exposure to enterprise system integrations (e.g., SAP, Salesforce, Oracle).
Our Promise
- We trust amazing people to do amazing things and make a long-term impact: you get freedom and ownership of meaningful work that directly affects the business
- Beyond top-of-market compensation, you will enjoy a personal development budget, a meal allowance, sporting activities, and free beers, as well as a hybrid working model
- We're building a positive organizational culture where personal and professional growth are just as important as business growth
- We believe different perspectives make Hypatos a better community, which is why we're committed to building a diverse and inclusive environment where you feel you belong