Join a dynamic team to develop and optimize ETL processes, maintain data infrastructure, and enhance user experience with modern technologies like Airflow and FastAPI.
Ajax Systems is a full-cycle company, working from idea generation and R&D to mass production and sales. We do everything: we produce physical devices (the system includes many different sensors and hubs), write firmware for them, develop the server side, and release mobile applications. The whole team is in one office in Kyiv, and all technical and product decisions are made locally. We’re looking for a Data Engineer to join us and continue the evolution of a product that we love: someone who takes pride in their work and ensures that user experience and development quality stay superb.
We are looking for a Middle Data Engineer to join our team and take responsibility for developing and optimizing ETL processes, working with relational databases, and maintaining our existing data infrastructure.
The role includes a balance of data engineering and backend engineering tasks: building and maintaining Airflow pipelines, troubleshooting and optimizing SQL queries in PostgreSQL/MySQL, and occasionally extending REST APIs in FastAPI/Django to provide data services.
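For a concrete sense of the pipeline side of the role, here is a minimal sketch of a daily ETL DAG, assuming Airflow 2.4+ (for the `schedule` parameter); the DAG id, schedule, and task bodies are illustrative placeholders rather than anything from our actual codebase:

```python
# Minimal daily ETL sketch, assuming Airflow 2.4+. All names and
# task bodies here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw rows from a source system.
    print("extracting rows from the source database")


def load():
    # Placeholder: write transformed rows to the warehouse.
    print("loading rows into the target table")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```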
Tasks and responsibilities:
- Design, develop, and maintain ETL pipelines;
- Troubleshoot queries to address critical production issues;
- Assist other team members in refining complex queries and performance tuning;
- Extend existing REST API services using FastAPI/Django (a minimal sketch follows this list);
- React to monitoring alerts and ensure the stability and reliability of pipelines.
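As an illustration of the API work listed above, the sketch below shows the shape of a small read-only FastAPI endpoint. The route, response model, and in-memory data stand-in are hypothetical; in production the handler would run an aggregated query against PostgreSQL/MySQL instead:

```python
# Hypothetical read-only data endpoint; run with `uvicorn module:app`.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class DeviceEventCount(BaseModel):
    device_id: int
    event_count: int


# In-memory stand-in for a real database layer (illustration only).
FAKE_COUNTS = {1: 42, 2: 7}


@app.get("/devices/{device_id}/event-count", response_model=DeviceEventCount)
async def get_event_count(device_id: int) -> DeviceEventCount:
    # A real implementation would execute an aggregated SQL query here.
    if device_id not in FAKE_COUNTS:
        raise HTTPException(status_code=404, detail="device not found")
    return DeviceEventCount(device_id=device_id, event_count=FAKE_COUNTS[device_id])
```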
Requirements:
- 3+ years of experience in Data Engineering.
- Strong knowledge of SQL (optimization, indexing, query performance tuning; see the illustrative tuning sketch after this list).
- Hands-on experience with Airflow (or equivalent orchestration tools).
- Knowledge of Python and experience with data-related libraries.
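To make the SQL tuning requirement concrete, here is a hedged sketch of a typical tuning loop using psycopg2 against PostgreSQL; the `events` table, its columns, the DSN, and the index name are all invented for the example:

```python
# Hypothetical query-tuning workflow; table, columns, DSN, and index
# name are invented. Requires psycopg2 and a reachable PostgreSQL.
import psycopg2

conn = psycopg2.connect("dbname=analytics user=etl")  # assumed DSN
with conn, conn.cursor() as cur:
    # Step 1: inspect the plan. A sequential scan over a large table
    # filtered on created_at usually signals a missing index.
    cur.execute(
        "EXPLAIN ANALYZE "
        "SELECT device_id, count(*) FROM events "
        "WHERE created_at >= now() - interval '1 day' "
        "GROUP BY device_id"
    )
    for (line,) in cur.fetchall():
        print(line)

    # Step 2: index the filter column, then re-run EXPLAIN to confirm
    # the planner switches from a sequential scan to an index scan.
    cur.execute(
        "CREATE INDEX IF NOT EXISTS idx_events_created_at "
        "ON events (created_at)"
    )
```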
Nice to have:
- Experience in working with large volumes of data and databases.
- Familiarity with REST API development (preferably FastAPI / Django / DRF).
- Experience with monitoring and logging tools (Elasticsearch, Kibana).
- Experience with Docker and CI/CD pipelines.
- Understanding of cloud data platforms (AWS/GCP/Azure).