Devexperts is hiring a

Middle/Senior Data Engineer

Sofia, Bulgaria
Full-Time

We are looking for a Middle/Senior Data Engineer with a Java / Scala / Python background to join a project for a top-5 US retail broker (by number of users). The project covers the trading experience, financial reporting, and risk management.

You will join a cross-functional team that excels in getting features done from zero to production.

Key responsibilities:

1. Data Pipeline Development:

  • Design, develop, and maintain robust data pipelines using Java within AWS infrastructure.
  • Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark.
  • Utilise Airflow for efficient workflow orchestration in complex data processing tasks.
  • Ensure fast, interactive querying capabilities using Presto.

2. Infrastructure Management:

  • Containerise applications using Docker for streamlined deployment and scaling.
  • Orchestrate and manage containers effectively with Kubernetes in production environments.
  • Implement infrastructure as code using Terraform for provisioning and managing AWS resources.

3. Collaboration and Communication:

  • Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals.
  • Ensure data quality and reliability through robust testing methodologies and monitoring solutions.
  • Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.

Requirements:

1. Education and Experience:

  • Bachelor's degree in Computer Science, Engineering, or related field.
  • Minimum 4 years of hands-on experience in Java / Scala / Python development, with an emphasis on object-oriented principles.

2. Technical Proficiency:

  • Proficient in Apache Spark or PySpark for large-scale data processing.
  • Experience with Airflow for workflow orchestration in production environments.
  • Familiarity with Docker for containerisation and Kubernetes for container orchestration.
  • Knowledge of Terraform for infrastructure as code implementation in AWS environments.
  • Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift.
  • Strong background in SQL and relational databases, with proficiency in technologies like Postgres.
  • Experience with streaming platforms such as Kafka for real-time data processing is a plus.

3. Communication Skills:

  • Excellent English language communication skills, both verbal and written.
  • Ability to collaborate effectively with technical and non-technical stakeholders.
What we offer:

  • Paid vacation: 20 + 5 days
  • Free MultiSport card
  • Medical insurance – premium package
  • Modern office space
  • Panoramic view of Vitosha mountain
  • Gym & billiard in the office
  • Parking spot or public transport card
  • Mentorship program
  • Training, courses, workshops
  • Paid professional certifications
  • Subscriptions to professional resources
  • Participation in conferences
  • English courses
  • Trading contest within the company
  • Tech meetup dxTechTalk
  • Speaker's club
  • Opportunity to develop your personal brand as a speaker
  • Internal referral program
  • Remote work / Hybrid mode
  • Flexible schedule
  • Work & Travel program
  • Relocation opportunities
Apply for this job
