We are looking for a Middle/Senior Data Engineer with a Java / Scala / Python background to join a project for a top-5 US retail broker (by number of users). The project covers trading experience, financial reporting, and risk management.
You will join a cross-functional team that excels at taking features from zero to production.
Key responsibilities:
1. Data Pipeline Development:
- Design, develop, and maintain robust data pipelines using Java within AWS infrastructure.
- Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark.
- Utilise Airflow for efficient workflow orchestration in complex data processing tasks.
- Ensure fast and interactive querying capabilities through the use of Presto.
2. Infrastructure Management:
- Containerise applications using Docker for streamlined deployment and scaling.
- Orchestrate and manage containers effectively with Kubernetes in production environments.
- Implement infrastructure as code using Terraform for provisioning and managing AWS resources.
3. Collaboration and Communication:
- Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals.
- Ensure data quality and reliability through robust testing methodologies and monitoring solutions.
- Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.
Requirements:
1. Education and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 4 years of hands-on experience in Java / Scala / Python development, with an emphasis on object-oriented principles.
2. Technical Proficiency:
- Proficient in Apache Spark or PySpark for large-scale data processing.
- Experience with Airflow for workflow orchestration in production environments.
- Familiarity with Docker for containerisation and Kubernetes for container orchestration.
- Knowledge of Terraform for infrastructure as code implementation in AWS environments.
- Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift.
- Strong background in SQL and relational databases, with proficiency in technologies like Postgres.
- Experience with streaming platforms such as Kafka for real-time data processing is a plus.
3. Communication Skills:
- Excellent English language communication skills, both verbal and written.
- Ability to collaborate effectively with technical and non-technical stakeholders.
Devexperts employees create the art of FinTech in comfortable working spaces located in modern business centers around the world.
Join our team in Porto and enjoy:
- Flexible schedules
- Paid vacation of 22 days
- Insurance coverage (for you and your children)
- Partial reimbursement for fitness memberships
- Meal vouchers
- Snacks and beverages always available
- Workspaces with modern equipment
- Integration in a multicultural environment