As a (Senior) Data Engineer in our data engineering tribe, spanning our data platform, analytics, and regulatory reporting teams, you play a crucial role in owning our data stack. You will be instrumental in building and maintaining robust data pipelines, making critical data available to various stakeholders throughout the company, and contributing to our data-driven culture.
- Develop the scalable, cloud-based data platform essential to our data-driven company, leveraging the most up-to-date data technologies
- Shape an AWS-based data processing solution, ingesting data from our internal backend services as well as from third parties
- Extend our lakehouse by integrating data for regulatory use cases and empowering various teams in decision-making, driving innovation, performance optimization, and strategic objectives
- Prepare and clean structured and unstructured data and develop high-quality data models for reporting, advanced analytics, and AI use cases
- Design and implement testing and monitoring components to ensure the accuracy, reliability, and performance of data pipelines
- Collaborate closely with highly ambitious and skilled people in our growing data teams, as well as with product and engineering colleagues, to release smart features for our products
- Stay up to date with the latest industry developments in data engineering and cloud architecture, and share your expertise and best practices within the company
- University degree in computer science, mathematics, natural sciences, or a similar field
- Significant experience in data engineering
- Hands-on experience designing and operating data pipelines in AWS with services like S3, Athena, DMS, and Glue
- Excellent SQL skills, including advanced concepts such as window functions; experience with dbt is a plus
- Advanced knowledge of cloud networking & security (AWS IAM, VPC, Security Groups, ...)
- Proficiency and hands-on experience with Infrastructure as Code, ideally Terraform
- Strong programming skills in Python, ideally including Airflow and data processing frameworks such as PySpark
- Experience with data streaming technologies such as Kafka, Kinesis, Flink, and Spark Streaming is a plus
- Strong problem-solving and organizational skills
- Interest in financial services and markets
- Fluent English communication and presentation skills
- Be part of one of the fastest-growing and most visible Fintech startups in Europe, creating innovative services that have a substantial impact on the lives of our customers
- Work with an international, diverse, inclusive, and ever-growing team that loves creating the best products for our clients
- Be productive with the latest hardware and tools
- Learn and grow by joining our in-house knowledge sharing or career development sessions and spending your individual Education Budget
- Learn and experience German culture first-hand by joining our free German language classes
- International relocation support is provided if required
- Flexible vacation policy and the opportunity to work from abroad
- Benefit from an attractive compensation package and from the company pension scheme
- Monthly contribution of 50% towards the ‘Deutschland Jobticket’
- Say goodbye to order commissions and say hello to your complimentary subscription of Scalable Capital's PRIME+ Broker
- Enjoy flexible and discounted sports activities with Urban Sports Club