Allegro is hiring a

Supply Chain Data Science Team Leader

Warsaw, Poland
Full-Time

We offer a contract of employment and a hybrid work model (3 days a week in the office).

Additionally, depending on your annual assessment and the company's results:

Annual bonus of up to 20% of your gross annual salary

Long-term discretionary incentive plan based on Allegro.eu shares

Welcome to the Delivery Experience (DEX) Engine domain. At DEX, we build the technology that makes Allegro deliveries easy and reliable. Our expertise spans areas such as geospatial analysis and origin-destination volume management (ODVM), where we combine GIS spatial databases, advanced statistical software, BigQuery, and Machine Learning (ML). This lets us uncover hidden patterns and trends in spatial data, improving route planning, asset allocation, and our understanding of the areas we serve. We also apply strategic thinking and modeling techniques to network planning, designing optimal delivery routes and managing our distribution network. Finally, we use predictive analytics and machine learning for delivery time prediction. In short, we turn large volumes of data into actionable insights that make a real difference in our business operations and decision-making.
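To give a flavour of the geospatial work described above, here is a minimal, purely illustrative sketch (not Allegro production code) of a haversine distance calculation — the kind of primitive that route planning and service-area analysis build on:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Warsaw -> Kraków: roughly 250 km as the crow flies
print(round(haversine_km(52.2297, 21.0122, 50.0647, 19.9450)))
```

In practice, computations like this run at scale inside a GIS database or BigQuery rather than in application code; the snippet only shows the underlying geometry.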

With Allegro’s expansion into new markets and increased reliance on our own delivery networks, your leadership will ensure that our operations scale efficiently while reducing costs and improving delivery times. As an Engineering Manager, you will lead a team of data scientists and engineers, working closely with product, operations, and other engineering teams to build predictive models, optimize delivery flows, and ensure cost-effective parcel routing.

We are looking for people who:

  • Are familiar with the domain of volume management (3+ years of experience)
  • Have experience in managing a technical team, can set and deliver realistic, yet ambitious goals (3+ years of experience)
  • Have strong analytical skills to interpret data and drive decision-making
  • Have expertise in Python, Data Science, Big Data and cloud solutions (preferably GCP)
  • Have good communication and interpersonal skills
  • Have a positive attitude and ability to work in a team
  • Are able to work closely with the business (can reconcile technical and business issues)
  • Are eager to constantly develop and broaden their knowledge
  • Know English at a minimum B2 level

Additionally, we offer:

  • Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
  • English classes, paid for by us, related to the specific nature of your job
  • 16" or 14" MacBook Pro with an M1 processor and 32 GB RAM, or a corresponding Dell with Windows (if you don’t like Macs), and any other gadgets you may need
  • Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise
  • A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things
  • Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)

In your daily work you will handle the following tasks:

 

  • Team Leadership: Lead and mentor a team of data scientists and engineers focused on volume management solutions, ensuring efficient resource utilization and cost control
  • Collaboration: Work closely with the ODVM Product Manager, volume management, and operational teams to prioritize tasks and align with business goals
  • Machine Learning Models: Collaborate with the Data Science Hub to build and maintain machine learning models for delivery optimization, including load balancing, route optimization, and delivery time prediction
  • Operational Excellence: Identify and mitigate supply chain bottlenecks, manage volume load across different routes, and dynamically allocate volume between Allegro’s own operations (AOK) and third-party carriers
  • Innovation: Develop cutting-edge data products to support analytics and business decision-making, focusing on geospatial data, graph databases, and cost arbitrage
  • Stakeholder Management: Liaise with cross-functional teams including engineering, logistics, and analytics to ensure cohesive execution of the ODVM strategy
  • Performance Metrics: Oversee key performance indicators like depot utilization, delivery time prediction accuracy, cost per parcel, and One Box utilization
  • Data Management: Collaborate with engineers to ensure the availability, integrity, accuracy, and reliability of data and pipelines
  • Data Processing & Automation: Use Python, PySpark, BigQuery, XGBoost, and TensorFlow to process petabytes of data and automate data workflows
  • Research Participation: Take part in research activities whose findings and recommendations directly influence the development direction of search products
  • Delivery Time Predictions: Apply advanced data analysis and machine learning to predict delivery times, including crafting algorithms and running simulations
  • Communication/Data Visualization: Communicate advanced data insights clearly and visually to non-technical stakeholders through written reports and presentations
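To illustrate the delivery-time prediction task above (a hypothetical sketch only — not the team's actual model), a common baseline before any ML model is an empirical quantile of historical transit times per route:

```python
def delivery_time_estimate(transit_hours, q=0.9):
    """Pessimistic delivery-time estimate for a route: the q-th empirical
    quantile (nearest-rank) of historical transit times in hours.
    A feature-based ML model (e.g. gradient boosting) would later
    replace this baseline."""
    if not transit_hours:
        raise ValueError("need at least one historical observation")
    s = sorted(transit_hours)
    # nearest-rank quantile: index q * n, clamped to the last element
    idx = min(int(q * len(s)), len(s) - 1)
    return s[idx]

# Made-up historical transit times (hours) for one route
history = [22, 24, 25, 26, 27, 30, 31, 48]
print(delivery_time_estimate(history, q=0.9))  # -> 48
```

The value of a high quantile (rather than the mean) is that the promise shown to a buyer is rarely broken; the trade-off is a less competitive advertised delivery time, which is exactly what an ML model then tightens.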

 

Why it is worth working with us:

  • We work with cutting-edge technologies to optimize logistics processes and improve delivery experiences, including automated route optimization, real-time tracking, data & geospatial technology and machine learning models for demand prediction
  • Depending on the team's needs, we use the latest versions of Java, Scala, Kotlin, Groovy, Go, Python, Spring, Spark, Mesos, TensorFlow, JavaScript/TypeScript, and React
  • Our IT team includes over 1700 members who actively share their knowledge at various conferences and co-create a blog: allegro.tech
  • We operate with several thousand microservices, handling 1.8m+ rps on our business data bus
  • Big Data – managing several petabytes of data and leveraging Machine Learning in production environments
  • We practice Code Review, Continuous Integration, Scrum/Kanban, Domain Driven Design, Test Driven Development, and Pair Programming, depending on the team
  • Our internal ecosystem is founded on self-service and widely used tools like Kubernetes, Docker, Consul, GitHub, and GitHub Actions. From day one, you will be able to develop software using any language, architecture, and scale, limited only by your creativity and imagination
  • To match the scale, we build entire Platforms of tools and technologies that accelerate and facilitate day-to-day development while ensuring the best Developer Experience for our teams
  • Technological autonomy: you get to choose which technology solves the problem at hand without needing management’s consent. You are responsible for what you create
  • Our deployment environment includes a combination of private Data Centers (tens of thousands of servers) and Public Clouds (Google Cloud and Microsoft Azure)
  • We have over 100 original open source projects and a few thousand stars on GitHub
  • We organize the Allegro Tech Live event, a 100% remote version of our offline Allegro Tech Talks meetups, and make guest appearances at various tech communities
  • We prioritize continuous development. We organize hackathons and internal conferences like the annual Allegro Tech Meeting. Our employees regularly attend events in Poland and abroad (Europe and USA), with each team having its own budget for training and study aids. If you want to keep growing and share your knowledge, we will always support you

This may also interest you

Allegro Tech Podcast → https://podcast.allegro.tech/

Booklet → https://jobs.allegro.eu/job-areas/tech-data/ 

 

Send in your CV and see why it is #dobrzetubyć (#goodtobehere)

Apply for this job
