The Data Intelligence Tribe is responsible for leveraging data from Shippeo’s large shipper and carrier base to build data products that serve our users (shippers and carriers alike), as well as ML models that provide predictive insights. Typically, the tribe helps users:
get accurate, advance alerts about potential delays or anomalies on their multimodal flows, so they can proactively anticipate any resulting disruptions
extract the data they need, access it directly, or analyze it on the platform to gain actionable insights that improve their operational performance and the quality and compliance of their tracking
benefit from best-in-class data quality, delivered through advanced cleansing and enhancement rules
As a Data Engineer at Shippeo, your objective is to ensure that data is available and usable by our Data Scientists and Analysts across our data platforms. You will contribute to building and maintaining Shippeo’s modern data stack, which is composed of several technology blocks:
Data acquisition (Kafka, Kafka Connect, RabbitMQ),
Batch data transformation (Airflow, dbt),
Cloud data warehousing (Snowflake, BigQuery),
Stream/event data processing (Python, Docker, Kubernetes), plus all the underlying infrastructure that supports these use cases.
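To make the streaming block concrete, here is a minimal sketch of the kind of event consumer such a stack implies, using the confluent-kafka Python client. The broker address, topic, and consumer group names are illustrative assumptions, not Shippeo’s actual configuration.

```python
# Minimal stream-processing sketch (illustrative only).
# Assumes a local Kafka broker; topic and group names are hypothetical.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed local broker
    "group.id": "tracking-events-demo",     # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["tracking-events"])     # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)    # wait up to 1s for a message
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # Placeholder: a real pipeline would cleanse/enrich the event here
        print(f"Received event: {event}")
finally:
    consumer.close()
```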
Required:
A degree (MSc or equivalent) in Computer Science.
3+ years of experience as a Data Engineer.
Experience building, maintaining, testing, and optimizing data pipelines and architectures.
Programming skills in Python.
Advanced working knowledge of SQL, experience with relational databases, and familiarity with a variety of database technologies.
Working knowledge of message queuing and stream processing.
Advanced knowledge of Docker and Kubernetes.
Advanced knowledge of a cloud platform (preferably GCP).
Advanced knowledge of a cloud based data warehouse solution (preferably Snowflake).
Experience with infrastructure as code (Terraform/Terragrunt).
Experience building and evolving CI/CD pipelines (GitHub Actions).
Desired:
Experience with Kafka and Kafka Connect (Debezium).
Experience with monitoring and alerting using Grafana and Prometheus.
Experience working with Apache NiFi.
Experience working with workflow management systems such as Airflow.
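For context on the batch side of the stack, below is a minimal sketch of an Airflow DAG that runs dbt. The schedule, project path, and task names are assumptions for illustration; they do not describe Shippeo’s actual pipelines.

```python
# Minimal batch-orchestration sketch (illustrative only).
# Assumes dbt is installed on the worker and a project lives at /opt/dbt.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_run",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test  # run models first, then test them
```

In practice, the dbt commands would target whichever warehouse (e.g. Snowflake) is configured in the project’s profiles.yml.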
Application Requirements:
Since Shippeo operates internationally, please submit your CV in English. We’re eager to hear from you if you’re ready to take on a challenge and grow your career in a supportive, innovative environment.
This job is based in Paris, but we are open to candidates working remotely within France.
Recruitment Process:
Interview with our Talent Acquisition Manager
Interview with the Hiring Manager
Business Case
Final Interview
We are looking for talents who share our values:
🚀 Ambition
💙 Care
🎯 Deliver
🤝 Collaboration
Find out more about our values in Our Culture Book
Discover your Dream Team!
Meet our Shippians and watch their videos to learn more about their roles at Shippeo!
Diversity Statement
We are committed to fostering diversity and inclusion within our workplace as we value the unique perspectives and experiences that individuals from all backgrounds bring to our team. We are dedicated to providing equal employment opportunities to all candidates, regardless of their background or abilities, and our commitment to inclusion is reflected in our policies, practices, and workplace culture.
We understand that candidates may have unique needs or questions related to disability inclusion. To facilitate this, you can reach our dedicated Disability Advisor at [email protected] with any inquiries or requests for accommodations during the application process.