Shippeo is hiring a

Data Engineer

Paris, France
Full-Time

The Data Intelligence Tribe is responsible for leveraging data from Shippeo’s large shipper and carrier base to build data products and ML models that deliver predictive insights to our users (shippers and carriers alike). The tribe’s typical responsibilities are to ensure that users:

  • are accurately alerted in advance of any potential delays or anomalies on their multimodal flows, so that they can proactively address any resulting disruptions
  • can extract the data they need, access it directly, or analyze it on the platform to gain actionable insights that improve their operational performance and the quality and compliance of their tracking
  • benefit from best-in-class data quality, achieved through advanced cleansing and enhancement rules

As a Data Engineer at Shippeo, your objective is to ensure that data is available to, and usable by, our Data Scientists and Analysts across our data platforms. You will contribute to building and maintaining Shippeo’s modern data stack, which is composed of several technology blocks:

  • Data acquisition (Kafka, Kafka Connect, RabbitMQ)
  • Batch data transformation (Airflow, dbt)
  • Cloud data warehousing (Snowflake, BigQuery)
  • Stream/event data processing (Python, Docker, Kubernetes), plus all the underlying infrastructure that supports these use cases

Required:

  • A degree (MSc or equivalent) in Computer Science.
  • 3+ years of experience as a Data Engineer.
  • Experience building, maintaining, testing, and optimizing data pipelines and architectures.
  • Programming skills in Python and experience with asynchronous event processing (asyncio).
  • Advanced working knowledge of SQL, experience working with relational databases and familiarity with a variety of databases.
  • Working knowledge of message queuing and stream processing.
  • Knowledge of Docker and Kubernetes.
  • Knowledge of a cloud platform (preferably GCP).
  • Experience working with workflow management systems such as Airflow.

Desired: 

  • Experience with cloud-based data warehouse solutions (BigQuery, Snowflake).
  • Experience with Kafka and Kafka Connect (Debezium).
  • Experience with Infrastructure as code (Terraform/Terragrunt).
  • Experience building and evolving CI/CD pipelines with Github Actions.
  • Experience with monitoring and alerting using Grafana and Prometheus.
  • Experience working with Apache NiFi.

We are looking for talents who share our values:

  • 🚀 Ambition
  • 💙 Care
  • 🎯 Deliver
  • 🤝 Collaboration

Find out more about our values in Our Culture Book.

If you identify with our values and enjoy working in a fast-paced and international environment, Shippeo is just the place for you!

We are committed to fostering diversity and inclusion within our workplace as we value the unique perspectives and experiences that individuals from all backgrounds bring to our team. We are dedicated to providing equal employment opportunities to all candidates, regardless of their background or abilities, and our commitment to inclusion is reflected in our policies, practices, and workplace culture.

We understand that candidates may have unique needs or questions related to disability inclusion. To facilitate this, you can reach our dedicated Disability Advisor at [email protected] with any inquiries or requests for accommodations during the application process.

Apply for this job
