IT Engineer Data Engineering

Overview

Design and maintain data ingestion pipelines from various systems, drive metadata service implementation for over 100 data sources, and ensure data quality management.

* Design, implement, and maintain secure and scalable data ingestion pipelines from a wide variety of source systems, including SAP, Salesforce, SharePoint, APIs, and (legacy) manufacturing platforms.
* Build and enhance metadata-driven services that enable discoverability, access governance, and operational transparency of enterprise data.
* Serve as a technical expert and cross-functional enabler for structured and unstructured data acquisition, quality, and compliance.
* Establish and maintain holistic data quality management, monitoring, and reporting.
* Contribute to a global data engineering team delivering to all major business domains.
* Drive ingestion and metadata service implementation for over 100 enterprise data sources.
* Collaborate across business IT, cybersecurity, infrastructure, and architecture teams to ensure secure and sustainable delivery.

 

Main Tasks: 

▪ Build and maintain Python- or Scala-based extraction services (e.g., Debezium Server, custom APIs, rclone).
  • Implement CDC, delta, and event-based ingestion patterns (a minimal delta pattern is sketched after this list).
  • Support push-based HTTP and Kerberos-authenticated DLT delivery.

▪ Establish, operate, and troubleshoot extraction from SAP using tools such as Theobald Xtract Universal.
▪ Integrate with systems such as Salesforce, SharePoint, and other API- or file-based endpoints.
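
The CDC/delta bullet above mentions delta-based ingestion; the following is a minimal, illustrative Python sketch of a watermark-based delta pull. The endpoint URL, field names, and local state file are hypothetical placeholders and not part of this posting.

```python
# Minimal watermark-based delta extraction sketch (hypothetical endpoint,
# field names, and state file): fetch only records changed since the last run.
import json
from datetime import datetime, timezone
from pathlib import Path

import requests  # assumed to be available in the extraction service runtime

STATE_FILE = Path("last_watermark.json")                 # hypothetical local state store
SOURCE_URL = "https://source.example.com/api/records"    # hypothetical source API


def load_watermark() -> str:
    """Return the last processed modification timestamp, or epoch for a first full load."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_modified"]
    return "1970-01-01T00:00:00Z"


def save_watermark(value: str) -> None:
    STATE_FILE.write_text(json.dumps({"last_modified": value}))


def extract_delta() -> list[dict]:
    """Fetch records modified after the stored watermark and advance the watermark."""
    watermark = load_watermark()
    response = requests.get(SOURCE_URL, params={"modified_after": watermark}, timeout=30)
    response.raise_for_status()
    records = response.json()
    if records:
        save_watermark(max(r["last_modified"] for r in records))
    return records


if __name__ == "__main__":
    batch = extract_delta()
    print(f"Extracted {len(batch)} changed records at {datetime.now(timezone.utc).isoformat()}")
```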

▪ Establish and maintain a business-friendly, web-accessible data catalog application, with dataset profiles, metadata, and usability features.
▪ Integrate dataset discoverability, preview/exploration options, and lineage information using Unity Catalog as a backend metadata system (see the sketch after this list).
▪ Design and implement structured access request workflows including request submission, approval chains, audit trail, and enablement triggers.
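
As a rough illustration of the Unity Catalog-backed metadata features above, the sketch below reads basic dataset profiles from a catalog's information_schema using the databricks-sql-connector package. The workspace hostname, warehouse HTTP path, access token, and catalog name are placeholders, and the selected columns are an assumption about what a catalog page would show.

```python
# Sketch: read dataset profiles (name, owner, comment) from Unity Catalog's
# information_schema to feed a web-accessible data catalog page.
from databricks import sql  # databricks-sql-connector package (assumed installed)

CONNECTION = dict(
    server_hostname="adb-0000000000000000.0.azuredatabricks.net",  # placeholder workspace
    http_path="/sql/1.0/warehouses/0000000000000000",              # placeholder SQL warehouse
    access_token="<personal-access-token>",                        # placeholder credential
)


def list_datasets(catalog: str = "main") -> list[dict]:
    """Return basic dataset profiles for every table in the given catalog."""
    query = f"""
        SELECT table_schema, table_name, table_owner, comment
        FROM {catalog}.information_schema.tables
        WHERE table_schema <> 'information_schema'
        ORDER BY table_schema, table_name
    """
    with sql.connect(**CONNECTION) as connection, connection.cursor() as cursor:
        cursor.execute(query)
        columns = [desc[0] for desc in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]


if __name__ == "__main__":
    for dataset in list_datasets():
        print(f"{dataset['table_schema']}.{dataset['table_name']} (owner: {dataset['table_owner']})")
```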

▪ Perform design reviews with Cybersecurity.
▪ Ensure documentation and compliance for all interfaces and data ingress points.
▪ Manage audit and traceability requirements.

▪ Collaborate closely with IT and business users to translate requirements into scalable technical patterns.
▪ Serve as the technical escalation point for complex source integrations.

▪ Define and implement a multi-layered data quality framework, including unit-level, integration-level, and cross-pipeline validation rules.
▪ Establish centralized and version-controlled storage of DQ rules, with integration into orchestration and CI/CD pipelines.
▪ Implement automated DQ monitoring with severity levels (Critical, High, Medium, Low) and enable flagging, filtering, and quarantining logic at the relevant stages of the pipeline (see the sketch after this list).
▪ Collaborate with source system owners and business stakeholders to define meaningful and actionable DQ thresholds.
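
The DQ monitoring bullet above mentions severity levels and quarantining; here is a minimal, framework-agnostic Python sketch of how such rules could be expressed and applied. The rule names, sample records, and thresholds are purely illustrative; in practice the rules would live in the version-controlled store and run from the orchestration and CI/CD stages described above.

```python
# Sketch: severity-tagged data-quality rules with flagging and quarantine routing.
from dataclasses import dataclass
from typing import Callable


@dataclass
class DQRule:
    name: str
    severity: str                   # "Critical" | "High" | "Medium" | "Low"
    check: Callable[[dict], bool]   # returns True when the record passes


# Illustrative rules only; real rules would be loaded from version control.
RULES = [
    DQRule("non_null_id", "Critical", lambda r: r.get("id") is not None),
    DQRule("plausible_qty", "High", lambda r: 0 <= r.get("qty", 0) <= 1_000_000),
    DQRule("has_source_tag", "Low", lambda r: bool(r.get("source"))),
]


def apply_rules(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Flag every failure; quarantine records with Critical/High failures,
    let Medium/Low failures pass through with flags for downstream filtering."""
    passed, quarantined = [], []
    for record in records:
        failures = [rule for rule in RULES if not rule.check(record)]
        record["dq_flags"] = [f"{rule.severity}:{rule.name}" for rule in failures]
        if any(rule.severity in ("Critical", "High") for rule in failures):
            quarantined.append(record)
        else:
            passed.append(record)
    return passed, quarantined


if __name__ == "__main__":
    sample = [{"id": 1, "qty": 10, "source": "SAP"}, {"id": None, "qty": 5}]
    ok, bad = apply_rules(sample)
    print(f"passed={len(ok)} quarantined={len(bad)}")
```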

 

 

Qualifications:

  • Degree in Computer Science, Data Engineering, or a related field; Azure or Databricks certification is a plus.
  • 5–8 years in data engineering, with hands-on experience ingesting structured and unstructured enterprise data into modern cloud platforms.
  • Proven implementation of source system ingestion frameworks, metadata automation, and compliance-controlled interfaces.
  • Leadership experience is not required; however, experience mentoring junior developers or leading implementation workstreams is a plus, as are contributions to engineering standards and code quality improvement initiatives.
  • Comfortable working across geographies and time zones; collaborates effectively with global teams and enterprise stakeholders.

The well-being of our employees is important to us. That's why we offer exciting career prospects and support you in achieving a good work-life balance with additional benefits such as:

  • Training opportunities
  • Mobile and flexible working models
  • Sabbaticals

and much more...

Sound interesting? Click here to find out more.

 

Diversity, Inclusion & Belonging are important to us and make our company strong and successful. We offer equal opportunities to everyone - regardless of age, gender, nationality, cultural background, disability, religion, ideology or sexual orientation.

Ready to drive with Continental? Take the first step and fill in the online application.


Continental develops pioneering technologies and services for the sustainable and connected mobility of people and goods. Founded in 1871, the technology company offers safe, efficient, intelligent, and affordable solutions for vehicles, machines, traffic, and transportation. In 2021, Continental generated sales of 33.8 billion euros and currently employs more than 190,000 people in 58 countries and markets. On October 8, 2021, the company celebrated its 150th anniversary. The Automotive group sector comprises technologies for passive safety, brake, chassis, motion, and motion control systems. The portfolio also includes innovative solutions for assisted and automated driving, display and operating technologies, and audio and camera solutions for the vehicle interior, as well as intelligent information and communication technology for the mobility services of fleet operators and commercial vehicle manufacturers. The range of products and services is rounded out by comprehensive activities relating to connectivity technologies, vehicle electronics, and high-performance computers.
