Join us and lead data innovation at Riverflex
Are you looking to elevate your career in data engineering? Join Riverflex’s Data & Analytics (D&A) team as a Medior Consultant Data Engineering. In this role, you will be hands-on in developing end-to-end data workflows and solutions, leveraging your expertise to support client transformations and contribute to the development of our data engineering capabilities.
Why choose Riverflex?
• Collaborative Community: Immerse yourself in a vibrant and dynamic team that values social connections and teamwork, fostering a strong sense of belonging.
• Global Diversity: Thrive in an international and diverse work environment where team members bring various backgrounds and perspectives, creating a rich tapestry of experiences.
• Ownership and Impact: At Riverflex, we give you the autonomy and space to take ownership, drive tangible results, and make a lasting impact in the dynamic field of data engineering.
• Professional Development: We prioritize your growth with tailored opportunities to propel your career to new heights, ensuring you stay at the forefront of data engineering innovation through continuous learning.
• Client Impact: Collaborate with a diverse clientele, contributing to meaningful outcomes for our clients and shaping the ever-evolving digital landscape.
• Cutting-Edge Environment: Riverflex is dedicated to pushing the boundaries of what’s possible in the data engineering field. Join us in fostering a dynamic, creative space where your innovative ideas flourish.
• Career Growth: Experience dynamic career growth with opportunities to lead, mentor, and collaborate with a community of industry experts, shaping your own path while inspiring others on the journey toward data excellence.
• Global Reach: Engage with a wide array of clients and industries, transcending borders to contribute to our global technology initiatives.
Your key responsibilities
• Architect, develop, and maintain robust data solutions (from end-to-end workflows to entire data platforms) that support complex client needs.
• Apply programming skills in SQL, Python, and the Spark framework for data processing and real-time data ingestion.
• Lead the development of data workflows utilizing cloud-based technologies, particularly Azure and Databricks.
• Ensure data quality and implement robust testing in varied data environments (leveraging frameworks such as Great Expectations).
• Maintain version control and thorough documentation (e.g., Git, wikis) to support development processes.
• Apply an in-depth understanding of storage formats such as Parquet and Avro, along with practical knowledge of their use cases.
• Actively contribute within Agile frameworks (Scrum/Kanban) to ensure effective project delivery.
• Use DevOps, CI/CD, and infrastructure-as-code (IaC) tools and principles (e.g., Airflow, Terraform) for efficient deployment and maintenance.
• Collaborate closely with client teams and stakeholders, showcasing effective communication skills and a proactive approach.
• Work both on-site in Amsterdam and remotely, balancing flexibility and in-person collaboration.