We are looking for exceptional candidates to join our growing Cloud Data Engineering team at Merkle to focus on delivering performant solutions for our clients. Successful candidates will understand modern data and cloud platform architecture and be able to solve complex technical problems to deliver on clients’ requirements. You will be a competent and enthusiastic team player, developing sensible, scalable solutions that utilize the appropriate technology and exemplify engineering best practices.
Under the direction of a Solutions Architect or a Lead Engineer, the postholder will:
- Be confident in implementing modern cloud solutions utilizing the full range of cloud services and applications.
- Be proficient in translating solution design materials into concrete implementation tasks then working autonomously, or as part of a team, to complete those tasks.
- Work closely with local leadership as well as our global teams, bringing key industry and product expertise and knowledge to shape and deliver client solutions.
- Support our Data Scientists, E-commerce and Decisioning teams, and wider Merkle capability teams working on both external and internal projects.
- Provide technical expertise and consultancy to clients to help them understand the role of data in their organization and how to unlock its potential.
You will have a deep understanding of distributed computing, data and application architectures, basic networking, security, and infrastructure. You will be able to develop high quality, testable code to drive production-strength data pipelines and scalable systems.
You will bring a knowledge of data and cloud engineering and be able to implement data pipelines, transformations, and data models.
Essential:
- You will be an enthusiastic supporter of the Google Cloud Platform and well informed about how it can offer value to clients.
- Understanding of applied Cloud Data Security patterns and best practices.
- Implementation of Data Lake and Data Zone patterns in GCP.
- A strong understanding of data modelling, data structures, databases, and data ingestion and transformation pipelines.
- Knowledge and expertise in Python plus other programming and scripting languages (Bash/PowerShell). You will be completely at ease creating and managing virtual environments and dependencies, and structuring small to large codebases.
- Comfortable working in Linux environments, able to navigate, update and manage systems from a shell.
- DevOps skills such as CI/CD pipelines (YAML), Git (including common branching patterns), and GitHub Actions for workflow automation, implemented with security in mind.
- Infrastructure as Code using Terraform and/or gcloud CLI scripts.
- Comfortable using Docker and Docker Compose. Kubernetes knowledge would be an advantage.
- SQL skills: able to write, use and troubleshoot SQL queries, with an understanding of T-SQL for managing database resources.
- Ability to stay organized across several projects at the same time.
- Ability to create and run tests for your work.
Beneficial:
- Formal GCP Certification or willingness to commit to achieving certification.
- Willingness to prepare and present at 'Show and Tell' project review sessions.
Benefits:
⛺ 5 weeks of vacation + 3 wellness days
❤️ 2 Volunteering days to share the kindness of your heart with others
⏰ Flexible working hours and home office
🎓 Full access to Dentsu Academy and on-site learning sessions
🍹 Team events: company parties, monthly breakfasts, and pub quizzes
🥪 Snacks and drinks at the office
💸 Referral bonus programme
💻 Laptop and equipment
#LI-Hybrid
Merkle is an equal opportunity employer. We do not discriminate based on sex, gender identity, race, colour, national origin, religion, sexual orientation, disabilities or any other protected basis, because we believe great people come from all walks of life. We aspire to foster a community in which diversity is valued in both our employees and our ideas.