We are looking for a Senior Data Engineer to join our team and lead the design, implementation, and maintenance of robust data pipelines and infrastructure. In this role, you will work closely with our data science and analytics teams to ensure data integrity and accessibility for both internal and client projects.
Work at Exadel - Who We Are
Since 1998, Exadel has been engineering its products and custom software for clients of all sizes. Headquartered in Walnut Creek, California, Exadel has 2,000+ employees in development centers across America, Europe, and Asia. Our people drive Exadel’s success and are at the core of our values.
About the Customer
The client is the largest Google digital consulting agency in Europe, operating exclusively on Google Cloud.
Requirements
- 5+ years of experience in data engineering or related roles
- Expertise in working with AlloyDB and PostgreSQL for database management and optimization
- Strong proficiency in SQL and SQL scripting for data manipulation and transformation
- Experience in designing and maintaining complex data pipelines
- Knowledge of database performance tuning and query optimization techniques
- Ability to work independently and proactively in a fast-paced environment
- Strong problem-solving skills and the ability to handle multiple projects simultaneously
Nice to Have
- Familiarity with Google Cloud Platform services such as Cloud Source Repositories, Cloud Build, and Cloud Run
- Experience with CI/CD pipelines, including tools like GitHub Actions and Cloud Build
- Knowledge of bash scripting for automation and task execution
- Proficiency with git for version control and collaboration on codebases
- Experience working with large-scale data processing systems in cloud environments
English level
Upper-Intermediate
Responsibilities
- Design, build, and maintain efficient, reliable, and scalable data pipelines for collecting, processing, and storing large datasets
- Develop and optimize databases, ensuring performance and security, primarily using AlloyDB and PostgreSQL
- Create complex SQL scripts and queries for data extraction, manipulation, and transformation
- Implement best practices for database design and ensure data quality and consistency
- Collaborate with data scientists and analysts to support their data infrastructure needs
- Ensure optimal performance of data systems, identifying and resolving bottlenecks
- Document processes, data workflows, and system architectures to ensure transparency and knowledge sharing
- Collaborate with the DevOps team to integrate data solutions with CI/CD pipelines, ensuring smooth deployment processes