GCP Data Engineer

TLDR

Build and optimize scalable data pipelines using GCP services, leveraging tools like BigQuery and Cloud Composer while ensuring compliance and data quality.

About the Role
We are looking for a skilled GCP Data Engineer with strong expertise in BigQuery, Dataflow, Cloud Composer, Python, and SQL to build and optimize scalable data pipelines on Google Cloud. The ideal candidate will have hands-on experience in ETL migration from legacy systems and exposure to AI-native engineering tools to accelerate development and improve productivity.

Role and responsibilities
·      Develop and maintain scalable data pipelines using GCP services
·      Build and optimize ETL/ELT workflows using Dataflow and BigQuery
·      Orchestrate workflows using Cloud Composer (Apache Airflow)
·      Perform data migration from legacy systems (e.g., Teradata, on-prem databases) to GCP
·      Develop reusable and efficient Python-based data processing frameworks
·      Write optimized and complex SQL queries for data transformation and analytics
·      Leverage AI-native engineering tools (e.g., code assistants, automated testing, query optimization tools) to improve engineering throughput
·      Ensure data quality, validation, and governance compliance
·      Monitor and troubleshoot data pipelines and production issues
·      Optimize pipelines for performance, scalability, and cost efficiency
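To make the ETL/ELT responsibilities above concrete, here is a minimal sketch of the extract, transform, and load stages such a pipeline runs. In production these stages would typically be Apache Beam transforms on Dataflow reading from Cloud Storage and writing to BigQuery; the function names and sample records below are purely illustrative.

```python
def extract(rows):
    """Extract stage: yield raw records (in production, read from a
    source such as Cloud Storage or a legacy database)."""
    yield from rows

def transform(record):
    """Transform stage: normalize fields and enforce a simple
    data-quality check before loading."""
    if record.get("amount") is None:
        raise ValueError(f"missing amount in {record}")
    return {
        "customer": record["customer"].strip().lower(),
        "amount": round(float(record["amount"]), 2),
    }

def load(records):
    """Load stage: in production, write to BigQuery; here, collect
    the cleaned records into a list."""
    return list(records)

raw = [
    {"customer": " Alice ", "amount": "10.5"},
    {"customer": "Bob", "amount": "3.25"},
]
result = load(transform(r) for r in extract(raw))
```

Orchestration with Cloud Composer would then amount to scheduling these stages as tasks in an Airflow DAG, with the data-quality check acting as a gate before the load step.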
Technical skills requirements

·      Strong hands-on experience with Google Cloud Platform (GCP) services: BigQuery, Dataflow, and Cloud Composer
·      Proficiency in Python for data processing
·      Advanced knowledge of SQL (joins, window functions, performance tuning)
·      Experience in ETL/ELT pipeline development and migration to cloud
·      Understanding of data warehousing and data modeling concepts
·      Experience working with large-scale distributed data systems
Nice-to-have skills
·      Knowledge of PySpark/Dataproc
·      Knowledge of Linux shell scripting
·      Knowledge of GitHub, Jenkins, Jira, etc.
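As an illustration of the window-function skill listed above, the sketch below runs a per-partition running total using SQLite from Python's standard library; BigQuery accepts the same standard-SQL `OVER (...)` syntax. The table and column names are invented for the example.

```python
import sqlite3

# In-memory database for illustration only; in BigQuery the same query
# would run against a warehouse table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)

# Running total per region: a window function expresses this directly,
# where a GROUP BY rewrite would need a self-join.
rows = con.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount) AS running
    FROM sales
    ORDER BY region, amount
""").fetchall()
```

The query partitions rows by `region` and accumulates `amount` in order within each partition, yielding one running-total column alongside the original rows.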
 
Qualifications
·      6-15+ years of overall experience, including 3-5 years of relevant experience with GCP services
·      B.Tech., M.Tech. or MCA degree from a reputed university

Qode is a technology-driven platform that transforms how recruiters and candidates connect by leveraging data and automation. Our solutions streamline the hiring process through machine learning, creating private talent pools and automating workflows, ultimately enhancing the quality of candidate evaluation and decision-making. With our no-code tools, we empower organizations to develop tailored recruitment strategies without needing extensive technical skills.
