Big Data Engineer

TL;DR

Specialize in Hadoop ecosystem technologies, design and optimize ETL/ELT pipelines, and leverage cloud platforms for data solutions.

Technical skills requirements
 
The candidate must demonstrate proficiency in:
 
·      Strong expertise in the Hadoop ecosystem – HDFS, Hive, Spark (PySpark/Scala), MapReduce
·      Hands-on experience with Sqoop for data ingestion
·      Strong programming skills in Python / Scala / Java
·      Advanced knowledge of SQL and query optimization
·      Experience in data lake architecture and maintenance
·      Strong understanding of distributed computing and parallel processing
·      Experience in ETL/ELT pipeline design and optimization
·      Exposure to cloud migration strategies and readiness planning
·      Experience with data governance tools and frameworks
·      Knowledge of CI/CD pipelines and DevOps practices
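Candidates preparing for this role may find it useful to be able to explain the map/shuffle/reduce model that underpins Hadoop MapReduce and Spark. The toy word count below is a single-process illustration of that model; the function names are illustrative and do not belong to any framework's API.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit (word, 1) pairs from each input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big pipelines", "data lakes and data warehouses"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["data"])  # 3
```

In a real cluster, the map and reduce phases run in parallel across many nodes and the shuffle moves data over the network; the logic per record is the same.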
Nice-to-have skills
·      Experience with cloud platforms (GCP preferred – Dataproc, BigQuery, Cloud Storage)
·      Familiarity with workflow orchestration tools (Airflow / Cloud Composer)
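The core idea behind orchestrators such as Airflow and Cloud Composer is running tasks in an order that respects their dependencies. The sketch below is NOT Airflow's API; the task names and the `run_pipeline` helper are made up purely to illustrate dependency-based scheduling.

```python
def run_pipeline(tasks, deps):
    """Run task callables in an order that respects deps (task -> set of upstream tasks)."""
    done, order = set(), []
    while len(done) < len(tasks):
        # A task is ready once all of its upstream dependencies have finished.
        ready = [t for t in tasks if t not in done and deps.get(t, set()) <= done]
        if not ready:
            raise ValueError("cycle detected in task dependencies")
        for t in sorted(ready):  # deterministic order for ties
            tasks[t]()           # execute the task
            done.add(t)
            order.append(t)
    return order

log = []
tasks = {
    "extract": lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
    "load": lambda: log.append("load"),
}
deps = {"transform": {"extract"}, "load": {"transform"}}
print(run_pipeline(tasks, deps))  # ['extract', 'transform', 'load']
```

In Airflow the same dependency graph would be declared as a DAG with operators and `>>` relationships; the scheduler handles readiness, retries, and parallelism.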
Qualifications
 
  • 8+ years of overall experience, with 7-10 years of relevant work experience in Big Data/Hadoop
  • B.Tech., M.Tech. or MCA degree from a reputed university

Qode is a technology-driven platform that transforms how recruiters and candidates connect by leveraging data and automation. Our solutions streamline the hiring process through machine learning, creating private talent pools and automating workflows, ultimately enhancing the quality of candidate evaluation and decision-making. With our no-code tools, we empower organizations to develop tailored recruitment strategies without needing extensive technical skills.

Ace your job interview

Understand the required skills and qualifications, anticipate the questions you may be asked, and study well-prepared answers using our sample responses.

Data Engineer Q&As
Apply for this job