About the Role:
We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing and scaling both stream and batch processing systems, extensive experience with the Apache Hadoop ecosystem, Oozie, and Airflow, and a solid understanding of public cloud technologies, especially GCP. This role is fully remote, requiring excellent communication skills and the ability to solve complex problems independently and creatively.
What you will be doing
- Build reusable, reliable code for stream and batch processing systems at scale, working with technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark
- Develop scalable and robust code for batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase
- Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem (see the sketch after this list)
- Leverage GCP for scalable big data processing and storage solutions
- Implement automation/DevOps best practices for CI/CD, IaC, containerization, etc.
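To make the workflow-orchestration responsibility concrete, below is a minimal sketch of an Airflow DAG that submits a daily Spark batch job. It assumes Airflow 2.x with the Apache Spark provider installed; the DAG id, jar path, main class, and connection id are hypothetical placeholders, not part of this posting.

```python
# Minimal sketch: an Airflow DAG orchestrating a daily Spark (Java) batch job.
# Assumes Airflow 2.x with apache-airflow-providers-apache-spark installed.
# The DAG id, jar path, main class, and connection id are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_event_aggregation",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the Spark batch job for the logical run date.
    aggregate_events = SparkSubmitOperator(
        task_id="aggregate_events",
        conn_id="spark_default",
        application="/opt/jobs/aggregate-events.jar",    # hypothetical jar path
        java_class="com.example.batch.AggregateEvents",  # hypothetical main class
        application_args=["--date", "{{ ds }}"],         # templated execution date
    )
```

Equivalent pipelines could be defined as Oozie coordinators instead; Airflow is shown here simply because both orchestrators appear in this posting.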
Requirements
- Expertise in public cloud services, particularly GCP managed services, and a solid understanding of cloud-based batch processing systems
- Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce
- Experience orchestrating data workflows with Oozie and Airflow
- Strong programming skills in Java (particularly Spark), Python, Pig, and SQL
- Familiarity with BigTable and Redis
- Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes
- Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions
- Proven ability to work effectively in a remote setting, with strong written and verbal communication skills, collaborating with team members and stakeholders to ensure a clear understanding of technical requirements and project goals
- Proven experience engineering batch processing systems at scale (illustrated in the sketch after this list)
- Hands-on experience with public cloud platforms, particularly GCP; additional experience with other cloud technologies is advantageous
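As a deliberately simplified illustration of the batch processing work named above, the following is a minimal PySpark sketch that aggregates one day's events. It assumes a Spark runtime with the GCS connector configured; the bucket, paths, and column names are hypothetical placeholders.

```python
# Minimal PySpark sketch of a daily batch aggregation, as referenced above.
# Assumes a Spark runtime with the GCS connector configured; the bucket,
# paths, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

# Read one day's partition of raw events (hypothetical GCS layout).
events = spark.read.parquet("gs://example-bucket/events/dt=2024-01-01/")

# Count events per type for the day.
daily_counts = events.groupBy("event_type").agg(F.count("*").alias("event_count"))

# Write the aggregate back out, overwriting any previous run for the date.
daily_counts.write.mode("overwrite").parquet("gs://example-bucket/daily_counts/dt=2024-01-01/")

spark.stop()
```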