REQUIREMENTS:
- 7+ years of overall experience, including 5+ years of relevant experience in data engineering.
- Strong experience with Apache Spark and Databricks.
- Proficiency in AWS cloud services.
- Expertise in SQL and Python.
- Hands-on experience with PySpark for data processing.
- Basic experience in API development.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
RESPONSIBILITIES:
- Understanding the client’s business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
- Mapping decisions to requirements and translating them clearly for developers.
- Identifying candidate solutions and narrowing them down to the option that best meets the client’s requirements.
- Defining guidelines and benchmarks for non-functional requirement (NFR) considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for developers.
- Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks needed to realize it.
- Understanding technology integration scenarios and applying those learnings across projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken.
- Carrying out proofs of concept (POCs) to verify that the suggested design and technologies meet the requirements.
EDUCATION:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.