Capco is hiring a

PySpark Azure Synapse Data Engineer - Bangalore/ Chennai/ Pune/ Hyderabad/ Mumbai/ Gurgaon

Job Title: Data Engineer with PySpark and Azure Synapse

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their businesses. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

 

Key Skills: Data Engineering, PySpark, ADLS, Synapse, Hadoop, CI/CD

Technical Requirements:

Role: Data Engineer / Azure Data Engineer

Responsibilities

  • Design, build, and maintain data pipelines and infrastructure on Microsoft Azure.
  • Extract, transform, and load data from various sources including databases, APIs, and flat files.
  • Stay up to date with the latest advancements in Azure data technologies and best practices.
  • Propose suitable data migration sets to the relevant stakeholders.
  • Assist teams with processing the data migration sets as required.
  • Assist with planning, tracking, and coordinating the data migration team, and with maintaining the migration runbook and scope for each customer.

Role Requirements

  • Strong Data Analysis experience in Financial Services
  • Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context
  • Demonstrate a continual desire to implement “strategic” or “optimal” solutions and where possible, avoid workarounds or short-term tactical solutions.
  • Work with stakeholders to ensure that negative customer and business impacts are avoided.
  • Manage stakeholder expectations and ensure that robust communication and escalation mechanisms are in place across the project portfolio.
  • Good understanding of the control requirements surrounding data handling.

Experience/Skillset

  • Proven experience working with Microsoft Azure Data Services, including Data Factory, Data Lake Storage, Synapse Analytics, and Azure SQL Database.
  • Strong knowledge of data warehousing, data modeling, and ETL/ELT principles.
  • Hands-on experience with Airflow and Elastic.
  • Perform data cleansing, transformation, and integration tasks to ensure data quality and consistency.
  • Develop and maintain data models for data warehousing, data lakes, and other data storage solutions.
  • Proficiency in programming languages like Python, PySpark, Scala, or SQL.
  • Knowledge of CI/CD processes for application software integration and deployment, including Maven, Git, and Jenkins
  • Implement and maintain security best practices for data storage and access.
  • Monitor and troubleshoot data pipelines and data quality issues.
  • Knowledge of COBOL copybooks is preferred, to support migrating COBOL copybook based solutions to an Azure data solution.
  • Document and communicate technical solutions effectively.
  • Enthusiastic and energetic problem solver to join an ambitious team.
  • Business analysis skills, defining and understanding requirements.
  • Attention to detail.
  • Good knowledge of SDLC and formal Agile processes, a bias towards TDD and a willingness to test products as part of the delivery cycle.
  • Ability to communicate effectively in a multi-program environment across a range of stakeholders.
  • Strong verbal and written communication skills

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
