SENIOR DATA SPECIALIST


We are seeking a meticulous and highly responsible Senior Data Specialist who can work independently to design, manage, and optimize data workflows and pipelines. The ideal candidate should bring strong hands-on expertise in databases (PostgreSQL, MySQL), ETL processes, and practical experience with modern ETL tools. Familiarity with Apache NiFi, Talend, or Apache Airflow will be a strong advantage. The role requires 4–7 years of professional experience in handling complex data transformations and ensuring high-quality, reliable data delivery. This role demands a structured way of working, a strong sense of ownership, and the ability to deliver with precision in fast-paced environments.

 

DUTIES AND RESPONSIBILITIES

  • Design, build, and maintain scalable ETL pipelines for structured and unstructured data.
  • Work with PostgreSQL, MySQL, and other relational databases to optimize queries, schemas, and performance.
  • Develop, automate, and monitor data workflows ensuring accuracy and timeliness of data.
  • Collaborate with engineering and product teams to understand data requirements and translate them into technical solutions.
  • Ensure data quality, integrity, and security across pipelines and systems.
  • Troubleshoot and resolve data pipeline issues, ensuring minimal downtime.
  • Create and maintain detailed technical documentation for workflows, pipelines, and database structures.
  • Evaluate and implement new tools/technologies to improve data engineering practices.

EDUCATION

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • Master’s degree preferred, not mandatory.

WORK EXPERIENCE

  • 4–7 years of hands-on experience in data engineering, data integration, or database management.
  • Strong experience with PostgreSQL and MySQL.
  • Practical expertise with at least one ETL tool, ideally used in complex enterprise data environments.
  • Exposure to Apache NiFi, Talend, or Apache Airflow (desirable but not mandatory).
  • Proven track record of working independently, taking ownership, and delivering results under tight timelines.

SKILLS, ABILITIES & KNOWLEDGE

  • Strong SQL skills and ability to optimize queries.
  • Solid understanding of data modeling and relational database concepts.
  • Structured and process-oriented way of working with strong attention to detail.
  • Ability to design reliable, maintainable, and scalable data solutions.
  • Excellent problem-solving and troubleshooting skills.
  • Strong communication skills to work effectively with cross-functional teams.

This is a high-responsibility role with significant autonomy. You will be the go-to expert for data workflows on your projects and will directly contribute to building the data backbone and operational excellence of a mission-critical project.

 

RMgX is a Gurgaon-based digital product innovation and consulting firm. Here at RMgX, we design and build elegant, data-driven digital solutions for complex business problems. At the core of every solution we craft is a strong user experience practice that deeply explores the goals and emotions of businesses and end users. RMgX is driven by a passion for quality, and we strongly believe in our people and their capabilities. For more details, visit https://www.rmgx.in/
