Software Engineer II - Data QA

Kathmandu, Nepal

About Us

Abacus Insights is changing the way healthcare works for you. We’re on a mission to unlock the power of data so health plans can enable the right care at the right time—making life better for millions of people. No more data silos, no more inefficiencies. Just smarter care, lower costs, and better experiences.

Backed by $100M from top VCs, we’re tackling big challenges in an industry that’s ready for change. And while GenAI is still new for many, we’ve already mastered turning complex healthcare data into clear, actionable insights. That’s our superpower—and it’s why we’re leading the way.

At Abacus, innovation starts with people. We’re bold, curious, and collaborative—because the best ideas come from working together. Ready to make an impact? Join us and let's build the future, together.

About the Role: 

Our TechOps team is looking to bring on an experienced Data QA Engineer. If you are interested in being a guiding voice on a critical feature-delivery team, this position is for you. The connector factory team implements and configures the data pipelines that ingest, transform (ETL), and process diverse sources of data, including large batch-processing and streaming systems. You will be exposed to every area of our platform and its AWS services (Serverless – Lambda, EMR – Hadoop/Spark, EKS – Kubernetes, etc.), and, even more importantly, you will help drive the evolution of the product and the team as we continue to grow rapidly. We believe strongly in architectural evolution, and you will bring your experience and best practices to bear as we build out new components of the platform. 

You Will: 

  • Review mapping specification documents and identify gaps in business requirements 
  • Estimate effort for QA activities and tasks 
  • Identify mapping test scenarios and author ETL test cases 
  • Understand the layouts of various input file formats and perform parser validation 
  • Write source and target SQL queries for data validation 
  • Automate source-to-target validation in PySpark and Databricks (see the sketch after this list) 
  • Identify data needs and execute tests with synthetic data that resembles real-time data 
  • Write SQL queries for user acceptance criteria 
  • Support user acceptance testing (UAT) 
  • Perform day-to-day activities using Agile methodologies 
  • Capture QA activities in test management tools and Jira 
  • Coordinate day-to-day activities between onsite and offshore teams 
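
To give a flavor of the source-to-target validation this role automates, here is a minimal PySpark sketch. The table names (claims_source, claims_target), the claim_id key, and the compared columns are hypothetical placeholders rather than anything specific to the Abacus platform; a real Databricks check would be driven by the mapping specification document.

    # Minimal sketch of an automated source-to-target validation check in PySpark.
    # claims_source, claims_target, claim_id, and compare_cols are hypothetical
    # placeholders; a real job would take them from the mapping specification.
    from functools import reduce

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("source-to-target-validation").getOrCreate()

    source = spark.table("claims_source")  # hypothetical raw/ingested table
    target = spark.table("claims_target")  # hypothetical transformed target table

    # 1. Row-count reconciliation between source and target.
    src_count, tgt_count = source.count(), target.count()
    print(f"source={src_count} target={tgt_count} match={src_count == tgt_count}")

    # 2. Column-level comparison on a shared business key.
    compare_cols = ["member_id", "claim_amount"]
    mismatch = reduce(
        lambda a, b: a | b,
        [~F.col(f"s.{c}").eqNullSafe(F.col(f"t.{c}")) for c in compare_cols],
    )
    diff = (
        source.alias("s")
        .join(target.alias("t"), F.col("s.claim_id") == F.col("t.claim_id"), "full_outer")
        .where(mismatch)  # flags value differences and rows missing on either side
    )
    print(f"mismatched or missing rows: {diff.count()}")

In practice, the same pattern extends to aggregate reconciliations (sums, distinct counts per segment) and is typically parameterized per mapping.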

What We're Looking For: 

  • Educational Background: Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Management, Computer Applications, or a closely related field. 
  • Professional Experience: At least 2 years of hands-on experience in Data Quality Assurance or Data Engineering QA roles. 
  • Healthcare Domain Expertise: Direct experience working with U.S. healthcare datasets is essential. 
  • SQL Proficiency: Expert-level skills in SQL, including writing complex queries, performing data validation, executing joins and aggregations, and optimizing query performance. 
  • Python Competency: Advanced Python capabilities for data analysis, synthetic data creation, and automation of data QA processes. 
  • Data Analysis Skills: Strong analytical abilities to work with large datasets, detect inconsistencies, and verify business logic. 
  • Quality Assurance Practices: Solid foundation in QA methodologies, with proven experience in designing test cases for data-centric systems. 
  • Analytics Tools Exposure: Familiarity with analytics platforms and reporting tools is advantageous. 
  • HEDIS Knowledge: Understanding of HEDIS quality measures is a plus. 
  • Communication Skills: Exceptional verbal and written communication skills, with the ability to clearly present technical insights and findings. 
  • Data Platform Tools: Experience with PySpark, Databricks, and AWS services such as Redshift, S3, and Athena. 

Nice to Have: 

  • Experience with other cloud data warehouses (BigQuery, Snowflake) 
  • Hands-on experience with API testing 
  • Experience with ETL test automation 

Our Commitment as an Equal Opportunity Employer

As a mission-led technology company helping to drive better healthcare outcomes, Abacus Insights believes that the best innovation and value we can bring to our customers comes from diverse ideas, thoughts, experiences, and perspectives. Therefore, we dedicate resources to building diverse teams and providing equal employment opportunities to all applicants. Abacus prohibits discrimination and harassment regarding race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

At the heart of who we are is a commitment to continuously and intentionally building an inclusive culture—one that empowers every team member across the globe to do their best work and bring their authentic selves. We carry that same commitment into our hiring process, aiming to create an interview experience where you feel comfortable and confident showcasing your strengths. If there’s anything we can do to support that—big or small—please let us know.
