Senior Data Engineer - Data Aggregation

Join a values-driven team to develop and maintain data pipelines for life sciences companies, leveraging AI and advanced analytics to optimize commercialization strategies.
Beghou brings over three decades of experience helping life sciences companies optimize their commercialization through strategic insight, advanced analytics, and technology. From developing go-to-market strategies and building foundational data analytics infrastructures to leveraging artificial intelligence to improve customer insights and engagement, Beghou helps life sciences companies maximize performance across their portfolios. Beghou also deploys proprietary and third-party technology solutions to help companies forecast performance, design territories, manage customer data, organize and report on medical and commercial data, and more. Headquartered in Evanston, Illinois, we have 10 global offices. Our mission is to bring together analytical minds and innovative technology to help life sciences companies navigate the complexity of health care and improve patient outcomes.

Purpose of Job
  • Responsible and accountable for the development and maintenance of data pipelines. This role works with the business team in the US to gather requirements and provide efficient solutions to business requests using the in-house enterprise data platform.

We'll trust you to:
  • Design, build, and maintain efficient, reusable, and reliable code
  • Ensure the best performance and quality of applications
  • Identify issues and provide solutions to mitigate and address them
  • Help maintain code quality, organization, and automation
  • Continuously expand body of knowledge via research
  • Comply with corporate quality policies and procedures
  • Ensure all training requirements are completed in a timely manner

You'll need to have:
  • Minimum of 4 years' experience in the following:
  • At least 3 years of working experience with Data Aggregation projects (using datasets related to specialty pharmacy, hub, specialty distributor, 3PL) in the US pharmaceuticals market is strongly preferred.
  • At least 3 years of experience in Python programming and PySpark for the development and maintenance of data pipelines (a brief illustrative sketch follows this list).
  • Experience working with patient and/or payer data management projects.
  • Advanced analytical and problem-solving skills.
  • Understanding of the business processes and business data used in the US pharmaceuticals market.
  • Knowledge of SQL, Snowflake, Databricks, Azure Blob Storage, and AWS.
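
For context only, here is a minimal sketch of the kind of PySpark aggregation pipeline this role would build and maintain. The dataset names, columns, and storage paths below are hypothetical assumptions for illustration and are not part of Beghou's actual platform.

    # Hypothetical sketch: aggregate specialty pharmacy dispense records to a
    # monthly, product-by-pharmacy level. All names and paths are assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("specialty-pharmacy-aggregation").getOrCreate()

    # Read raw dispense records (source path and schema are assumed).
    dispenses = spark.read.parquet(
        "abfss://raw@example.dfs.core.windows.net/specialty_pharmacy/dispenses/"
    )

    # Standardize the date and aggregate to one row per product, pharmacy, and month.
    monthly = (
        dispenses
        .withColumn("dispense_month", F.date_trunc("month", F.col("dispense_date")))
        .groupBy("product_id", "pharmacy_id", "dispense_month")
        .agg(
            F.sum("quantity").alias("total_quantity"),
            F.countDistinct("patient_id").alias("unique_patients"),
        )
    )

    # Write the curated layer back for downstream reporting (destination is assumed).
    monthly.write.mode("overwrite").partitionBy("dispense_month").parquet(
        "abfss://curated@example.dfs.core.windows.net/specialty_pharmacy/monthly_aggregates/"
    )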

Desired Soft Skills:

  • Excellent communication skills. Ability to communicate with business stakeholders as well as technical development teams
  • Excellent listening skills to understand customer needs
  • Excellent critical and creative thinking skills and ability to brainstorm ideas
  • Ability to lead meetings in place of the Project Manager or Team Lead
  • Ability to work in a team and collaborate across teams
  • Ability to adapt to changing and increasing workload
  • Ability to assess and identify bottlenecks and potential failures
  • Fluency in English is required (spoken and written)

At Beghou Consulting, you'll join a highly collaborative, values-driven team where technical excellence, analytical rigor, and personal growth converge. Whether you're passionate about AI innovation, building commercialization strategies, or shaping the next generation of data-first solutions in life sciences, this is a place to make an impact!