Engage in diverse Data Engineering projects using Kafka and REST APIs, contribute to Data Integration Frameworks, and develop robust data pipelines with best coding practices.
About the Company:
Headquartered in El Segundo, Calif., Internet Brands® is a fully integrated online media and software services organization focused on four high-value vertical categories: Health, Automotive, Legal, and Home/Travel. The company's award-winning consumer websites lead their categories and serve more than 250 million monthly visitors, while a full range of web presence offerings has established deep, long-term relationships with SMB and enterprise clients. Internet Brands' powerful, proprietary operating platform provides the flexibility and scalability to fuel the company's continued growth. Internet Brands is a portfolio company of KKR and Temasek.
WebMD Health Corp., an Internet Brands Company, is the leading provider of health information services, serving patients, physicians, health care professionals, employers, and health plans through our public and private online portals, mobile platforms, and health-focused publications. The WebMD Health Network includes WebMD Health, Medscape, Jobson Healthcare Information, prIME Oncology, MediQuality, Frontline, QxMD, Vitals Consumer Services, MedicineNet, eMedicineHealth, RxList, OnHealth, Medscape Education, and other owned WebMD sites. WebMD®, Medscape®, CME Circle®, Medpulse®, eMedicine®, MedicineNet®, theheart.org®, and RxList® are among the trademarks of WebMD Health Corp. or its subsidiaries.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
Responsibilities:
Develop and support multiple Data Engineering projects with heterogeneous data sources; produce and consume data via messaging queues such as Kafka, and push and pull data via REST APIs.
Support the in-house-built Data Integration, Data Replication, and Data Profiling & Reconciliation frameworks.
Develop data pipelines following good coding standards, with unit testing and detailed test cases.
Demonstrate willingness to learn new technologies.
Essential:
4+ years of experience with data integration tools such as Pentaho or other ETL/ELT tools.
4+ years of experience with traditional relational databases such as Postgres, MSSQL, or Oracle.
1+ years of experience with columnar databases such as Vertica, Google BigQuery, or Amazon Redshift.
1+ years of experience with scheduler/orchestration tools such as Control-M, AutoSys, Airflow, or JAMS.
Good conceptual knowledge of ETL/ELT strategies.
Good conceptual knowledge of code versioning tools.
Good collaboration, communication, and documentation skills.
Experience working in an Agile delivery model.
Ability to work with minimal or no direct supervision.
Desirable:
Good knowledge of data visualization tools such as Tableau or Pentaho BA tools.
Experience with digital marketing/web analytics or business intelligence is a plus.
Knowledge of scripting languages such as Python.
Experience in a Linux environment is preferred but not mandatory.