Are you motivated to participate in a dynamic, multi-tasking environment? Do you want to join a company that invests in its employees? Are you seeking a position where you can use your skills while continuing to be challenged and learn? Then we encourage you to dive deeper into this opportunity.
We believe in career development and empowering our employees. Not only do we provide internal career coaches, but we also offer many training opportunities to expand your knowledge base! We have highly competitive benefits with a variety of HMO and PPO options, a company 401(k) match, and an Employee Stock Purchase Program. We offer tuition reimbursement and leadership development, and we start employees off with 16 days of paid time off plus holidays. We also offer wellness courses and have highly engaged employee resource groups. Come join the Neo team and be part of our amazing World Class Culture!
NeoGenomics has an opening for a Senior Data Engineer who wants to continue learning and help our company grow. This is a remote position with a Monday – Friday day shift.
Now that you know what we're looking for in talent, let us tell you why you'd want to work at NeoGenomics:
As an employer, we promise to provide you with a purpose-driven mission in which you have the opportunity to save lives by improving patient care through the exceptional work you perform. Together, we will become the world's leading cancer reference laboratory.
Position Summary:
As the Senior Data Engineer, Informatics, you will take a leading role in enabling analytics on cloud data assets, ensuring the highest quality of data used in cancer genetics research and innovation. You will be instrumental in cataloging, extracting, and structuring data from files stored in a variety of storage systems to support data-driven initiatives across the company, with Snowflake as the target platform.
Responsibilities:
Requirements Gathering and Solution Design:
- Collaborate with customers to understand their data needs and requirements for tailored data assets
- Provide accurate effort estimates and design solutions aligning with customer specifications
Solution Implementation and Maintenance:
- Develop and maintain data pipelines to efficiently extract, transform, and load (ETL) data from various sources stored in AWS S3, Azure Blob Containers, and traditional network-accessible storage systems
- Implement, test, and maintain data solutions, ensuring scalability, reliability, and adherence to best practices
Customer Engagement and Support:
- Act as a liaison between technical teams and customers, addressing queries, and providing technical support
- Offer innovative solutions to accommodate new customer requirements, ensuring timely delivery and customer satisfaction
Experience, Education and Qualifications:
- Bachelor's degree required in Computer Science, Information Systems or related field
- 4+ years of hands-on experience required in data engineering, emphasizing ETL processes and data orchestration
- Professional accreditation (AWS/Azure Data Engineering Certification or equivalent) strongly preferred
- Proven experience as a Data Engineer or similar role, with a focus on cloud storage (AWS S3, Azure Blob Containers), traditional network-accessible storage, and Snowflake
- Demonstrated expertise in engineering solutions for managing, transforming, and moving files and objects across various storage cloud locations such as AWS S3, Azure Blob Storage, etc.
- Proficiency in data cataloging tools (e.g., Qlik or Informatica) and techniques
- Experience in cloud data engineering best practices with AWS databases and Snowflake a plus
- Ability to design, implement, and optimize data workflows across heterogeneous systems and cloud platforms
- Experience in data modeling, schema design, and ETL processes
- Proficient in SQL, with experience in database technologies such as Snowflake, Microsoft SQL Server, or PostgreSQL
- Strong programming skills in Python, Java, or similar languages for scripting and automation tasks