In this role, you will work with our Business Analytics & Insights (BAI) team, collaborating with cross-functional teams to deliver high-quality data solutions in areas such as Supply Chain, Finance, Operations, Customer Experience, HR, Risk Management, and Global IT.
Responsibilities:
- Lead the technical planning for data migration, including data ingestion, transformation, storage, and access control within Azure Data Factory and Azure Data Lake.
- Design and implement scalable, efficient data pipelines to ensure smooth data movement from multiple sources using Azure Databricks.
- Develop reusable frameworks for the ingestion of large datasets.
- Ensure data quality and integrity by implementing robust validation and cleansing mechanisms throughout the data pipeline.
- Work with event-based/streaming technologies to ingest and process data in real time.
- Provide technical support to the team, resolving challenges during the migration and post-migration phases.
- Stay current with the latest advancements in cloud computing, data engineering, and analytics technologies; recommend best practices and industry standards for data lake solutions.
Qualifications:
- 6+ years of IT experience.
- Minimum of 4 years working with Azure Databricks.
- Experience using Python for data engineering.
- Proficiency in Data Modeling and Source System Analysis.
- Strong knowledge of PySpark and SQL.
- Experience with Azure components: Data Factory, Data Lake, SQL Data Warehouse (DW), and Azure SQL.
- Ability to conduct data profiling, cataloging, and mapping for technical design and construction of data flows.
- Familiarity with data visualization/exploration tools.