Responsibilities
- Develop, manage, and optimize Snowflake data models and pipelines across six structured environments (DEV, UAT, DATA_DEV, PREPROD, PROD, DATA_RESTRICTION)
- Implement zero-copy cloning, data masking, and row access policies for privacy-compliant development workflows
- Collaborate on CI/CD pipelines using Terraform and dbt to deploy and validate infrastructure and data transformations
- Enable data provisioning to Power BI via Import and DirectQuery modes, optimizing for performance and consistency
- Perform end-to-end support: troubleshoot data issues, optimize queries, and manage deployments in UAT/PREPROD/PROD
- Integrate data from SAP PTB, Xtract Universal, Azure Blob Storage, and external APIs via data orchestration tools like Azure Data Factory
- Ensure smooth data validation workflows in PREPROD and monitor pipelines across the environments
- Work with platform architects to implement state-of-the-art architectural principles (cost efficiency, scalability, modularity)
- Ensure compliance with governance and security policies, including role-based access controls in Snowflake
- Support onboarding and enablement of junior developers and analysts working with Snowflake or downstream consumption tools
Required Qualifications
- 3–5 years of hands-on experience with Snowflake, including role-based access control, zero-copy cloning, and multi-environment structures
- Proficient in SQL for Snowflake, dbt, and data modeling (star/snowflake schema, views, materialized views)
- Familiarity with CI/CD pipelines using Terraform, dbt, and version control (Git)
- Experience working with semi-structured data (JSON, Parquet) and optimizing for fast query performance
- Experience integrating with Power BI (Import mode and DirectQuery)
- Knowledge of data masking and privacy-compliant data workflows
- Experience using Azure Data Factory or similar ETL orchestration tools
- Familiarity with enterprise data governance and secure data sharing across environments
- Strong troubleshooting skills and experience with production support operations
Nice to Have
- Experience with SAP data extraction using tools like Xtract Universal
- Understanding of Azure B2C, Tardis, and RESTful API integration
- Familiarity with vendor lock-in mitigation and cloud cost optimization strategies
- Knowledge of data provisioning for benchmarking and interoperability with partners
- Experience working in a regulated enterprise environment (e.g. telecom, finance, government)
* Please note that remote working is only available within Hungary due to European taxation regulations.