Senior Data Engineer
Primary Skills
DataStream, ETL Fundamentals, SQL (Basic + Advanced), Python, Data Warehousing, Time Travel and Fail-safe, Snowpipe, SnowSQL, Modern Data Platform Fundamentals, PL/SQL, T-SQL, Stored Procedures
Job requirements
Design, develop, and maintain robust ELT/ETL data pipelines to load structured and semi-structured data into Snowflake. Implement data ingestion workflows using tools like Azure Data Factory, Informatica, DBT, or custom Python/SQL scripts.
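As a minimal sketch of the custom Python/SQL scripting mentioned above, the snippet below composes a Snowflake COPY INTO statement for bulk-loading staged JSON files. The table, stage, and file-format names are hypothetical, not from the posting.

```python
# Illustrative sketch: build a Snowflake COPY INTO statement for loading
# staged semi-structured (JSON) files. All object names are hypothetical.

def build_copy_statement(table: str, stage: str, file_format: str) -> str:
    """Return a COPY INTO statement for bulk-loading files from a stage."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'SKIP_FILE'"
    )

sql = build_copy_statement("raw.events", "events_stage", "json_format")
print(sql)
```

In practice a script like this would submit the statement through a Snowflake connection; here it only assembles the text so the loading pattern is visible.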
Write and optimize complex SQL queries, stored procedures, views, and UDFs within Snowflake.
Use Snowpipe for continuous data ingestion, and manage tasks, streams, and file formats for near-real-time processing.
Optimize query performance using techniques such as clustering keys, result caching, materialized views, and pruning strategies.
Monitor and tune warehouse sizing and usage to balance cost and performance.
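The continuous-ingestion pattern described above centers on a pipe that auto-ingests staged files. The sketch below assembles such a CREATE PIPE statement; the pipe, table, and stage names are hypothetical placeholders.

```python
# Illustrative sketch of the Snowpipe pattern: a pipe with AUTO_INGEST
# enabled copies newly staged files into a target table as they arrive.
# Object names (events_pipe, raw.events, events_stage) are hypothetical.

def build_pipe_ddl(pipe: str, table: str, stage: str) -> str:
    """Return a CREATE PIPE statement wrapping a COPY INTO definition."""
    return (
        f"CREATE PIPE IF NOT EXISTS {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = 'JSON')"
    )

ddl = build_pipe_ddl("events_pipe", "raw.events", "events_stage")
print(ddl)
```

With AUTO_INGEST, cloud-storage event notifications trigger the load, which is what makes the ingestion near real time rather than batch-scheduled.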
Design and implement data models (star, snowflake, normalized, or denormalized) suitable for analytical workloads.
Create logical and physical data models for reporting and analytics use cases.
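To make the star-schema idea above concrete, here is a minimal sketch that splits denormalized order rows into a customer dimension (keyed by a surrogate id) and a fact table referencing it. The rows and column names are invented for illustration.

```python
# Minimal star-schema sketch: factor repeated customer attributes out of
# denormalized order rows into a dimension table, leaving a slim fact table.
# Sample data and column names are hypothetical.

rows = [
    {"order_id": 1, "customer": "Acme", "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "region": "EMEA", "amount": 80.0},
    {"order_id": 3, "customer": "Globex", "region": "APAC", "amount": 200.0},
]

dim_customer = {}   # (customer, region) -> surrogate key
fact_orders = []

for row in rows:
    key = (row["customer"], row["region"])
    sk = dim_customer.setdefault(key, len(dim_customer) + 1)  # assign once
    fact_orders.append({
        "order_id": row["order_id"],
        "customer_sk": sk,          # foreign key into the dimension
        "amount": row["amount"],
    })

print(len(dim_customer), len(fact_orders))
```

The same decomposition done in SQL (dimension lookup plus fact insert) is what the logical model would dictate; the dict simply makes the key assignment explicit.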
Brillio is a global leader in Enterprise Digital Transformation Solutions, partnering with companies to drive business improvement and competitiveness through innovative technology solutions.