Lead Data Engineer
Primary Skills
GCP, BigQuery, Python, Airflow, SQL, dbt
Job requirements
About the Role
We are seeking a Lead Data Engineer with deep expertise in Google Cloud Platform (GCP) and BigQuery to lead cloud modernization initiatives, develop scalable data pipelines, and enable real-time data processing for enterprise-level systems. This is a high-impact role focused on transforming legacy infrastructure into a robust, cloud-native data ecosystem.
Key Responsibilities
1. Data Migration & Cloud Modernization
Analyze legacy on-premises and hybrid cloud data warehouse environments (e.g., SQL Server).
Lead the migration of large-scale datasets to Google BigQuery.
Design and implement data migration strategies ensuring data quality, integrity, and performance.
2. Data Integration & Streaming
Integrate data from various structured and unstructured sources, including APIs, relational databases, and IoT devices.
Build real-time streaming pipelines for large-scale ingestion and processing of IoT and telemetry data.
3. ETL / Data Pipeline Development
Modernize and refactor legacy SSIS packages into cloud-native ETL pipelines.
Develop scalable, reliable workflows using Apache Airflow, Python, Spark, and GCP-native tools.
Ensure high-performance data transformation and loading into BigQuery for analytical use cases.
4. Programming & Query Optimization
Write and optimize complex SQL queries, stored procedures, and scheduled jobs within BigQuery.
Develop modular, reusable transformation scripts using Python, Java, Spark, and SQL.
Continuously monitor and optimize query performance and cost efficiency in the cloud data environment.
Required Skills & Experience
5+ years in Data Engineering with a strong focus on cloud and big data technologies.
At least 2 years of hands-on experience with GCP, specifically BigQuery.
Proven experience migrating on-premises data systems to the cloud.
Strong development experience with Apache Airflow, Python, and Apache Spark.
Expertise in streaming data ingestion, particularly in IoT or sensor data environments.
Strong SQL development skills; experience with BigQuery performance tuning.
Solid understanding of cloud architecture, data modeling, and data warehouse design.
Familiarity with Git and CI/CD practices for managing data pipelines.
Preferred Qualifications
GCP Professional Data Engineer certification.
Experience with modern data stack tools like dbt, Kafka, or Terraform.
Exposure to ML pipelines, analytics engineering, or DataOps/DevOps methodologies.
Why Join Us?
Work with cutting-edge technologies in a fast-paced, collaborative environment.
Lead cloud transformation initiatives at scale.
Competitive compensation and benefits.
Remote flexibility and growth opportunities.
Brillio is a global leader in Enterprise Digital Transformation Solutions, partnering with companies to drive business improvement and competitiveness through innovative technology solutions.