Resilient Co

Principal Data Engineer

TL;DR

Lead the design and evolution of enterprise data architecture, modernize data processes, mentor engineers, and influence cross-organizational initiatives.

Responsibilities
  • Collaborate with Data Architects and business partners to design and evolve enterprise data architecture and platform capabilities.
  • Translate architectural strategy into technical designs and delivery plans across teams.
  • Design, code, and optimize complex distributed data processing systems using Spark, Databricks, and cloud‑native data services.
  • Develop canonical data models, semantic structures, and reusable datasets to support reporting and machine learning.
  • Drive platform modernization initiatives such as Delta Lake and metadata‑driven design.
  • Create reusable frameworks and platform capabilities to accelerate analytics, ML, and governed self‑service data access.
  • Lead root‑cause analysis for major data issues and implement long‑term improvements in data quality, lineage, and observability.
  • Provide technical leadership, guidance, and mentorship to Staff, Senior, and mid‑level data engineers.
  • Influence cross‑organizational roadmaps and engineering investments; participate in architecture reviews and governance forums.
Must Have
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or equivalent experience.
  • 10+ years of experience in data engineering or a related technical field.
  • Expert proficiency in SQL, Python, and Spark for large‑scale data processing.
  • Extensive experience designing and building cloud‑native data pipelines, data models, and distributed data systems (Delta Lake, Spark, Unity Catalog, Jobs, Workflows).
  • Hands-on experience with Azure.
  • Strong experience designing and tuning distributed data processing systems at scale.
  • Deep knowledge of data engineering best practices including version control, CI/CD, automated testing, DevOps/DataOps, and observability.
  • Proven ability to lead cross‑functional technical initiatives and influence architectural direction.
  • Strong problem‑solving, debugging, analytical, and collaboration skills; ability to thrive in agile, dynamic teams.
Nice to Have
  • Experience with Databricks Unity Catalog, Delta Live Tables, or Databricks Workflows.
  • Advanced data modeling skills (dimensional, data vault, semantic layers).
  • DataOps experience including pipeline observability, monitoring, and automated quality.
  • Experience with metadata management and governance platforms (Unity Catalog, Purview, Collibra, Alation).
  • Experience with streaming frameworks used with Spark Structured Streaming (Kafka, Event Hubs, Kinesis).
  • Experience contributing to architecture review boards, technical councils, or data governance forums.
About Resilient Co

Resilient Co builds scalable API platforms that enhance both customer-facing and internal services, using technologies such as GraphQL, Kubernetes, and AWS. The company also specializes in SAP BRIM solutions for financial contract accounting and seamless payment services integration.
