Anticipated Contract End Date/Length: July 10, 2026
Work setup: Hybrid (must be eligible for BPSS)
Our client in the Information Technology and Services industry is looking for a Data Modeler to design and standardize enterprise data models that enable scalable analytics, business intelligence, machine learning, and operational reporting across modern lakehouse architectures. This role will translate complex business requirements into well-structured conceptual, logical, physical, and dimensional models while defining semantic layers, governance standards, and reusable data artefacts to ensure consistency, performance, and regulatory alignment across data domains.
What you will do:
- Design conceptual, logical, physical, and dimensional data models following Kimball and Inmon methodologies.
- Translate complex business requirements into scalable and well-structured enterprise data models.
- Define and maintain enterprise semantic layers aligned to lakehouse and medallion architectures.
- Establish naming standards, modeling conventions, and reusable artefacts across domains.
- Collaborate with data engineers to guide physical implementation aligned with performance and modeling standards.
- Validate data structures using SQL, data profiling, and quality assessment techniques.
- Align data models with metadata management, lineage, and governance frameworks.
- Support BI, analytics, and AI use cases through effective semantic and dimensional modeling.
- Assess data quality issues and recommend model-level improvements to enhance trust and usability.
- Communicate modeling standards and decisions clearly to engineers, analysts, and business stakeholders.
- Contribute to data practices within federated or domain-driven environments such as Data Mesh.
What you will bring:
- Expertise in conceptual, logical, physical, and dimensional data modeling methodologies.
- Strong understanding of data warehousing principles including star and snowflake schemas, fact tables, and slowly changing dimensions.
- Proven experience designing semantic layers for lakehouse architectures using tools such as dbt Semantic Layer, Power BI, or LookML.
- Experience working with Delta Lake, Parquet, medallion architectures, and distributed data platforms.
- Proficiency with data modeling tools such as ER/Studio, erwin, Sparx EA, or similar.
- Strong SQL skills for validation, profiling, and model verification.
- Knowledge of metadata management, lineage, and governance practices in regulated environments.
- Ability to define standards and reusable data assets across enterprise data domains.
- Experience identifying and resolving data quality challenges at the modeling level.
- Strong communication skills to bridge technical and non-technical stakeholders.
- Familiarity with Data Mesh or federated data operating models.
All your information will be kept confidential according to EEO guidelines.
Candidates must be legally authorized to live and work in the country where the position is based, without requiring employer sponsorship.
HelloKindred is committed to fair, transparent, and inclusive hiring practices. We assess candidates based on skills, experience, and role-related requirements.
We appreciate your interest in this opportunity. While we review every application carefully, only candidates selected for an interview will be contacted.
HelloKindred is an equal opportunity employer. We welcome applicants of all backgrounds and do not discriminate on the basis of race, colour, religion, sex, gender identity or expression, sexual orientation, age, national origin, disability, veteran status, or any other protected characteristic under applicable law.