Enterprise Data Architect

TLDR

Support the design and optimization of large-scale data engineering and analytics solutions across the enterprise, collaborating with business and technology stakeholders.

Role: Enterprise Data Architect
Location: Dallas, Pittsburgh, Cleveland
Experience: 12+ years
Duration: Full-time

We are seeking an Enterprise Data Architect to support the design, delivery, and optimization of large-scale data engineering, analytics, and AI-enabled solutions across the enterprise. This role partners closely with business, technology, and architecture teams to translate complex data requirements into scalable, secure, and compliant solutions.
 
Key Responsibilities:
Enterprise Data Analysis & Solution Delivery
  • Partner with business and technology stakeholders to analyze enterprise data requirements and translate them into scalable data engineering and analytics solutions.
  • Design, build, and support end-to-end data pipelines, including data ingestion, preprocessing, normalization, transformation, quality checks, and loading across complex data ecosystems.
  • Lead and contribute to ETL/ELT development using technologies such as Spark, Hadoop, Hive, Kafka, Python, and Scala, ensuring performance, reliability, and data accuracy (an illustrative pipeline sketch follows this list).
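
For illustration only, the following is a minimal PySpark sketch of the ingest/normalize/quality-check/load pattern described above. The HDFS paths, column names, and the 1% quality threshold are assumptions for the example, not project specifics.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer_etl").getOrCreate()

# Ingest: raw CSV files landed in HDFS (path and schema are hypothetical)
raw = spark.read.option("header", True).csv("hdfs:///landing/customers/")

# Normalize: standardize casing/whitespace and parse the signup date
clean = (
    raw.withColumn("email", F.lower(F.trim(F.col("email"))))
       .withColumn("signup_date", F.to_date("signup_date", "yyyy-MM-dd"))
)

# Quality gate: fail the batch if more than 1% of rows lack a primary key
total = clean.count()
missing = clean.filter(F.col("customer_id").isNull()).count()
if total == 0 or missing / total > 0.01:
    raise ValueError(f"Quality gate failed: {missing}/{total} rows missing customer_id")

# Load: write curated, partitioned Parquet for downstream analytics
clean.write.mode("overwrite").partitionBy("signup_date").parquet("hdfs:///curated/customers/")
```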
Data Platforms & Architecture
  • Work with distributed data platforms including HDFS, HBase, Sqoop, Flume, and MapReduce, supporting both batch and real-time processing use cases.
  • Apply strong data modeling and data design principles to support analytics, reporting, regulatory, and operational needs.
  • Collaborate with enterprise architects on logical and physical data models aligned with PNC standards.
Data Quality, Governance & Compliance
  • Support and implement data quality frameworks, including profiling, validation rules, reconciliation, and monitoring, to ensure trusted and compliant data (a brief reconciliation sketch follows this list).
  • Collaborate with cross-functional teams to ensure solutions align with enterprise architecture, security, governance, and regulatory requirements.
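
As a small illustration of the reconciliation checks mentioned above, the sketch below compares row counts between a raw and a curated layer; the database and table names are hypothetical placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("recon_check").getOrCreate()

# Reconcile row counts between raw and curated layers
# (database and table names are hypothetical)
src_count = spark.table("raw_db.transactions").count()
tgt_count = spark.table("curated_db.transactions").count()

if src_count != tgt_count:
    raise ValueError(f"Reconciliation failed: source={src_count}, target={tgt_count}")
print(f"Reconciliation passed: {src_count} rows in both layers")
```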
Cloud, Analytics & AI Enablement
  • Contribute to cloud-based data solutions, particularly on AWS, supporting data processing, analytics, and ML workloads.
  • Collaborate with data scientists and ML engineers to enable machine learning and AI use cases, including feature engineering, data preparation, and pipeline integration.
  • Support development and deployment of ML and AI systems, including exposure to LLM-based solutions, feature stores, and ML lifecycle management tools.
MLOps & Agile Delivery
  • Participate in or support MLOps practices, including model deployment, monitoring, retraining pipelines, and integration with platforms such as SageMaker, MLflow, Kubeflow, or similar tools (a short experiment-tracking sketch follows this list).
  • Work in Agile delivery environments, actively participating in sprint planning, stand-ups, reviews, and retrospectives using tools such as Jira.
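
The sketch below shows, under assumed names, how a training run might be tracked with MLflow, one of the tools listed above; the experiment name, model, and metric are placeholders rather than a prescribed setup.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; a real pipeline would read prepared features
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("churn-model")  # hypothetical experiment name
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for later deployment
```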
Stakeholder Engagement & Consulting
  • Serve as a client-facing consultant, coordinating across the SDLC and communicating technical concepts clearly to both technical and non-technical stakeholders.
  • Contribute to solution design, estimation, proofs of concept (POCs), and client proposals, helping shape data, analytics, and AI modernization initiatives.
People & Capability Development
  • Mentor junior team members, support onboarding, and promote best practices in data engineering, analytics, and platform design.
  • Foster collaboration across teams to support continuous improvement and delivery excellence.
 
Qualifications & Experience
  • 12+ years of experience in data engineering, data analytics, or enterprise data consulting.
  • Strong hands-on experience with big data and distributed data platforms.
  • Proficiency in Python, with experience in streaming and real-time data processing.
  • Solid understanding of data modeling, ETL/ELT design, and data quality practices.
  • Experience supporting cloud-based data platforms, preferably AWS.
  • Exposure to machine learning, AI, and MLOps concepts preferred.
  • Experience working in Agile/Scrum environments.
  • Strong communication and consulting skills with experience working in client-facing roles.
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.

Qode is a technology-driven platform that transforms how recruiters and candidates connect by leveraging data and automation. Our solutions streamline the hiring process through machine learning, creating private talent pools and automating workflows, ultimately enhancing the quality of candidate evaluation and decision-making. With our no-code tools, we empower organizations to develop tailored recruitment strategies without needing extensive technical skills.
