Senior Data Engineer

TL;DR

Own and build a production-grade data platform that enhances internal analytics and customer insights, ensuring regulatory compliance and operational efficiency.

Job title: Senior Data Engineer
Employment type: Permanent, full-time
Location and remote rules: Remote with the option to use the Edinburgh office.


About BR-DGE
BR-DGE is an award-winning FinTech founded in Edinburgh. Our platform enables enterprise e-commerce and technology businesses to design, optimise, and scale complex payment infrastructures with confidence. We operate in a regulated, technically demanding environment where reliability and execution quality matter. Our products increasingly rely on high-quality data to support both internal decision making and merchant-facing insight experiences. Where relevant, we position ourselves clearly as compliance-minded and white hat.

Why this role exists
  1. Data is becoming a critical part of BR-DGE’s next growth phase, powering internal analytics and customer-facing insights and monitoring.
  2. The data engineering space is largely greenfield. We need a production-grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling.
  3. The robustness, scalability, and governance of our data architecture impact our ability to grow safely and meet regulatory expectations.
  4. This role owns the insights data platform, while partnering closely with Analytics, Product, and Engineering to ensure the platform delivers trusted datasets and timely signals.


What you will do:
1. Design and ship a tiered data platform that supports multiple latency needs, including low-latency pipelines for operational monitoring and customer-facing insights, plus batch pipelines for reporting and deeper analysis.
2. Build and own end-to-end ingestion patterns across batch, micro-batch, and selected near-real-time use cases, with strong orchestration and dependency management.
3. Implement schema evolution, data contracts, and approaches for late-arriving and corrected data so consumers can trust the outputs.
4. Treat curated datasets as products that are well defined, documented, reliable, and safe to use for both internal and external consumers.
5. Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
6. Ensure the platform meets regulated fintech and payments expectations for access control, security, and governance while staying cost-efficient as volumes grow.
7. Partner with Product and Engineering on event and domain modelling. Decide what data gets emitted and what latency and granularity are needed for analytics and product goals.
8. Support Data Science with reliable, feature-ready datasets and pragmatic collaboration, without owning reporting or business analysis.
9. Evolve the current lightweight tooling into a more observable, structured platform. Improve standards without creating unnecessary platform complexity.
10. Automate data infrastructure and workflows using infrastructure as code and CI/CD practices.

What we are looking for

Must have
1. Proven experience designing, building, and operating production-grade data pipelines and platforms.
2. Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
3. Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
4. Experience designing data models for analytics and reporting workloads.
5. Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
6. Strong experience with AWS-based data platforms, with hands-on use of services such as S3, Glue, Athena, Redshift, Kinesis, EMR, or MSK.
7. Infrastructure as code experience using Terraform or CloudFormation, and comfort running systems in production.
8. Ability to collaborate across Engineering, Product, Analytics, and Data Science, and drive standards through influence.

Nice to have
  1. Experience building customer-facing data products where latency and correctness affect user outcomes.
  2. Experience in regulated fintech or payments environments, especially around access control and auditability.
  3. Experience with cost and performance optimisation at scale in AWS data stacks.
Tech context

This role will work across ingestion, orchestration, modelling, governance, and observability in an AWS-centric environment, with PostgreSQL and modern data tooling. Current tooling is intentionally lightweight, and the platform is evolving as BR-DGE grows. In some areas you do not need to be hands-on day to day, but you must be fluent enough to make strong technical decisions and review work.

What we offer
  • Flexible, remote-first working
  • 33 days holiday, including public holidays
  • Birthday off
  • Family healthcare
  • Life insurance
  • Employee assistance programme
  • Investment in learning and development
  • Regular team events and off-sites
  • A collaborative culture where documentation is treated as a first-class product


