At R2, we believe that small and medium businesses (SMBs) are the productive engine of society. SMBs make up over 90% of companies in Latin America, yet they face a trillion-dollar credit gap. Our mission is to unlock SMBs’ potential by providing financial solutions tailored to their needs. We are reimagining the financial infrastructure of Latin America so that SMBs’ financial needs are met without their ever having to go to a bank.
R2 enables platforms in Latin America to embed financial services that SMBs can then leverage (starting with revenue-based financing). We are a tight-knit team coming from organizations such as Google, Amazon, Nubank, Uber, Capital One, Mercado Libre, Globant, and J.P. Morgan. We are entering a new phase of growth following a strategic investment from Ant International, focused on rapidly expanding our partner footprint, strengthening our credit and underwriting capabilities, and scaling our operations across multiple markets.
As a Data Architect, you will operate as a lead individual contributor, owning complex data systems end-to-end and shaping the data engineering strategy of the company. This role requires strong system-level thinking, deep technical expertise, and the ability to influence direction, standards, and execution across teams—without direct people management accountability.
You will act as a technical driver, setting best practices, mentoring others, and ensuring data platforms scale reliably with the business.
What you will do
- Architect, build, and own scalable ELT pipelines, data orchestration, and the enterprise data warehouse.
- Lead requirements analysis and solution design, ensuring an end-to-end view of the data ecosystem.
- Own the operation, reliability, and monitoring of production data pipelines.
- Drive data projects from ideation through execution, acting as the primary owner and decision-maker.
- Design and implement standardized, analytics-ready data models that support reporting, experimentation, and advanced analytics.
- Build reusable ingestion and integration frameworks for internal systems and external partners.
- Establish and continuously improve automation, orchestration, and observability across data workflows.
- Define and enforce data engineering standards, including modeling, testing, documentation, and deployment.
- Integrate AI-enabled tooling and agents where appropriate to improve productivity and data operations.
- Maintain high-quality technical documentation for architecture, pipelines, and processes.
- Act as a technical mentor for Data Engineers and Analytics Engineers.
- Provide guidance on design decisions, code quality, and architectural trade-offs.
- Lead by influence—reviewing designs, unblocking teams, and driving alignment.
- Support the maturation of the data team by modeling best practices and ownership mindset.
Who you are
- 5–8+ years of experience in Data Engineering or closely related roles.
- Proven ability to design complex, production-grade data systems.
- Strong software engineering background, with deep expertise in Python.
- Advanced proficiency in SQL and dimensional/analytical data modeling.
- Hands-on experience with ELT patterns and tools such as Airbyte, Fivetran, or similar.
- Strong experience with data orchestration platforms, preferably Airflow.
- Deep experience with dbt for transformations and modeling.
- Comfortable owning systems in production, including monitoring, alerting, and incident response.
- Strong experience collaborating with Analytics, Product, Risk, and ML teams.
- Able to translate business requirements into scalable technical solutions.
- Highly autonomous, proactive, and accountable.
Tech Stack (Preferred)
- Snowflake (strong preference)
- Airflow
- dbt
- AWS
Experience with Snowflake-based architectures is a key differentiator. Profiles primarily centered on Databricks without Snowflake exposure may not be a strong fit.
Bonus Points
- Experience integrating AI/ML tooling or agents into data workflows.
- Experience designing self-serve analytics platforms.
- Experience with fintech, lending, or risk-related data models.
- Experience building multi-region or multi-entity data platforms.
Location: São Paulo, BR; Santiago, CL; Buenos Aires, AR; or Bogotá, CO.