Hi, we’re AppFolio
We’re innovators, changemakers, and collaborators. We’re more than just a software company – we’re pioneers in cloud and AI who deliver magical experiences that make our customers’ lives easier. We’re revolutionizing how people do business in the real estate industry, and we want your ideas, enthusiasm, and passion to help us keep innovating.
As a Data Science Engineer on our Business Data Platform (BDP) team, you will own the operational backbone of our data infrastructure. You will play a critical role in ensuring resilient, secure, and scalable systems that power analytics, AI, experimentation, and business decision-making across the company. This role bridges platform engineering, data governance, and DevOps—enabling seamless data delivery and access with leading-class observability and compliance.
Your Impact
Own the design, build, and maintenance of observability and testing solutions for AppFolio’s Business Data Platform, identifying reliability, scalability, and data quality issues and continuously improving the platform.
Build, monitor, and maintain CI/CD pipelines and orchestration tools to ensure timely and accurate data delivery.
Manage service accounts, roles, and permissions across Snowflake, dbt, and BI tools.
Own the creation of Data Pipeline SLOs and performance against them.
Create and enforce data access controls, masking policies, and encryption standards (at rest and in transit).
Partner with InfoSec and Compliance to ensure auditability and privacy frameworks are implemented and upheld across the Business Data Platform.
Maintain lineage and metadata documentation across key data domains.
Implement and manage infrastructure-as-code for data platform components.
Drive automation of routine operations (e.g., environment provisioning, credential rotation, usage monitoring).
Partner with data engineers, data science engineers, data scientists, and analysts to improve platform usability and self-service capabilities.
Document and evangelize best practices for data access, job orchestration, and environment management.
Must Have
6+ years in DataOps, Data Platform Engineering, DevOps, or related roles within modern cloud data environments.
Proficiency in data engineering tools and technologies: SQL, Python, Airflow, MuleSoft, Linux scripting, and dbt.
Strong experience with cloud technology, especially the AWS stack (S3, EC2, EKS), as well as Docker and Kubernetes.
Experience with cloud data warehouse technology, such as Snowflake, including data security and governance
Advanced proficiency with Airflow
Strong knowledge of CI/CD pipelines, GitOps workflows, Codespaces, and infrastructure-as-code (e.g., Terraform).
Familiarity with security standards and practices for data (IAM, encryption, audit logging).
Familiarity with Snowflake RBAC.
Location
This role is a fully remote opportunity; however, you can find out more about our locations by visiting our website.
Compensation & Benefits
The compensation that we reasonably expect to pay for this role is $167,200 - $209,000 base pay. The actual compensation for this role will be determined by a variety of factors, including but not limited to the candidate’s skills, education, experience, and internal equity.
Please note that compensation is just one aspect of a comprehensive Total Rewards package. The compensation range listed here does not include additional benefits or any discretionary bonuses you may be eligible for based on your role and/or employment type.
Regular full-time employees are eligible for benefits - see here.