Data Operations Engineer
TL;DR
Develop data pipelines and automate tasks to enhance data operations for a company dedicated to creating clean energy solutions and combating climate change.
- Work as an Engineer on our analytics engineering team, primarily developing in Python and leveraging a wide range of technologies, notably: AWS and GCP, Docker, Apache Airflow, Apache Spark, and PostgreSQL
- Take problems from inception to completion: own the building, testing, deployment, and maintenance of the code you work on
- Tackle complex problems that span a wide range of technical abilities, including:
  - Developing data pipelines to transform and process data between systems
  - Productionizing machine learning pipelines that leverage billions of rows of data
  - Scaling our software to handle ever-growing volumes of customer data
- Implement monitoring, alerting, and logging systems for data pipelines
- Automate routine data operations tasks and optimize workflows for scalability and efficiency
- Work effectively on an Agile team and collaborate with data engineering, analytics, and DevOps teams to support data infrastructure
- Build robust, well-documented processes to facilitate data triage and associated fixes
- Participate in on-call rotation and incident response related to data system outages or failures
- Save the planet
Requirements
- A minimum of 3 years of professional experience developing in a modern programming language (Python preferred)
- Solid knowledge of ETL and data integration
- An appreciation for testing and a commitment to developing quality software
- Strong critical thinking skills and a desire to work with ambiguous challenges
- Experience working in an Agile environment and a strong understanding of the full SDLC
- Strong troubleshooting skills that span the full-stack (front-end clients, APIs, networking, DNS, Linux, containers, databases, distributed systems, etc.)
- Experience deploying production applications on at least one major cloud provider (AWS, GCP, Azure)
- Experience writing and maintaining data pipelines and ETLs leveraging Spark or similar tooling
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
Nice to Have
- Experience in the utility industry
- Experience working cross-functionally with design, product, customer success, sales, etc.
- Deep technical knowledge of Python, AWS/GCP, Docker, and/or PostgreSQL
Why Join Us
- Make a Meaningful Impact: Your work directly impacts our mission of decarbonization and building a more sustainable future.
- Grow Your Career: We offer ample advancement opportunities, robust learning and development programs, and a supportive team environment that fosters collaboration and innovation.
- Thrive: We offer comprehensive benefits, including flexible time off, generous parental leave, a wellness stipend, and work flexibility to help you thrive both personally and professionally.
- Belong to an Inclusive Community: We celebrate diversity and foster an inclusive workplace where everyone feels respected, empowered, and heard. Our Employee Resource Groups offer opportunities to connect with colleagues who share your interests and backgrounds.
- Be Part of a Growing Movement: Join a team of dedicated individuals who are passionate about creating a more sustainable future. We offer a collaborative environment where your ideas are valued and your contributions recognized. Together, we can build a brighter tomorrow.
Benefits
Flexible Work Hours
We offer work flexibility to help you thrive both personally and professionally.
Paid Parental Leave
We offer generous parental leave.
Wellness Stipend
We offer a wellness stipend.
TENDRIL builds software to manage energy resources in homes and businesses, integrating smart technologies like thermostats, electric vehicles, and solar panels. Our platform helps users generate, shift, or save energy, optimizing efficiency and enhancing grid reliability. We're focused on delivering solutions that support the transition to clean energy while reducing costs for consumers.
- Employees: 51-200
- Industry: Internet Software & Services
- Total raised: $130M