Our client is the world's first Performance Branding company, partnering with brands such as The North Face, Timberland, Movado Watches, and Jose Cuervo to drive business growth through innovative marketing strategies. Their integrated operating model collapses the traditional marketing silos between creative and media, performance and brand, and across media channels, and their full suite of offerings spans media, creative, SEO, Lifecycle, Retail Media, Affiliate, and Influencer marketing.
We are seeking an experienced Data Analytics Engineer with 2-6 years of experience to join their dynamic and growing team. The ideal candidate will have a strong background in Python, SQL, data warehousing platforms such as Snowflake or Google BigQuery, ETL processes, API integrations, and dbt.
Responsibilities
- Design, develop, and maintain scalable and robust ETL pipelines using Python, SQL, and other relevant technologies.
- Work with data warehousing platforms such as Snowflake or Google BigQuery to manage and optimize data and ensure its integrity and consistency.
- Utilize dbt for data modeling and transformation to support analytics and data science initiatives.
- Integrate various data sources, including third-party APIs, into the data ecosystem.
- Collaborate with data scientists, analysts, and other stakeholders to understand data needs and implement solutions.
- Monitor and ensure performance, uptime, and scalability of data systems and processes.
- Document, test, and maintain data workflows and codebase.
- Participate in code reviews, share knowledge, and mentor junior team members.
- Stay up to date with the latest trends in data engineering and continuously seek opportunities to innovate and optimize existing processes.
Requirements
- Bachelor’s degree in Computer Science, Engineering, Business/Finance, or a related field.
- 2-6 years of hands-on experience in data engineering, analytics, or a similar role.
- Proficiency in Python and SQL.
- Experience with data warehousing platforms such as Snowflake or Google BigQuery.
- Familiarity with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Solid understanding of ETL processes and tools.
- Familiarity with dbt for data modeling and transformation.
- Experience in integrating and working with APIs.
- Strong analytical and problem-solving skills.
- Effective communication skills, both written and verbal, with the ability to work in cross-functional teams.
- Strong attention to detail and a commitment to producing high-quality results.
Preferred Requirements
- Master's degree in a related field.
- Experience with data visualization tools such as Tableau, Power BI, or Sigma Computing.
- Knowledge of other programming languages or tools relevant to the field.
- Understanding of data science methodologies.
Additional Information
- This role is available in both Mexico City and Bogotá (Hybrid)
- Unlimited vacation policy
- Monthly phone/internet and food stipend
- Health insurance coverage
- Professional Development Program
The interview process includes, but is not limited to, the following:
- Excel knowledge
- Typing speed test