Senior Data Engineer - Freelance


Who we are: 

We are Boombit, a full-service agency and content studio that empowers companies to grow through strategy, creativity, technology services, and exceptional human talent. Since 2012, we have propelled companies forward by delivering quality, innovation, and purpose-driven work.


Job purpose:

We’re seeking a highly skilled Senior Data Engineer who thrives in complex data environments. You will design and maintain robust ETL and EDI solutions, ensuring seamless data flows across platforms. Your work will directly empower data-driven decision-making throughout the organization.


Job details: 

  • Location: 100% remote; open to Colombia-based candidates.
  • Schedule: Monday to Friday, 8:00 AM - 5:00 PM (Costa Rica time zone).
  • Work Model: Full-time flexible (freelance, billed on hours worked / on-demand engagement).
  • Language Proficiency: Spanish (native) / English (C1+ written, spoken, and reading).
  • Availability: Immediate availability preferred.


Key Responsibilities:

  • Participate in the creation and enhancement of EDI and ETL processes, using integration tools to support internal teams and business functions.
  • Maintain and monitor the full ETL ecosystem, ensuring high availability, data quality, consistency, lineage, and compliance with security and privacy standards.
  • Collaborate with stakeholders to understand business logic and deliver effective integration solutions. 
  • Generate reports and queries from various databases to support data-driven decision-making. 
  • Develop and maintain SQL and NoSQL queries, dashboards, and reporting solutions using tools such as Looker, Power BI, Tableau, or custom visualizations to empower stakeholders with actionable insights.
  • Continuously improve and automate data workflows, proactively identifying opportunities to enhance efficiency and reliability and resolving pipeline issues.
  • Work with APIs and integration tools to facilitate seamless data exchange between systems.
  • Collaborate with the Data Engineering Specialist on solution design, maintenance, and evolution.
  • Collaborate cross-functionally with technical and non-technical stakeholders, clearly communicating data concepts, project progress, and technical constraints to ensure alignment and impact.
  • Stay up to date with emerging trends, tools, and methodologies in data engineering, cloud platforms, and integration technologies.



Required Academic Background:

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • Certifications in data architecture, cloud services, or integration tools are valued.


Required Skills and Experience:

  • 5+ years of proven experience in Data Engineering, Data Integration, or related fields, designing and maintaining robust data pipelines in high-volume environments.
  • Expertise in building and orchestrating ETL/ELT pipelines using tools such as Apache Airflow, dbt, Talend, or custom scripting (e.g., Python, SQL) across structured and unstructured data sources.
  • Experience with API testing tools and integration development.
  • Strong command of data lake architecture and distributed data processing, with practical experience managing large-scale datasets (500M+ records) and optimizing storage in platforms like AWS S3, Azure Data Lake, or Google Cloud Storage.
  • Advanced proficiency in SQL (T-SQL, PL/SQL) and stored procedures, with ability to design efficient queries, indexes, partitions, and transformations in high-performance environments.
  • Analytical mindset with strong debugging and problem-solving skills, capable of identifying bottlenecks, optimizing pipelines, and ensuring end-to-end data reliability.
  • Competence in reporting and data manipulation using Excel, including complex formulas, pivot tables, data validation, and integration with external data sources.
  • Excellent communication skills, with the ability to articulate technical concepts clearly to non-technical stakeholders and collaborate effectively across distributed teams.
  • Bilingual communication skills, with native-level Spanish and advanced English proficiency (C1), ensuring effective collaboration in regional and international contexts.


Nice to have (not exhaustive):

  • Experience working in remote, distributed, or cross-cultural teams, with exposure to agile methodologies (Scrum, Kanban) and asynchronous collaboration tools (Jira, Confluence, Notion).
  • Familiarity with streaming data architectures (e.g., Kafka, AWS Kinesis, Apache Flink) and real-time processing frameworks.
  • Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, Redshift, Databricks) and orchestration platforms (e.g., Prefect, Dagster).
  • Relevant certifications, such as:
    • AWS Certified Data Analytics – Specialty.
    • Microsoft Certified: Azure Data Engineer Associate.
    • Google Cloud Professional Data Engineer.


Core Competencies:

At our company, we believe that success is not just about technical proficiency but also how you work with others and approach challenges. As part of our team, you’ll be expected to demonstrate the following key competencies:

  • Excellent communication and interpersonal skills: Ability to effectively communicate and collaborate with clients and team members. 
  • Problem-solving and critical thinking: Ability to manage complex projects and find solutions to ensure project success. 
  • Organizational skills: Strong attention to detail and ability to manage multiple clients and projects simultaneously. 
  • Resilience and adaptability: Ability to navigate challenges and adapt to shifting client needs. 
  • Proactive and resourceful: Anticipate client needs and take initiative to ensure high-quality service delivery. 


How to apply:

If you are passionate about building scalable data ecosystems and collaborating with cross-functional teams to drive real business impact, we’d love to connect with you. To apply, please follow these steps:

  • Submit your CV: Upload an updated, detailed CV highlighting your experience in data engineering, ETL/EDI development, integration tools, cloud platforms, and your technical stack.
  • Certifications (optional but recommended): Include copies of or links to relevant certifications (e.g., GCP, AWS, Azure, dbt, Snowflake) that validate your expertise in the field.