Type: Contract, per-project.
Location: Remote (LATAM, within ±1 hour of Eastern Time)
Availability: Contractor (40 hours per week)
Job Title: Data / Platform Engineer (AWS & Data Pipelines)
We are looking for a highly motivated Data / Platform Engineer to join our team and help design, build and operate scalable data pipelines and cloud-based solutions.
In this role, you will work closely with engineering and product teams to build both streaming and batch data pipelines, contribute to system design, and help drive automation and monitoring across our data platform.
Key Responsibilities
Design, build and operate scalable streaming and batch data pipelines, with a strong focus on maintaining, monitoring, troubleshooting and continuously improving existing pipelines.
Work with AWS services, including Redshift, EMR and ECS, to support data processing and analytics workloads.
Develop and maintain data workflows using Python and SQL.
Orchestrate and monitor pipelines using Apache Airflow.
Build and deploy containerized applications using Docker and Kubernetes.
Break down high-level system designs into well-defined, deliverable tasks with realistic estimates.
Collaborate with cross-functional teams in a fast-paced and distributed environment across the US and Europe.
Drive automation, observability and monitoring to improve reliability, performance and operational efficiency.
Support knowledge transfer and ownership handover as part of the planned transition to the consuming team.
Required Qualifications
Strong professional experience with Python and SQL.
Hands-on experience with AWS, specifically Redshift, EMR and ECS. AWS experience is mandatory; experience with other cloud providers is not considered equivalent for this role.
Proven experience building and operating both streaming and batch data pipelines.
Professional experience with Apache Airflow, Docker and Kubernetes.
Ability to translate high-level system designs into actionable technical tasks and realistic estimates.
Comfortable working in dynamic and fast-paced environments and in distributed teams.
Strong interest in automation and monitoring.
Strong hands-on experience with Apache Spark.
Senior-level profile with strong autonomy, communication skills and ability to work effectively in distributed teams.
Proven ability to transfer knowledge and support ownership handovers.
Fluent or professional working proficiency in English (both written and spoken).
Nice to Have
Previous experience in the telecom industry.
Experience with machine learning systems and/or event-driven architectures.
Experience with Apache Iceberg.
(*) SOUTHWORKS only hires individuals from countries that are not blocked or sanctioned by the United States, including those identified by the United States Office of Foreign Assets Control (OFAC).
SOUTHWORKS is a global software development partner offering a Development on Demand model for accelerated business growth.