Working At Voyc
Voyc, an award-winning leader in contact centre AI software, is seeking a Data Engineer with expertise in stream processing pipelines and a deep understanding of Elasticsearch to join our innovative and dynamic team. Our mission is to help financial services companies enhance customer service and ensure compliance by revolutionising the contact centre quality assurance process through cutting-edge AI technology. By transcribing and analysing customer interactions, our AI solution identifies potential problems, enabling our clients to handle customer complaints and regulatory compliance breaches efficiently. Our solution empowers financial institutions to monitor 100% of customer interactions, a critical need in an industry where compliance and customer service are paramount.
Responsibilities
As a Data Engineer at Voyc, specialising in analytics pipelines and Elasticsearch, you will play a pivotal role in advancing our data infrastructure and analytics capabilities. Your responsibilities will include:
- Designing, implementing, and maintaining robust data pipelines, ensuring the efficient and reliable flow of data across our systems.
- Developing and maintaining Elasticsearch clusters, fine-tuning them for high performance and scalability.
- Collaborating with cross-functional teams to extract, transform, and load (ETL) data into Elasticsearch for advanced analytics and search capabilities.
- Troubleshooting data pipeline and Elasticsearch issues, ensuring the integrity and availability of data for analytics and reporting.
- Participating in the design and development of data models and schemas to support business requirements.
- Continuously monitoring and optimising data pipeline and Elasticsearch performance to meet growing data demands.
- Collaborating with data scientists and analysts to enable efficient data access and query performance.
- Contributing to the evaluation and implementation of new technologies and tools that enhance data engineering capabilities.
- Demonstrating strong analytical, problem-solving, and troubleshooting skills to address data-related challenges.
- Collaborating effectively with team members and stakeholders to ensure data infrastructure aligns with business needs.
- Embodying the company values of playing to win, putting people over everything, driving results, pursuing knowledge, and working together.
- Implementing standards, conventions, and best practices.
Our Stack
As a Data Engineer with a focus on Kafka pipelines and Elasticsearch, you will work with the following technologies (a brief illustrative sketch of how some of them fit together follows the list):
Data Pipelines:
- Kafka / ksqlDB
- Python
- Redis
Data Storage and Analysis:
- Elasticsearch (cluster management and optimisation)
- AWS S3
- PostgreSQL
DevOps:
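To give a flavour of the day-to-day work, here is a minimal sketch of a Python service that consumes events from a Kafka topic and indexes them into Elasticsearch. This is illustrative only, not Voyc's actual code: the topic name, index name, connection settings, and client libraries (kafka-python and elasticsearch-py 8.x) are assumptions.

```python
# Minimal illustrative sketch: consume JSON events from a Kafka topic and
# index them into Elasticsearch so they become searchable for analytics.
# Topic name, index name, and hosts are hypothetical placeholders.
import json

from kafka import KafkaConsumer          # kafka-python
from elasticsearch import Elasticsearch  # elasticsearch-py 8.x

consumer = KafkaConsumer(
    "call-transcripts",                           # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw),
    auto_offset_reset="earliest",
)
es = Elasticsearch("http://localhost:9200")

for message in consumer:
    doc = message.value
    # Index each event into a hypothetical "transcripts" index.
    es.index(index="transcripts", document=doc)
```

In practice this kind of consumer would also handle batching, retries, schema validation, and monitoring, which is where much of the role's pipeline and cluster-tuning work comes in.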
Requirements
Skills and Requirements
To excel in this role, you should possess the following qualifications and skills:
- Proven experience in designing and implementing data pipelines.
- Experience with end-to-end testing of analytics pipelines.
- Expertise in managing and optimising Elasticsearch clusters, including performance tuning and scalability.
- Strong proficiency with data extraction, transformation, and loading (ETL) processes.
- Familiarity with data modelling and schema design for efficient data storage and retrieval.
- Good programming and scripting skills using languages like Python, Scala, or Java.
- Knowledge of DevOps and automation practices related to data engineering.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Strong analytical and problem-solving abilities, with a keen attention to detail.
- A commitment to staying up-to-date with the latest developments in data engineering and technology.
- Alignment with our company values and a dedication to driving positive change through data.
Bonus Points
Nice to have, but not required:
- Experience with data engineering in an agile / scrum environment.
- Familiarity with ksqlDB / Kafka or other stream processing frameworks.
- Familiarity with data lakes and how to query them.
- Experience with integrating machine learning models into data pipelines.
- Familiarity with other data-related technologies and tools.
Benefits
What’s in it for You?
- Flexible remote work options
- Competitive compensation package
- Impactful work that directly benefits customers and companies
- Inclusive and diverse work environment
- Opportunity to make a lasting difference while advancing your career
- Highly regarded company culture
- Investment in upskilling and professional development
If you're a passionate Data Engineer with expertise in Kafka pipelines and a thorough understanding of Elasticsearch, looking to contribute to cutting-edge technology and make a difference in the financial services industry, we invite you to join our motivated and purpose-driven team at Voyc. Apply now to be part of our journey in transforming customer interactions and compliance monitoring through data innovation.