
Voyc

South Africa

Posted on: 20 October 2023

Experience: n/a

Work: n/a

Employee Type: n/a

Salary Range: n/a

Senior Data Engineer

Voyc, an award-winning leader in contact centre AI software, helps financial services companies stay compliant and improve customer service by monitoring 100% of their customer interactions. Voyc uses artificial intelligence to transcribe, analyse and review interactions automatically, identifying and handling problematic interactions (customer complaints and interactions with vulnerable customers) in line with regulatory requirements. The AI solution “listens” for potential problems and compliance breaches and alerts users (supervisors and quality assurance teams) to calls that need further review and remediation.

Financial services companies have a huge compliance problem, particularly in the context of contact centre quality assurance. These companies struggle to monitor 100% of customer interactions due to the high cost of employing more and more quality assurance assessors. As a result, the average company monitors only 4-5% of customer interactions. This results in a large number of unchecked interactions, including complaints, compliance breaches and poor agent behaviour.

That’s where Voyc comes in! Voyc has built strong compliance models and cutting-edge software to revolutionise the contact centre quality assurance process. In 2021, Voyc was selected as the winner of the Accenture Blue Tulip Awards and the KPMG Digital Innovation Challenge. All of this has been achieved by our team of highly-motivated, diverse and purpose-driven individuals.


Job Description
Are you a passionate Senior Data Engineer with expertise in Kafka pipelines and a thorough understanding of Elastic, looking to contribute to cutting-edge technology and make a difference in the financial services industry? If so, we have the perfect opportunity for you! Voyc is a SaaS start-up focused on using cutting-edge AI, machine learning and other technology to deliver a product that promotes positive change in society by increasing Consistency and Care in call centres for regulated companies. You can find out more about us at voyc.ai, and meet the team and dig into our values on our About Us page.


To further our mission of promoting Consistency and Care, we’re looking for a talented and motivated Senior Data Engineer to join our team. In this role, you will have the chance to work on exciting projects, collaborate with experienced developers, and contribute to a positive and dynamic company culture. If the prospect of solving unique and challenging problems appeals to you, we encourage you to apply for this opportunity.


Responsibilities

As a Data Engineer at Voyc, specialising in Kafka pipelines and Elasticsearch, you will play a pivotal role in advancing our data infrastructure and analytics capabilities. Your responsibilities will include:

  • Designing, implementing, and maintaining robust data pipelines using Kafka, ensuring the efficient and reliable flow of data across our systems (a brief illustrative sketch follows this list).
  • Leveraging your expertise in stream processing to optimise real-time data processing and streaming, enhancing our ability to analyse customer interactions.
  • Developing and maintaining Elasticsearch clusters, fine-tuning them for high performance and scalability.
  • Collaborating with cross-functional teams to extract, transform, and load (ETL) data into Elasticsearch for advanced analytics and search capabilities.
  • Troubleshooting data pipeline and Elasticsearch issues, ensuring the integrity and availability of data for analytics and reporting.
  • Participating in the design and development of data models and schemas to support business requirements.
  • Continuously monitoring and optimising data pipeline and Elastic performance to meet growing data demands.
  • Collaborating with data scientists and analysts to enable efficient data access and query performance.
  • Contributing to the evaluation and implementation of new technologies and tools that enhance data engineering capabilities.
  • Demonstrating strong analytical, problem-solving, and troubleshooting skills to address data-related challenges.
  • Collaborating effectively with team members and stakeholders to ensure data infrastructure aligns with business needs.
  • Embodying the company values of playing to win, putting people over everything, driving results, pursuing knowledge, and working together.
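
The first few responsibilities above describe a Kafka-to-Elasticsearch flow. Purely as a rough sketch (not Voyc's actual implementation), the minimal Python example below shows a consumer that reads transcribed-interaction events from a Kafka topic and bulk-indexes them into Elasticsearch. The topic name, index name, field names and connection details are all hypothetical, and the kafka-python and elasticsearch-py client libraries are assumed.

    import json

    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import bulk
    from kafka import KafkaConsumer

    # Hypothetical names and endpoints -- placeholders, not real configuration.
    TOPIC = "interaction-transcripts"
    INDEX = "interactions"

    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=["localhost:9092"],
        group_id="elastic-indexer",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
        enable_auto_commit=False,   # commit offsets only after a successful index
    )
    es = Elasticsearch("http://localhost:9200")


    def to_actions(records):
        """Turn consumed Kafka records into Elasticsearch bulk-index actions."""
        for record in records:
            doc = record.value
            yield {
                "_index": INDEX,
                "_id": doc.get("interaction_id"),  # assumed unique id keeps the load idempotent
                "_source": doc,
            }


    while True:
        # Poll a batch of messages; returns a dict of {TopicPartition: [records]}.
        batch = consumer.poll(timeout_ms=1000, max_records=500)
        records = [r for partition_records in batch.values() for r in partition_records]
        if not records:
            continue
        bulk(es, to_actions(records))  # index the batch in one bulk request
        consumer.commit()              # at-least-once delivery: commit after indexing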

Requirements

To excel in this role, you should possess the following qualifications and skills:

  • Proven experience in designing and implementing data pipelines using stream processing technologies.
  • Experience with end-to-end testing of analytics pipelines.
  • In-depth expertise in managing and optimising Elasticsearch clusters, including performance tuning and scalability.
  • Strong proficiency with data extraction, transformation, and loading (ETL) processes.
  • Familiarity with data modeling and schema design for efficient data storage and retrieval.
  • Proficiency in troubleshooting data pipeline and Elastic-related issues to ensure data integrity and availability.
  • Solid programming and scripting skills, particularly in languages like Python, Scala, or Java.
  • Knowledge of DevOps and automation practices related to data engineering.
  • Excellent communication and collaboration skills to work effectively with cross-functional teams.
  • Strong analytical and problem-solving abilities, with a keen attention to detail.
  • A commitment to staying up-to-date with the latest developments in data engineering and technology.
  • Alignment with our company values and a dedication to driving positive change through data.


Our Stack

As a Data Engineer with a focus on Kafka pipelines and Elastic, you will work with the following technologies:

Data Pipelines:

  • Kafka / stream processing
  • Python
  • Redis

Data Storage and Analysis:

  • Elasticsearch
  • Elasticsearch clusters management and optimisation
  • PostgreSQL

DevOps:

  • AWS
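
To give a flavour of the storage and analysis side of this stack, here is a small, hypothetical query sketch (assuming the elasticsearch-py 8.x client and made-up index and field names) that counts flagged interactions per day over the last week:

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

    # "interactions", "flagged" and "timestamp" are assumed, illustrative names.
    response = es.search(
        index="interactions",
        size=0,
        query={
            "bool": {
                "filter": [
                    {"term": {"flagged": True}},
                    {"range": {"timestamp": {"gte": "now-7d/d"}}},
                ]
            }
        },
        aggs={
            "per_day": {
                "date_histogram": {"field": "timestamp", "calendar_interval": "day"}
            }
        },
    )

    for bucket in response["aggregations"]["per_day"]["buckets"]:
        print(bucket["key_as_string"], bucket["doc_count"])

In practice, the shape of such queries would depend entirely on the real index mappings and the analytics the quality assurance teams need.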


Nice to have

The following are not required, but would be a plus:

  • Experience with data engineering in an agile / scrum environment.
  • Familiarity with ksqlDB.
  • Familiarity with data lakes and how to query them.
  • Experience with integrating machine learning models into data pipelines.
  • Familiarity with other data-related technologies and tools.

Benefits

As well as the opportunity to learn and expand your skills while making the world a better place, working at Voyc offers the following:

  • Caring, growth-focused team culture
  • Flexible working hours
  • Remote work
  • Company-sponsored lunches, travel and learning opportunities
  • An inclusive & representative workplace
  • Access to an outsourced HR partner specialising in career development and individual development plans
  • 15 working days of annual leave, increasing by 2 days per year
  • 1 day off for personal administrative tasks every 2 months
  • Equal paid maternity and paternity benefits
  • Annual offsite planning and reflection week away in South Africa
  • Visits to the Amsterdam head office, with flights and accommodation included
  • Personal development allowance of up to R6,000 per quarter
  • Opportunity to participate in our share option program
  • Opportunity to relocate to The Netherlands under certain conditions

Note: All benefits are subject to specific terms and conditions.


Company Values

  • Playing to win
  • Putting people over everything
  • Driving results
  • Pursuing knowledge
  • Working together

Tags

AWS
elasticsearch
java
kafka