ABOUT US:
Revelator is a leading provider of business solutions for the music industry. Our all-inclusive music distribution platform, API, protocol, and web3 infrastructure enhance efficiency in music distribution and financial reporting, and simplify royalty operations. We offer a wide range of services, including catalog management, supply chain, income tracking, rights management, and business intelligence. By leveraging our innovative solutions, music businesses can easily navigate the evolving landscape and capitalize on new opportunities.
THE ROLE:
The Data Ops Engineer is responsible for the day-to-day technical development and delivery of data pipelines into Revelator’s data and analytics platform. You will deliver solutions built on sound architecture and data engineering best practices for operational efficiency, security, reliability, performance, and cost optimization.
Key Responsibilities:
- Design, build, and optimize data engineering pipelines that extract data from different sources and applications and feed it into the cloud data platform.
- Build, test, and productize data extraction, transformation, and reporting solutions within the cloud platform.
- Provide accurate and timely information that can be used in day-to-day operational and strategic decision-making.
- Code, test, and document new or modified data models and ETL/ELT tools to create robust and scalable data assets for reporting and analytics.
- Contribute to our ambition to develop a best-practice Data and Analytics platform, leveraging next-generation cloud technologies.
- Define and build the data pipelines that will enable faster, better, data-informed decision-making within the business.
- Ensure data integrity in reports and dashboards by reviewing data, identifying and resolving gaps and inconsistencies, and escalating as required, fostering a partnered approach to data accuracy for business reporting.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or related field.
- 5+ years of relevant work experience as a Data Ops/Data Integration Engineer, including:
- Building ETL/ELT solutions for large-scale data pipelines.
- Strong SQL and database management expertise (Azure SQL Server), including performance optimization.
- Experience with CI/CD for data pipelines.
- Applying DataOps practices to develop data flows and support the continuous use of data.
- Data modeling.
- Data analysis.
- Developing technical and support documentation, and translating business requirements into reporting and data models.
Required Technical Skills:
- Azure Data Factory
- PowerBI, PowerBI scripting & automation
- Snowflake, Snowpipe
- .NET / C#
- Python
- SQL, Stored Procedures
Other Skills:
- Excellent problem-solving skills and the ability to work independently.
- Strong teamwork and collaboration skills with the ability to lead and mentor junior developers.
- Exceptional communication skills, both written and verbal in English.