Data Engineer (Middle/Senior)

Tech Stack

Airflow
dbt
SQL
Python
Cloud Services

Job Description

NeoGames is a leader in the iLottery and iGaming space, offering solutions spanning game studios, game aggregation, lotteries, online casino, sportsbook, bingo, and managed services, all delivered through an industry-leading core platform. The Data & BI team owns the group's Data & Analytics platforms, spanning Data Engineering, Analytical Engineering and Business Intelligence, and leads the group's data-driven modernisation both internally and for its clients.

The Data Engineer will play a vital role in a cross-functional team, developing data pipelines to ingest, transform, distribute and expose data from the group's Core Data Lake for integration, reporting, analytics and automation. The successful candidate will be passionate about building scalable data models and architecture for consumption by other teams, making it easy for BI, Analytics, Product and other data consumers to build data-driven solutions, features and insights.

Responsibilities:

Create data pipelines, both batch and real-time, to ingest data from disparate sources
Collaborate with other teams to address data sourcing and provisioning requirements
Design and monitor robust, recoverable data pipelines following best practices, with an eye on performance and reliability
Innovation drives us: carry out research and development and work on PoCs to propose, trial and adopt new processes and technologies
Coordinate with the Product & Technology teams to ensure all platforms collect and provide appropriate data
Liaise with other teams to ensure reporting and analytics needs can be addressed by the central data lake
Support the Data Quality and Security initiatives by building the necessary data access, integrity and accuracy controls into the architecture

Requirements:

3+ years of experience in Data Engineering
Degree in Computer Science, Software Development or Engineering
Proficient in Python; past exposure to Java will be considered an asset
Understanding of RDBMS, columnar and NoSQL engines and their performance characteristics
Experience with cloud architecture and tools: Microsoft Azure, AWS or GCP
Experience with orchestration and transformation tools such as Apache Airflow and dbt
Prior exposure to the Snowflake ecosystem will be considered an asset
Familiarity with Docker/Kubernetes and containerisation
Strong background in stream-processing technologies such as NiFi, Kinesis and Kafka
A grasp of DevOps concepts and tools, including Terraform and Ansible, is an advantage
Understanding of distributed logging platforms, ideally the ELK stack

Skills:

Fluency in spoken and written English is essential
Passionate about data and on the lookout for opportunities to optimise
Passionate about technology and eager to trial and recommend new tools or platforms

We offer:

High-level compensation and regular performance-based salary and career development reviews
The opportunity to work in a large, successful company
PE accounting and support
Medical (health) insurance and an employee assistance program
Paid vacation, holidays and sick leave
Sports compensation
English classes with native speakers; training and conference participation
Referral program
Team-building and corporate events