Our centralized Production Operations team is expanding, and we are keen to identify a DevOps specialist with a strong interest in Big Data.
GiG harnesses the power of Big Data across the iGaming ecosystem, and this role would be key to maintaining and improving an existing solution that is constantly evolving with business requirements.
You’d be really excited to:
- Script isolated environments for QA, Staging, User Acceptance Testing and Production in a platform-agnostic way.
- Maintain and improve the governance of the existing data stack, ranging from data engineering tools and data science modelling to Business Intelligence reporting toolsets.
- Develop processes for the development teams so that features can be automatically tested and merged into a single code base.
- Set up and maintain platforms such as ticketing systems to manage requests from different verticals within the company.
- Administer permissions throughout the team.
- Run proofs of concept (POCs) on different approaches to deployment automation with cutting-edge technologies such as ClickHouse, Apache Spark, the Confluent Platform (mainly Avro, KSQL, Kafka Connect and Kafka), Apache Ignite, Apache NiFi and NiFi Registry.
- Suggest tools for increased productivity, security, reliability, and performance.
- Maintain and develop monitoring tools and agents such as Grafana for application performance and security monitoring.
- Help with CI/CD processes.
- Maintain Kubernetes clusters that support our containerised Spark and Spring applications.
- Work with third parties on guidance and systems integration, applying best practices with tools such as ClickHouse or Kafka.
- Provide guidance and direction to the development team and follow best practices.
- Enhance the current code stack to support versioning, tagging and pipeline automation.
Who you are:
- Hold a Bachelor’s Degree in Computer Science or equivalent.
- Experience in a similar role or with automation of Apache projects.
- Have worked with both Linux and Windows environments.
- A strong background in Big Data technologies such as Hadoop, HDFS, Apache Kafka, Apache Spark and Apache NiFi.
- A strong understanding of DevOps methodologies and concepts.
- Strong infrastructure knowledge on cloud platform providers such as Azure, GCP or AWS.
- Fluent with UNIX.