Data Engineer Cloudera CDP - Anankei

Brussels BE

Our Customer in the Telecom sector is looking for a Data Engineer Cloudera CDP (Streaming – Kafka).

As Data Engineer you will interact with business and IT stakeholders (architects, functional analysts, project managers, data scientists, developers) and participate in the data transformation program. As part of this program, we are re-engineering our enterprise data platform and machine learning solutions and moving to the Cloudera Data Platform (CDP) technology stack. In this re-engineering and migration, you will design and develop solutions for real-time data ingestion, processing and serving in a highly available environment. You will also work on several generic frameworks, e.g. a framework for logging, for access management, …

You are an IT professional with a minimum of five years of experience, including proven experience in big data engineering.
You have experience with the Cloudera products (HDFS, Ozone, Hive, Impala, Spark, Oozie, Atlas, Ranger, …). Experience with real-time technologies is mandatory (NiFi, Kafka, Flink, Spark Streaming, high-availability setups, …).
You have experience with CI/CD tooling (Git, Jenkins / GitLab CI, Ansible, Nexus).
You have experience with the Java and/or Python programming languages.
Experience with Power BI, Scala, Docker and Kubernetes is an advantage.
You are fluent in English, both spoken and written. Knowledge of Dutch and/or French is an advantage.
You can work autonomously, cooperate effectively with different teams onsite and offshore, are eager to learn new technologies, and enjoy sharing knowledge and documenting solutions with the right level of detail.


Kafka-specific experience:
Kafka Administrator profile:
• Install, set up and configure the Kafka cluster
• Administer Kafka and monitor the health of the Kafka infrastructure
• Ability to set up Kafka in a secure and resilient way (reliability, resiliency and Kafka security)
• Maintain and manage schema registry
• Implement broker and partition architecture
• Know-how of components like YARN and ZooKeeper
• Know-how of the Kafka streaming engineering profile is a plus
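To give candidates a concrete sense of the "reliability, resiliency and Kafka security" point, the sketch below collects the kind of topic and producer settings such a setup typically relies on. The configuration keys are real Kafka configuration names; the chosen values and the small validation helper are illustrative assumptions, not a prescription from this role:

```python
# Sketch: configuration values commonly used for a resilient Kafka setup.
# The keys are real Kafka configuration names; the values and the helper
# function below are illustrative assumptions, not a definitive recipe.

RESILIENT_TOPIC_CONFIG = {
    "replication.factor": 3,                    # survive the loss of up to two brokers
    "min.insync.replicas": 2,                   # require two acknowledged copies per write
    "unclean.leader.election.enable": "false",  # never elect an out-of-sync leader
}

RESILIENT_PRODUCER_CONFIG = {
    "acks": "all",                 # wait for all in-sync replicas before success
    "enable.idempotence": "true",  # avoid duplicate records on retry
}

def is_consistent(topic_config: dict) -> bool:
    """Check that min.insync.replicas leaves headroom below the replication
    factor, so a single broker failure does not block all writes."""
    return topic_config["min.insync.replicas"] < topic_config["replication.factor"]
```

With settings like these, a write succeeds only once a quorum of replicas has it, and one broker can be down for maintenance without stopping producers.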

Kafka Streaming engineering profile:
• Design and implement broker and partition architecture
• Develop topics and corresponding APIs/microservices to render data consumable
• Configure Producers and Consumers
• Mastery of all core components required in a Kafka ecosystem, such as ZooKeeper and YARN
• Integration with Flink and NiFi
• Stream processing knowledge (Kafka with Flink)
• Know-how of the Kafka administration profile is a plus
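The broker and partition design work listed above revolves around how records map to partitions. Kafka's default partitioner hashes the record key (with murmur2) modulo the partition count; the sketch below illustrates that key-to-partition mapping using a deterministic stdlib hash instead of murmur2, so it is an illustration of the idea rather than Kafka's exact algorithm:

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Illustrative key -> partition mapping. Kafka's default partitioner
    uses murmur2; MD5 is used here only as a deterministic stdlib hash."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    key_hash = int.from_bytes(digest[:4], "big")
    return key_hash % num_partitions

# All records with the same key land in the same partition, which is what
# preserves per-key ordering for downstream consumers.
assignments = {k: partition_for(k, 6) for k in ["cust-1", "cust-2", "cust-3"]}
```

This per-key stickiness is why partition count and key choice are design decisions: ordering is guaranteed only within a partition, and repartitioning a topic changes where existing keys land.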