
Job Offer: Big Data Engineer - Jumia Food

Bachelor’s / Master’s / PhD degree in Computer Science or a related field and 3+ years of experience building data pipelines (or equivalent) required.
JUMIA Egypt | 21.11.2021


Job Description



Jumia is the leading e-commerce platform in Africa. It is built around a marketplace, Jumia Logistics, and JumiaPay. The marketplace helps millions of consumers and sellers to connect and transact. Jumia Logistics enables the delivery of millions of packages through our network of local partners.

JumiaPay facilitates the payments of online transactions for Jumia's ecosystem. With over 1 billion people and 500 million internet users in Africa, Jumia believes that e-commerce is making people's lives easier by helping them shop and pay for millions of products at the best prices wherever they live. E-commerce is also creating new opportunities for SMEs to grow, and job opportunities for a new generation to thrive.

With over 5,000 employees in more than 10 African countries, Jumia is led by talented leaders, a mix of local and international talent, and is backed by high-profile shareholders. Jumia is committed to creating a sustainable impact in Africa and offers unique opportunities in a vibrant, booming environment, creating new jobs, building new skills, and empowering a new generation.
 


Main Responsibilities:



  • Contribute to the design and construction of the company’s data lake

  • Collect, store, process, and support the analysis of huge sets of data, both structured and unstructured

  • Choose optimal solutions for big data use cases, then implement, maintain, monitor, and integrate them with the data and IT architecture used across the company

  • Champion big data technologies and educate the company about them, participating actively throughout the journey, from the discovery phase to the corporate data-centric transformation

  • Build solutions with security and privacy by design





Job Requirements



Requirements:



  • Knowledge of the Linux operating system (OS, networking, process level)

  • Understanding of big data technologies (Hadoop, HBase, Spark, Kafka, Flume, Hive, etc.)

  • Understanding of one or more object-oriented programming languages (Java, C++, C#, Python)

  • Fluent in at least one scripting language (Shell, Python, Ruby, etc.)

  • Experience with at least one Hadoop distribution (Cloudera, MapR, or preferably Hortonworks)

  • Experience building complex data processing pipelines using continuous integration tools

  • Experience with Cassandra, MongoDB or equivalent NoSQL databases

  • Experience developing in an Agile environment

  • Bachelor’s / Master’s / PhD degree in Computer Science or a related field

  • 3+ years of experience building data pipelines, or equivalent

  • Nice to have: Experience in designing big data/distributed systems

  • Nice to have: Experience creating and driving large-scale ETL pipelines


We offer:



  • A unique experience in an entrepreneurial, yet structured environment

  • The opportunity to become part of a highly professional and dynamic team working around the world

  • Unparalleled personal and professional growth, as our longer-term objective is to train the next generation of leaders for our future internet ventures


To apply, click here.