Data Engineer, Zalopay (Data Platform)

Official | Data | Data Engineering | 25-PDP-2452
Location: Ho Chi Minh City

Job Description

The Big Data Platform team at Zalopay is building a high-quality data warehouse that addresses business problems, delivers data insights for every area of the business, and moves the product toward data-driven decision making.

We are looking for a motivated Data Engineer to join our Data Platform team. In this role, you will be responsible for designing and building ETL pipelines that power user segmentation for personalization, targeting, and analytics use cases. You will also develop and maintain APIs that expose these segments to internal systems in a reliable, secure, and scalable way.
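
To give a rough sense of the kind of pipeline described above, here is a minimal sketch of a daily segmentation ETL orchestrated with Airflow (which is listed in the requirements). It assumes a recent Airflow 2.x install; the DAG name, schedule, and segmentation step are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of a daily user-segmentation ETL, assuming Airflow 2.x.
# DAG name, schedule, and the transform logic are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_user_segments(ds, **_):
    # Placeholder transform: a real pipeline would typically run a Spark
    # or SQL job against the warehouse instead of in-process Python.
    print(f"Building user segments for partition {ds}")


with DAG(
    dag_id="user_segmentation_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
):
    PythonOperator(
        task_id="build_user_segments",
        python_callable=build_user_segments,
    )
```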

Responsibilities:
  • Build, monitor, and maintain robust ETL/ELT pipelines for processing and transforming data.
  • Implement a Customer Data Platform (CDP) to centralize and unify customer data across multiple touchpoints. Enable effective user segmentation for marketing, personalization, and analytics use cases.
  • Build and maintain a robust data access layer that allows consumers to retrieve segment data securely and efficiently (a minimal API sketch follows this list).
  • Ensure API responses are fast, accurate, and scalable, even with large user datasets.
  • Implement access controls, rate limiting, and authentication mechanisms to protect sensitive user data.
  • Manage and maintain the data infrastructure, including the tooling used by the ETL pipelines and the warehouse storage systems.
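
As a hedged illustration of the segment-serving API mentioned above, the sketch below exposes a user's segments behind a simple API-key check. FastAPI and the in-memory store are illustrative stand-ins only; the posting does not specify a framework, and a production service would back this with a warehouse or NoSQL store and add rate limiting and finer-grained access control.

```python
# Minimal sketch of a segment-retrieval API; FastAPI and the in-memory
# store are assumptions, not choices stated in the posting.
from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

# Stand-in for the real backing store (e.g., a warehouse or NoSQL database).
SEGMENTS_BY_USER = {
    "user-123": ["high_value", "frequent_topup"],
}

# Placeholder credentials; a real service would use a proper secret store.
VALID_API_KEYS = {"internal-service-key"}


@app.get("/v1/users/{user_id}/segments")
def get_user_segments(user_id: str, x_api_key: str = Header(...)):
    # FastAPI maps the x_api_key parameter to the X-Api-Key request header.
    if x_api_key not in VALID_API_KEYS:
        raise HTTPException(status_code=401, detail="invalid API key")
    segments = SEGMENTS_BY_USER.get(user_id)
    if segments is None:
        raise HTTPException(status_code=404, detail="user not found")
    return {"user_id": user_id, "segments": segments}
```

Such a service would typically be run behind an ASGI server (for example `uvicorn module:app`), with caching in front of the store to keep responses fast at large user volumes.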

Requirements

  • Bachelor’s degree in computer science, engineering, mathematics, or a related technical discipline.
  • 2+ years of experience in a Data Engineer role, including ETL over large volumes of data.
  • Proficiency in SQL and experience with database management systems.
  • Strong programming skills in Python, Scala or Java.
  • Working knowledge of the following big data software and tools:
      • Big data tools: Hadoop, Apache Spark, Presto, Kafka, etc.
      • NoSQL and OLAP databases: MongoDB, ClickHouse, Elasticsearch, etc.
      • Data pipeline and workflow management tools: Luigi, Airflow, etc.
  • Experience with stream processing systems (e.g., Kafka, Flink) for handling real-time data pipelines.
  • Experience building APIs that interact with data warehouses, NoSQL stores, or object storage to serve user-level data.
Soft Skills:
  • Ability to work independently and as part of a team.
  • Strong organizational skills and the ability to manage multiple projects simultaneously.
  • Proactive approach to identifying and solving problems.
  • High level of accuracy and attention to detail.