Job description
Role Summary/Objective:
As the Senior Data Engineer (Architecture Lead), you will take a leading role in architecting, designing, and overseeing the implementation of robust and scalable data engineering solutions for our custom-built Loyalty/Promotion system and its integration with the enterprise Customer Data Platform (CDP). You will provide technical leadership in data pipeline design and data modeling within engineering contexts, and ensure our data infrastructure effectively supports advanced analytics, AI/ML initiatives, and the critical data needs of our upcoming airline venture. This role combines deep hands-on engineering expertise with a strong architectural vision for data solutions.
Key Responsibilities:
- Lead the technical design and architecture of data pipelines, data transformation processes, and data storage solutions for the Loyalty platform and CDP integrations, ensuring alignment with overall enterprise architecture guidelines.
- Define and champion best practices for data engineering, including data quality frameworks, pipeline development standards, CI/CD for data processes, and performance optimization.
- Oversee and contribute to the development, testing, and deployment of complex data pipelines, ensuring they are efficient, reliable, and scalable to handle large data volumes (e.g., from ~40 million annual customer interactions).
- Collaborate with Data Scientists, Backend Engineers (including the Lead BE), and Product Owners to understand data requirements and translate them into effective data engineering solutions.
- Provide technical leadership and mentorship to other Data Engineers (DE #1, DE #2) within the team.
- Drive the technical strategy for data ingestion, processing, and integration related to the CDP and Loyalty systems, in collaboration with the Head/PO and Enterprise Architects.
- Ensure data engineering solutions adhere to data governance, security, and compliance requirements.
- Evaluate and recommend new data engineering technologies, tools, and methodologies to enhance our capabilities.
Job requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 7+ years of hands-on experience in data engineering, with at least 2-3 years in a technical leadership or architecture-focused data engineering role.
- Proven expertise in designing and building large-scale batch and real-time data pipelines using technologies such as Spark, Kafka, and Airflow.
- Strong proficiency in SQL and programming languages such as Python or Scala.
- Deep experience with Big Data technologies, cloud data platforms (AWS, Azure, GCP), and various database systems (SQL, NoSQL).
- Solid understanding of data modeling principles, data warehousing, data lakes, and data integration patterns.
- Experience with CI/CD practices for data pipelines and infrastructure-as-code is highly desirable.
- Excellent problem-solving, analytical, and technical leadership skills.
- Strong communication skills, with the ability to articulate complex technical designs and decisions.