Data Engineer Intern

About the Team:
The Product & Technology department is composed of Product Managers, Engineers and Designers. Ownership, meritocracy and collaboration are at our core. We are not afraid to think differently, embrace new ideas and dream big. We empower ownership and share responsibility. We support each other to achieve and grow. Our goal is simple: to create products that delight our customers and readers.
Purpose of the position:
We are seeking a Data Engineer Intern to contribute to our business strategy projects. In this role, you will report to the lead engineer and collaborate closely with our engineering and product squad within the organisation. This role supports pipeline development on an agile team, delivering key datasets that empower the product and go-to-market teams to grow our traffic and our subscriber base.
In this role, you will:
- Assist in the full data pipeline development lifecycle, including helping to migrate datasets across clouds, verify data integrity, and deploy and schedule pipelines in UAT and production.
- Support the building and deployment of ETL pipelines in Airflow and data models in dbt.
- Participate in code reviews to learn best practices, ensure code quality, and engage in knowledge sharing among team members.
- Stay curious and up-to-date with technologies, trends, and best practices in applying AI to boost data engineering productivity and efficiency.
- Assist across the applications lifecycle, gaining exposure to development, testing, documentation, and ongoing support.
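To give a concrete sense of the ETL work described above, here is a minimal extract-transform-load sketch in plain Python (the dataset, table name, and aggregation are invented purely for illustration; they do not describe SCMP's actual pipelines):

```python
import csv
import io
import sqlite3

# Hypothetical raw export: article page-view events, one row per day.
RAW_CSV = """article_id,views,date
a1,120,2024-01-01
a2,45,2024-01-01
a1,80,2024-01-02
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV export into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and aggregate total views per article."""
    totals: dict[str, int] = {}
    for row in rows:
        totals[row["article_id"]] = totals.get(row["article_id"], 0) + int(row["views"])
    return sorted(totals.items())

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: upsert the aggregated dataset into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS article_views (article_id TEXT PRIMARY KEY, views INTEGER)"
    )
    conn.executemany("INSERT OR REPLACE INTO article_views VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT * FROM article_views ORDER BY article_id").fetchall())
# → [('a1', 200), ('a2', 45)]
```

In a production setting, each of these steps would typically run as a scheduled Airflow task, with the transformation layer expressed as dbt models, as the responsibilities above describe.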
Skills and Experience that will lead to success:
- Currently pursuing or recently graduated with a Degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence or a related discipline.
- Academic projects, coursework, or previous internship experience related to software development or data engineering.
- Good command of Python and SQL to create scripts, debug data issues, and run automations.
- Familiarity with or keen interest in learning data modeling and pipeline design (ETL, ELT, data warehousing).
- Experience with Airflow and dbt is a plus.
- Knowledge of Kubernetes, Docker, CI/CD pipelines, microservice architecture, and Cloud concepts is an advantage.
- Ability to problem solve through collaboration and a willingness to ask the right questions.
- Good team player with a strong proactive willingness to learn.
The work location will be our Causeway Bay office.
Our Privacy Notice aims to comply with all relevant data privacy and protection laws. You should read the Privacy Notice in full at corp.scmp.com/privacy-policy.