INTERNSHIP DETAILS

Data Scientist (Internship)

Company: DayOne
Location: Singapore
Work Mode: On Site
Posted: January 23, 2026
Internship Information
Core Responsibilities
The intern will assist in designing and developing scalable ETL pipelines and collaborate with the team to integrate various data sources into Dataverse. They will also analyze and troubleshoot data pipeline performance to ensure reliability and scalability.
Internship Type
Full-time
Company Size
818
Visa Sponsorship
No
Language
English
Working Hours
40 hours

About The Company
DayOne is a data center pioneer that develops and operates next-gen digital infrastructure for industry leaders who demand reliable, cost-effective, and quickly scalable solutions. Our cutting-edge facilities empower hyperscalers and large enterprises to achieve rapid deployment and enhanced connectivity, driving transformative engagement and innovation as we shape the future of industries. DayOne's data centers are located across key markets, including Singapore, Johor (Malaysia), Batam (Indonesia), Greater Bangkok, Hong Kong SAR, Tokyo, and beyond.

Headquartered in Singapore, DayOne's leadership team draws on over two decades of industry experience and a track record of building Asia's largest data center business. With DayOne, it has established the SIJORI (Singapore, Johor, and Riau Islands) market as a global data center hub. As demand for strategically located and customized data centers rises, DayOne's entrepreneurial spirit, customer-first strategy, deep local partnerships, and agile executional capabilities uniquely position us to power the growth ambitions of leading hyperscalers and large enterprises around the world.
About the Role

Join DayOne – Shaping the Future of Data Infrastructure

DayOne is a global leader in the development and operation of high-performance data centers. As one of the fastest-growing companies in the industry, we’ve built a robust presence across Asia and Europe — and we’re just getting started.

As we expand into new international markets, we’re looking for talented, driven individuals to join us on this exciting journey. This is more than a job — it’s an opportunity to be a key contributor to our dynamic team and help shape the future of global data infrastructure.

If you're passionate about innovation, technology, and growth, we invite you to be part of DayOne’s next chapter.

Your responsibilities will include: 

  • Assisting in designing and developing scalable ETL pipelines to process large volumes of data efficiently, integrating data from systems like Autodesk Construction Cloud (ACC), SAP, and other software. 

  • Collaborating with the team to integrate various data sources into Dataverse, automating data flows for seamless analysis and reporting. 

  • Building and maintaining RESTful APIs to streamline data exchange between internal and external systems. 

  • Analyzing and troubleshooting data pipeline performance to ensure high reliability and scalability. 

  • Supporting data integration efforts to ensure the timely availability of clean, accurate, and consistent data across platforms. 

  • Participating in code reviews and ensuring the implementation of best practices in data engineering. 
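To give candidates a flavor of the ETL work described above, here is a minimal, illustrative Python sketch of an extract-transform-load flow. All data, field names, and functions are hypothetical placeholders; a real pipeline here would pull from source systems such as ACC or SAP through their APIs and load into Dataverse.

```python
# Hypothetical ETL sketch: extract raw rows, validate and normalize them,
# then upsert into an in-memory store standing in for a real destination.

def extract(raw_records):
    """Simulate pulling rows from a source system, dropping empty responses."""
    return [r for r in raw_records if r is not None]

def transform(records):
    """Normalize field types and skip rows that fail basic validation."""
    cleaned = []
    for r in records:
        if "id" not in r or r.get("value") is None:
            continue  # incomplete row: exclude from the load step
        cleaned.append({"id": r["id"], "value": float(r["value"])})
    return cleaned

def load(records, store):
    """Upsert cleaned rows into a destination keyed by record id."""
    for r in records:
        store[r["id"]] = r["value"]
    return store

# Example run with placeholder data.
raw = [{"id": 1, "value": "3.5"}, {"id": 2, "value": None}, None, {"id": 3, "value": "7"}]
store = load(transform(extract(raw)), {})
print(store)  # {1: 3.5, 3: 7.0}
```

The three-stage separation shown here is the design idea behind the role's pipeline work: each stage can be tested, monitored, and scaled independently, which is what makes troubleshooting pipeline performance tractable.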

Qualifications: 

  • Currently pursuing a degree in Computer Science, Data Engineering, Information Technology, or a related field. 

  • Proficiency in Python, with hands-on experience in building ETL pipelines. 

  • Familiarity with API design, development, and integration (RESTful APIs), especially for systems like Autodesk Construction Cloud (ACC) and SAP. 

  • Understanding of Dataverse as a data storage solution and best practices for handling large datasets. 

  • Strong problem-solving skills with a passion for building scalable data solutions. 

  • Ability to work collaboratively in a team environment and communicate technical concepts effectively. 

Bonus Points: 

  • Experience with cloud-based platforms like AWS, Azure, or Google Cloud. 

  • Knowledge of containerization technologies such as Docker. 

  • Exposure to data processing frameworks such as Apache Spark or Apache Kafka. 

  • Familiarity with CI/CD processes for data engineering tasks. 

DayOne is proud to be an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

If you're ready to grow with one of the fastest-moving companies in the data center industry, apply now and be part of our global journey.

Key Skills
Python, ETL Pipelines, API Design, Data Integration, Dataverse, Problem-Solving, Collaboration, Cloud Platforms, Docker, Apache Spark, Apache Kafka, CI/CD Processes
Categories
Technology, Data & Analytics, Engineering