INTERNSHIP DETAILS

AI/Data Engineer Intern

Company: DP Architects
Location: Singapore
Work Mode: On-site
Posted: April 16, 2026
Internship Information
Core Responsibilities
The intern will support data pipeline development, ingestion, and storage procedures to ensure data is ready for AI workflows. They will also be responsible for deploying digital solutions, ensuring system security, and maintaining application performance.
Internship Type: Full-time
Company Size: Not specified
Visa Sponsorship: No
Language: English
Working Hours: 40 hours

About the Role

COMPANY:

Spatial Intelligence For Design Pte Ltd

JOB SUMMARY:
We are looking for a hands-on AI/Data Engineering Intern to support our data pipeline and data preparation work. You will work closely with our AI Engineer to design data ingestion pipelines and storage procedures that keep our data clean, structured, and ready for use in AI workflows. You will also design algorithms for the AI workflows themselves.

JOB RESPONSIBILITIES:
Data pipeline development:

  • Analyse and provide potential solutions for eliminating uncertainty in identifying objects in construction data;

  • Integrate pipeline with existing applications;

Deployment, Security and Maintenance:

  • Deploy digital solutions to various environments, ensuring scalability, security, and reliability;

  • Implement security measures for applications, such as securing APIs, managing authentication and authorization mechanisms, and ensuring data integrity;

  • Monitor, maintain, and improve the performance of deployed applications, including bug-fixing;

Technical Collaboration:

  • When needed, work with stakeholders to understand requirements and technical specifications to deliver functional solutions;

  • Collaborate with designers, architects, and other team members to ensure solutions meet user needs and project goals;

Testing and Quality Assurance:

  • Use test harnesses to test hypotheses and continuously validate the efficacy of the data verification pipeline and downstream AI workflows;

  • Participate in code reviews and contribute to the establishment of best practices;

Documentation:

  • Prepare and maintain clear documentation for developed solutions, including APIs, workflows, and deployment guidelines;

Continuous Learning: 

  • Stay updated with the latest technologies, frameworks, and tools to contribute innovative ideas to the team;

JOB REQUIREMENTS:
  • Bachelor’s degree in Computer Science, Software Engineering, or a related field;

  • Technical Skills:

    • Experience in:

      • Data analysis technologies:

        • Python;

        • pandas;

        • matplotlib;

      • Database technologies: Postgres, MongoDB;

      • Deployment technologies: Docker;

      • Version control: Git;

  • Problem-Solving Skills: Strong analytical and problem-solving skills with attention to detail;

  • Team Collaboration: Ability to work effectively in a team environment, communicate ideas clearly, and contribute to project success;

  • Learning Attitude: Willingness to learn and adapt to new technologies and frameworks;

GOOD TO HAVE:

  • Familiarity with machine learning methodologies and technologies such as PyTorch or TensorFlow;

  • Familiarity with graphics processing algorithms;

  • Familiarity with software development for AEC (Architecture, Engineering, and Construction) tools like Revit, Rhino, or Sketchup;

  • Knowledge of API design and development;

  • Exposure to CI/CD pipelines for software deployment;

  • Understanding of Agile methodologies and project management tools (e.g., Jira, Trello, Kanban);

  • A portfolio or GitHub repository showcasing past projects or code samples.

Key Skills
Python, Pandas, Matplotlib, Postgres, MongoDB, Docker, Git, Data analysis, API development, Machine learning, PyTorch, TensorFlow, Software development, Data pipeline development, Quality assurance, Technical documentation
Categories
Technology, Data & Analytics, Software, Engineering, Construction